Dec 05 20:11:40 crc systemd[1]: Starting Kubernetes Kubelet... Dec 05 20:11:40 crc restorecon[4695]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Dec 05 20:11:40 
crc restorecon[4695]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c574,c582 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 05 
20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c440,c975 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c22 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:11:40 crc 
restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c968,c969 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 
05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 20:11:40 
crc restorecon[4695]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:40 crc restorecon[4695]:
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 
20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:11:40 crc 
restorecon[4695]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 
20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:40 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:11:41 crc restorecon[4695]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 20:11:41 crc restorecon[4695]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 20:11:41 crc restorecon[4695]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 05 20:11:41 crc kubenswrapper[4904]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 05 20:11:41 crc kubenswrapper[4904]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 05 20:11:41 crc kubenswrapper[4904]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 05 20:11:41 crc kubenswrapper[4904]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 05 20:11:41 crc kubenswrapper[4904]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 05 20:11:41 crc kubenswrapper[4904]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.402019 4904 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.406943 4904 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.406971 4904 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.406980 4904 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.406989 4904 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407009 4904 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407017 4904 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407025 4904 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407032 4904 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407043 4904 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407053 4904 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407063 4904 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407071 4904 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407079 4904 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407087 4904 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407094 4904 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407103 4904 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407110 4904 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407119 4904 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407127 4904 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407134 4904 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407142 4904 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407149 4904 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407157 4904 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407164 4904 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407172 4904 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407180 4904 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407187 4904 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407195 4904 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407202 4904 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407210 4904 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407218 4904 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407225 4904 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407233 4904 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407267 4904 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 
20:11:41.407274 4904 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407282 4904 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407290 4904 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407298 4904 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407305 4904 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407313 4904 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407322 4904 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407329 4904 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407337 4904 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407345 4904 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407353 4904 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407360 4904 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407372 4904 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407382 4904 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407390 4904 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407401 4904 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407410 4904 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407419 4904 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407427 4904 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407436 4904 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407444 4904 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407452 4904 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407462 4904 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407470 4904 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407478 4904 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407485 4904 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407492 4904 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407500 4904 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407512 4904 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
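Note: the feature_gate.go warnings come in three classes, distinguishable by source line: :330 is a gate the kubelet's registry simply does not know (the names above, such as GatewayAPI or NetworkSegmentation, appear to be OpenShift-level gates handed to every component and ignored where unrecognized), :351 is a deprecated gate still being set (KMSv1=true), and :353 is a GA gate that no longer needs an explicit override (ValidatingAdmissionPolicy, CloudDualStackNodeIPs, DisableKubeletCloudCredentialProviders). A small sketch that deduplicates and counts the three classes from a saved journal excerpt; the regexes assume the exact message formats shown in this log:

    import re
    from collections import Counter

    # Records copied from this log; in practice feed in `journalctl -u kubelet`.
    LOG = """\
    W1205 20:11:41.407053 4904 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
    W1205 20:11:41.407043 4904 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
    W1205 20:11:41.407401 4904 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
    """

    PATTERNS = {
        "unrecognized": re.compile(r"feature_gate\.go:330\] unrecognized feature gate: (\w+)"),
        "deprecated": re.compile(r"feature_gate\.go:351\] Setting deprecated feature gate (\w+)="),
        "ga": re.compile(r"feature_gate\.go:353\] Setting GA feature gate (\w+)="),
    }

    counts = {kind: Counter() for kind in PATTERNS}
    for line in LOG.splitlines():
        for kind, pattern in PATTERNS.items():
            m = pattern.search(line)
            if m:
                counts[kind][m.group(1)] += 1

    for kind, gates in counts.items():
        print(kind, dict(gates))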
Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407521 4904 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407530 4904 feature_gate.go:330] unrecognized feature gate: Example Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407539 4904 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407547 4904 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407556 4904 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407564 4904 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407571 4904 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.407579 4904 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408026 4904 flags.go:64] FLAG: --address="0.0.0.0" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408051 4904 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408418 4904 flags.go:64] FLAG: --anonymous-auth="true" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408432 4904 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408444 4904 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408456 4904 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408467 4904 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408478 4904 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408488 4904 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408497 4904 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408506 4904 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408516 4904 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408525 4904 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408534 4904 flags.go:64] FLAG: --cgroup-root="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408543 4904 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408551 4904 flags.go:64] FLAG: --client-ca-file="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408560 4904 flags.go:64] FLAG: --cloud-config="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408569 4904 flags.go:64] FLAG: --cloud-provider="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408578 4904 flags.go:64] FLAG: --cluster-dns="[]" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408588 4904 flags.go:64] FLAG: --cluster-domain="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408597 4904 flags.go:64] FLAG: 
--config="/etc/kubernetes/kubelet.conf" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408606 4904 flags.go:64] FLAG: --config-dir="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408614 4904 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408624 4904 flags.go:64] FLAG: --container-log-max-files="5" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408635 4904 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408643 4904 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408652 4904 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408662 4904 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408672 4904 flags.go:64] FLAG: --contention-profiling="false" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408682 4904 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408691 4904 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408700 4904 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408709 4904 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408720 4904 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408729 4904 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408738 4904 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408746 4904 flags.go:64] FLAG: --enable-load-reader="false" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408755 4904 flags.go:64] FLAG: --enable-server="true" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408764 4904 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408776 4904 flags.go:64] FLAG: --event-burst="100" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408785 4904 flags.go:64] FLAG: --event-qps="50" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408795 4904 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408804 4904 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408813 4904 flags.go:64] FLAG: --eviction-hard="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408829 4904 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408838 4904 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408847 4904 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408856 4904 flags.go:64] FLAG: --eviction-soft="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408865 4904 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408874 4904 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 
20:11:41.408883 4904 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408891 4904 flags.go:64] FLAG: --experimental-mounter-path="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408900 4904 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408909 4904 flags.go:64] FLAG: --fail-swap-on="true" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408918 4904 flags.go:64] FLAG: --feature-gates="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408929 4904 flags.go:64] FLAG: --file-check-frequency="20s" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408938 4904 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408947 4904 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408956 4904 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408966 4904 flags.go:64] FLAG: --healthz-port="10248" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408976 4904 flags.go:64] FLAG: --help="false" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408984 4904 flags.go:64] FLAG: --hostname-override="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.408993 4904 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409002 4904 flags.go:64] FLAG: --http-check-frequency="20s" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409011 4904 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409020 4904 flags.go:64] FLAG: --image-credential-provider-config="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409028 4904 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409037 4904 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409046 4904 flags.go:64] FLAG: --image-service-endpoint="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409054 4904 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409100 4904 flags.go:64] FLAG: --kube-api-burst="100" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409111 4904 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409121 4904 flags.go:64] FLAG: --kube-api-qps="50" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409129 4904 flags.go:64] FLAG: --kube-reserved="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409139 4904 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409148 4904 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409157 4904 flags.go:64] FLAG: --kubelet-cgroups="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409167 4904 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409176 4904 flags.go:64] FLAG: --lock-file="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409184 4904 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409194 4904 flags.go:64] 
FLAG: --log-flush-frequency="5s" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409202 4904 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409216 4904 flags.go:64] FLAG: --log-json-split-stream="false" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409225 4904 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409234 4904 flags.go:64] FLAG: --log-text-split-stream="false" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409243 4904 flags.go:64] FLAG: --logging-format="text" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409251 4904 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409261 4904 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409269 4904 flags.go:64] FLAG: --manifest-url="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409278 4904 flags.go:64] FLAG: --manifest-url-header="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409289 4904 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409298 4904 flags.go:64] FLAG: --max-open-files="1000000" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409310 4904 flags.go:64] FLAG: --max-pods="110" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409319 4904 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409328 4904 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409337 4904 flags.go:64] FLAG: --memory-manager-policy="None" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409346 4904 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409355 4904 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409364 4904 flags.go:64] FLAG: --node-ip="192.168.126.11" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409373 4904 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409392 4904 flags.go:64] FLAG: --node-status-max-images="50" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409405 4904 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409414 4904 flags.go:64] FLAG: --oom-score-adj="-999" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409424 4904 flags.go:64] FLAG: --pod-cidr="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409433 4904 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409445 4904 flags.go:64] FLAG: --pod-manifest-path="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409454 4904 flags.go:64] FLAG: --pod-max-pids="-1" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409463 4904 flags.go:64] FLAG: --pods-per-core="0" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409472 4904 flags.go:64] FLAG: --port="10250" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 
20:11:41.409481 4904 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409490 4904 flags.go:64] FLAG: --provider-id="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409500 4904 flags.go:64] FLAG: --qos-reserved="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409509 4904 flags.go:64] FLAG: --read-only-port="10255" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409519 4904 flags.go:64] FLAG: --register-node="true" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409528 4904 flags.go:64] FLAG: --register-schedulable="true" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409536 4904 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409551 4904 flags.go:64] FLAG: --registry-burst="10" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409560 4904 flags.go:64] FLAG: --registry-qps="5" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409568 4904 flags.go:64] FLAG: --reserved-cpus="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409590 4904 flags.go:64] FLAG: --reserved-memory="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409621 4904 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409631 4904 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409640 4904 flags.go:64] FLAG: --rotate-certificates="false" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409649 4904 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409658 4904 flags.go:64] FLAG: --runonce="false" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409668 4904 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409677 4904 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409686 4904 flags.go:64] FLAG: --seccomp-default="false" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409728 4904 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409737 4904 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409746 4904 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409755 4904 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409765 4904 flags.go:64] FLAG: --storage-driver-password="root" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409773 4904 flags.go:64] FLAG: --storage-driver-secure="false" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409782 4904 flags.go:64] FLAG: --storage-driver-table="stats" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409791 4904 flags.go:64] FLAG: --storage-driver-user="root" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409800 4904 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409809 4904 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409818 4904 flags.go:64] FLAG: --system-cgroups="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409827 4904 flags.go:64] FLAG: 
--system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409841 4904 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409850 4904 flags.go:64] FLAG: --tls-cert-file="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409859 4904 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409869 4904 flags.go:64] FLAG: --tls-min-version="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409878 4904 flags.go:64] FLAG: --tls-private-key-file="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409887 4904 flags.go:64] FLAG: --topology-manager-policy="none" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409901 4904 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409910 4904 flags.go:64] FLAG: --topology-manager-scope="container" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409919 4904 flags.go:64] FLAG: --v="2" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409937 4904 flags.go:64] FLAG: --version="false" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409949 4904 flags.go:64] FLAG: --vmodule="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409959 4904 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.409968 4904 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410192 4904 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410203 4904 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410212 4904 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410220 4904 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410228 4904 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410237 4904 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410244 4904 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410252 4904 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410260 4904 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410268 4904 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410276 4904 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410283 4904 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410291 4904 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410298 4904 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 
20:11:41.410306 4904 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410317 4904 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410328 4904 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410336 4904 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410344 4904 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410352 4904 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410360 4904 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410368 4904 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410376 4904 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410384 4904 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410391 4904 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410402 4904 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410412 4904 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
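Note: the long flags.go:64 run that ends just above the second feature-gate pass is the kubelet echoing the final value of every command-line flag, one record per flag (logged because this node runs at verbosity --v="2"). Noteworthy values include --config="/etc/kubernetes/kubelet.conf", --kubeconfig="/var/lib/kubelet/kubeconfig", --node-ip="192.168.126.11", and --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi". Because the format is uniform, the dump folds easily into a dict for diffing across restarts; a minimal sketch, assuming the FLAG: --name="value" shape shown above:

    import re

    FLAG_RE = re.compile(r'FLAG: (--[\w.-]+)="(.*)"')

    def parse_flags(journal_text):
        """Collect `flags.go:64] FLAG: --x="y"` records into a dict."""
        flags = {}
        for line in journal_text.splitlines():
            m = FLAG_RE.search(line)
            if m:
                flags[m.group(1)] = m.group(2)
        return flags

    sample = 'I1205 20:11:41.409373 4904 flags.go:64] FLAG: --node-ip="192.168.126.11"'
    print(parse_flags(sample))  # {'--node-ip': '192.168.126.11'}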
Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410421 4904 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410429 4904 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410437 4904 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410444 4904 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410453 4904 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410461 4904 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410469 4904 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410477 4904 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410484 4904 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410493 4904 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410500 4904 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410508 4904 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410516 4904 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410523 4904 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410531 4904 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410539 4904 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410546 4904 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410560 4904 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410568 4904 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410575 4904 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410584 4904 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410594 4904 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410603 4904 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410612 4904 feature_gate.go:330] unrecognized feature gate: Example Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410621 4904 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410632 4904 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410640 4904 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410649 4904 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410657 4904 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410665 4904 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410676 4904 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410685 4904 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410692 4904 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410700 4904 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410711 4904 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
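Note: every kubenswrapper record here carries a klog header of the form Lmmdd hh:mm:ss.uuuuuu PID file:line] msg, where the leading letter is the severity (I=info, W=warning, E=error, F=fatal); that is why each entry shows the time twice, once from journald and once from klog, and why a source location such as feature_gate.go:330 is available for filtering. A small parser for that header, assuming klog's standard text format:

    import re

    # klog text header: Lmmdd hh:mm:ss.uuuuuu PID file:line] message
    KLOG_RE = re.compile(
        r"(?P<sev>[IWEF])(?P<mmdd>\d{4}) (?P<time>\d{2}:\d{2}:\d{2}\.\d+)\s+"
        r"(?P<pid>\d+) (?P<src>[\w./-]+:\d+)\] (?P<msg>.*)")

    line = ("W1205 20:11:41.410412 4904 feature_gate.go:351] Setting deprecated "
            "feature gate KMSv1=true. It will be removed in a future release.")
    m = KLOG_RE.match(line)
    print(m.group("sev"), m.group("src"), "->", m.group("msg"))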
Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410720 4904 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410729 4904 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410737 4904 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410745 4904 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410755 4904 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410763 4904 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410773 4904 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410781 4904 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.410789 4904 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.410811 4904 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.419910 4904 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.419940 4904 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.420104 4904 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.420120 4904 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.420130 4904 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.420139 4904 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.420148 4904 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.420157 4904 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.420167 4904 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.420175 4904 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.420184 4904 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.420192 4904 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.420200 4904 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 20:11:41 crc 
kubenswrapper[4904]: W1205 20:11:41.420208 4904 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.420217 4904 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.420226 4904 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.420236 4904 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.420244 4904 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.420253 4904 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.420261 4904 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.420269 4904 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.420277 4904 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.420285 4904 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.420296 4904 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.434595 4904 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.434614 4904 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.434624 4904 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.434634 4904 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.434644 4904 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.434652 4904 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.434662 4904 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.434677 4904 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.434687 4904 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.434695 4904 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.434703 4904 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.434711 4904 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.434718 4904 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.434726 4904 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 
20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.434734 4904 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.434742 4904 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.434750 4904 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.434757 4904 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.434766 4904 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.434774 4904 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.434782 4904 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.434790 4904 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.434797 4904 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.434805 4904 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.434813 4904 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.434824 4904 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.434833 4904 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.434843 4904 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.434852 4904 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.434860 4904 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.434869 4904 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.434877 4904 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.434885 4904 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.434894 4904 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.434901 4904 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.434911 4904 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.434920 4904 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.434928 4904 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.434935 4904 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.434942 4904 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.434951 4904 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.434959 4904 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.434967 4904 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.434975 4904 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.434983 4904 feature_gate.go:330] unrecognized feature gate: Example Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.434990 4904 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.434999 4904 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435007 4904 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435014 4904 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.435029 4904 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435293 4904 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435308 4904 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435316 4904 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435324 4904 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435334 4904 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
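Note: each pass over the gate list finishes with a feature_gate.go:386 record stating the effective set, and the map is identical on every pass ({map[CloudDualStackNodeIPs:true ... ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}), which confirms the unrecognized gates were dropped rather than applied. A short parser for that record, assuming Go's map[key:value ...] rendering as printed above:

    import re

    RECORD = ("feature gates: {map[CloudDualStackNodeIPs:true "
              "DisableKubeletCloudCredentialProviders:true KMSv1:true "
              "ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}")

    def parse_gate_map(record):
        """Turn Go's `map[Gate:bool ...]` rendering into a Python dict."""
        body = re.search(r"\{map\[(.*)\]\}", record).group(1)
        return {name: value == "true"
                for name, value in (pair.split(":") for pair in body.split())}

    print(parse_gate_map(RECORD))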
Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435345 4904 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435354 4904 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435362 4904 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435369 4904 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435377 4904 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435385 4904 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435393 4904 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435400 4904 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435408 4904 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435415 4904 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435423 4904 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435431 4904 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435439 4904 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435446 4904 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435454 4904 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435461 4904 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435469 4904 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435477 4904 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435488 4904 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435498 4904 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435507 4904 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435515 4904 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435524 4904 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435531 4904 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435541 4904 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435548 4904 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435556 4904 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435564 4904 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435572 4904 feature_gate.go:330] unrecognized feature gate: Example Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435579 4904 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435587 4904 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435595 4904 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435605 4904 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435615 4904 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435625 4904 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435633 4904 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435641 4904 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435649 4904 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435657 4904 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435665 4904 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435672 4904 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435680 4904 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435687 4904 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435695 4904 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435702 4904 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435710 4904 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435717 4904 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435725 4904 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435735 4904 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435744 4904 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435753 4904 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435762 4904 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435769 4904 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435777 4904 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435785 4904 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435792 4904 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435800 4904 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435808 4904 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435815 4904 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435823 4904 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435832 4904 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435841 4904 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435849 4904 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435857 4904 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435864 4904 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.435872 4904 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.435884 4904 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.436572 4904 server.go:940] "Client rotation is on, will bootstrap in background" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.443135 4904 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.443234 4904 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
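Note: bootstrap.go:85 confirms the existing kubeconfig is still valid, certificate_store.go:130 loads the client pair from /var/lib/kubelet/pki/kubelet-client-current.pem, and in the records that follow certificate_manager.go schedules rotation: expiry 2026-02-24 05:52:08 UTC, rotation deadline 2025-12-11 21:26:03 UTC, then a 145h14m wait. client-go's certificate manager describes picking that deadline with jitter at roughly 70-90% of the certificate's lifetime; a sketch of that heuristic, where the jitter band comes from the upstream comment and the issue time is assumed, since it is not logged here:

    import random
    from datetime import datetime, timedelta

    def rotation_deadline(not_before, not_after, rng=random):
        """Pick a rotation time at ~70-90% of the certificate lifetime."""
        lifetime = not_after - not_before
        return not_before + lifetime * (0.7 + 0.3 * rng.random())

    not_after = datetime(2026, 2, 24, 5, 52, 8)          # expiry from this log
    not_before = not_after - timedelta(days=365)         # assumed issue time
    deadline = rotation_deadline(not_before, not_after, random.Random(0))
    print(deadline, "wait:", deadline - datetime(2025, 12, 5, 20, 11, 41))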
Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.443792 4904 server.go:997] "Starting client certificate rotation" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.443815 4904 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.444154 4904 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-11 21:26:03.417876359 +0000 UTC Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.444247 4904 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 145h14m21.973632635s for next certificate rotation Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.466278 4904 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.467883 4904 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.487037 4904 log.go:25] "Validated CRI v1 runtime API" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.508321 4904 log.go:25] "Validated CRI v1 image API" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.510011 4904 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.512683 4904 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-05-20-06-39-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.512712 4904 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.527343 4904 manager.go:217] Machine: {Timestamp:2025-12-05 20:11:41.525769487 +0000 UTC m=+0.336985626 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:cfb4a8d1-bf0d-4f7a-95e4-63a42b6fb559 BootID:54b04eb7-ddef-4ce3-9daa-6d051611390c Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 
Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:17:42:da Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:17:42:da Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:e0:9d:45 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:54:4d:58 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:1c:e3:1b Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:ac:35:3e Speed:-1 Mtu:1496} {Name:eth10 MacAddress:1e:08:b1:2d:46:47 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:b6:cc:94:cd:c8:9a Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: 
DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.527635 4904 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.527846 4904 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.528487 4904 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.528676 4904 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.528718 4904 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.529221 4904 topology_manager.go:138] "Creating topology manager with none policy" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.529242 4904 container_manager_linux.go:303] "Creating device plugin manager" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.529357 4904 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.529383 4904 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 05 
20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.529569 4904 state_mem.go:36] "Initialized new in-memory state store" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.529652 4904 server.go:1245] "Using root directory" path="/var/lib/kubelet" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.530619 4904 kubelet.go:418] "Attempting to sync node with API server" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.530645 4904 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.530670 4904 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.530684 4904 kubelet.go:324] "Adding apiserver pod source" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.530696 4904 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.552257 4904 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.166:6443: connect: connection refused Dec 05 20:11:41 crc kubenswrapper[4904]: E1205 20:11:41.552401 4904 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.166:6443: connect: connection refused" logger="UnhandledError" Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.552428 4904 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.166:6443: connect: connection refused Dec 05 20:11:41 crc kubenswrapper[4904]: E1205 20:11:41.552583 4904 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.166:6443: connect: connection refused" logger="UnhandledError" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.567394 4904 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.567819 4904 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.568605 4904 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.569256 4904 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.569286 4904 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.569297 4904 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.569307 4904 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.569321 4904 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.569330 4904 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.569339 4904 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.569354 4904 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.569363 4904 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.569373 4904 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.569386 4904 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.569397 4904 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.569840 4904 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.570407 4904 server.go:1280] "Started kubelet" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.570922 4904 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.570980 4904 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.166:6443: connect: connection refused Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.570929 4904 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 05 20:11:41 crc systemd[1]: Started Kubernetes Kubelet. 
Dec 05 20:11:41 crc kubenswrapper[4904]: E1205 20:11:41.573144 4904 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.166:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187e6ac5c1ebb9b1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-05 20:11:41.570365873 +0000 UTC m=+0.381582002,LastTimestamp:2025-12-05 20:11:41.570365873 +0000 UTC m=+0.381582002,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.573796 4904 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.574256 4904 server.go:460] "Adding debug handlers to kubelet server" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.574452 4904 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.574565 4904 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.574700 4904 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.574723 4904 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.574886 4904 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.575400 4904 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 13:19:51.945035154 +0000 UTC Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.575462 4904 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 905h8m10.369576526s for next certificate rotation Dec 05 20:11:41 crc kubenswrapper[4904]: E1205 20:11:41.575446 4904 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 05 20:11:41 crc kubenswrapper[4904]: E1205 20:11:41.576709 4904 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.166:6443: connect: connection refused" interval="200ms" Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.576708 4904 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.166:6443: connect: connection refused Dec 05 20:11:41 crc kubenswrapper[4904]: E1205 20:11:41.576821 4904 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.166:6443: connect: connection refused" logger="UnhandledError" Dec 05 20:11:41 crc 
kubenswrapper[4904]: I1205 20:11:41.577251 4904 factory.go:55] Registering systemd factory Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.577325 4904 factory.go:221] Registration of the systemd container factory successfully Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.578355 4904 factory.go:153] Registering CRI-O factory Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.578376 4904 factory.go:221] Registration of the crio container factory successfully Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.578438 4904 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.578456 4904 factory.go:103] Registering Raw factory Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.578472 4904 manager.go:1196] Started watching for new ooms in manager Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.579024 4904 manager.go:319] Starting recovery of all containers Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.584719 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.584775 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.584792 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.584808 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.584819 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.584829 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.584842 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.584853 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.584870 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.584880 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.584894 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.584905 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.584914 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.584930 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.584948 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.585521 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.585657 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.585797 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586366 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586398 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586409 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586420 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586431 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586442 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586452 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586462 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586475 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586487 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586498 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586509 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586519 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586529 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586537 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586546 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586556 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586566 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586577 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586586 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586597 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586609 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586619 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" 
volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586630 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586641 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586651 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586660 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586671 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586682 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586692 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586730 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586740 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586751 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586761 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586775 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586786 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586797 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586807 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586816 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586826 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586836 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586847 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586858 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586867 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586876 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" 
volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586884 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586893 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586902 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586912 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586921 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586930 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586941 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586950 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586959 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586968 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586978 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586988 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.586999 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587011 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587020 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587030 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587038 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587051 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587076 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587085 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587094 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587104 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587113 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587122 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587132 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587142 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587151 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587160 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587170 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587180 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587190 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587199 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587208 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587219 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587229 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587238 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587247 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587257 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587266 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587274 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587284 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587298 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587308 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587317 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587327 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587337 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587346 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587356 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587365 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587376 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587385 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587394 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587403 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587414 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587426 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587439 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587451 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587462 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587473 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587481 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587491 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587500 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587509 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587518 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587528 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587536 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587545 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587554 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587563 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587573 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587582 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587590 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587600 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587609 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587618 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587627 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587636 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587645 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587655 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587665 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.587674 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588230 4904 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588252 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588262 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588272 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588282 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588292 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588301 4904 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588310 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588320 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588329 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588337 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588345 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588354 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588364 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588373 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588382 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588391 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588401 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588409 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588418 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588427 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588435 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588445 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588453 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588462 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588470 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588479 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588488 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588497 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588506 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588515 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588526 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588537 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588547 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588557 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588566 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588575 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588584 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588593 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588602 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588611 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588621 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588631 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588640 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588648 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588657 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588667 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588677 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588687 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588696 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588706 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588715 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588725 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588734 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588743 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588752 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588762 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588772 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588781 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588789 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588800 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588809 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588818 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588826 4904 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588836 4904 reconstruct.go:97] "Volume reconstruction finished" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.588842 4904 reconciler.go:26] "Reconciler: start to sync state" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.615658 4904 manager.go:324] Recovery completed Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.625946 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.627848 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.627887 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.627898 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.628703 4904 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.628719 4904 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.628746 4904 state_mem.go:36] "Initialized new in-memory state store" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.639660 4904 policy_none.go:49] "None policy: Start" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.649563 4904 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.649611 4904 state_mem.go:35] "Initializing new in-memory state store" Dec 05 20:11:41 crc kubenswrapper[4904]: E1205 20:11:41.676424 4904 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.678142 4904 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.679960 4904 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.680001 4904 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.680029 4904 kubelet.go:2335] "Starting kubelet main sync loop" Dec 05 20:11:41 crc kubenswrapper[4904]: E1205 20:11:41.680104 4904 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 05 20:11:41 crc kubenswrapper[4904]: W1205 20:11:41.697759 4904 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.166:6443: connect: connection refused Dec 05 20:11:41 crc kubenswrapper[4904]: E1205 20:11:41.697862 4904 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.166:6443: connect: connection refused" logger="UnhandledError" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.706492 4904 manager.go:334] "Starting Device Plugin manager" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.706676 4904 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.706700 4904 server.go:79] "Starting device plugin registration server" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.707184 4904 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.707209 4904 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.707379 4904 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.708361 4904 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.708386 4904 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 05 20:11:41 crc kubenswrapper[4904]: E1205 20:11:41.713305 4904 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 05 20:11:41 crc kubenswrapper[4904]: E1205 20:11:41.778032 4904 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.166:6443: connect: connection refused" interval="400ms" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.780207 4904 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.780297 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.781388 4904 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.781439 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.781456 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.781943 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.782704 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.782783 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.785471 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.785531 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.785550 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.785701 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.785484 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.785888 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.785907 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.786092 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.786217 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.786911 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.786940 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.786951 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.787117 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.787276 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.787350 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.787753 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.787805 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.787811 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.787827 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.787839 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.787931 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.787951 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.788023 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.788051 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.788606 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.788666 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.788689 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.788857 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.788907 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.788928 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.789772 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.789837 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.789847 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.790025 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.790072 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.791008 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.791325 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.791374 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.807619 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.809089 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.809266 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.809359 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.809457 4904 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 20:11:41 crc kubenswrapper[4904]: E1205 20:11:41.810028 4904 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.166:6443: connect: connection refused" node="crc" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.899571 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.899919 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.900392 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.900809 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.901002 4904 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.901234 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.901460 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.901659 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.901910 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.902169 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.902392 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.902603 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.902803 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.903222 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:11:41 crc kubenswrapper[4904]: I1205 20:11:41.903452 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.005053 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.005170 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.005215 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.005245 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.005279 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.005342 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.005387 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.005417 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:11:42 crc 
kubenswrapper[4904]: I1205 20:11:42.005447 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.005477 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.005504 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.005531 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.005560 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.005589 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.005617 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.005771 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.005854 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.005921 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.005961 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.005960 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.006000 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.005984 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.005970 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.006013 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.006025 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.006030 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.006020 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.005832 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.006107 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.006047 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.010296 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.011191 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.011222 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.011234 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.011256 4904 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 20:11:42 crc kubenswrapper[4904]: E1205 20:11:42.011515 4904 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.166:6443: connect: connection refused" node="crc" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.114476 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.139307 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 20:11:42 crc kubenswrapper[4904]: W1205 20:11:42.141533 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-c6dbaeaa66386f945ed871594b411956bc5b4cf15ac6140fc21664a4172d7635 WatchSource:0}: Error finding container c6dbaeaa66386f945ed871594b411956bc5b4cf15ac6140fc21664a4172d7635: Status 404 returned error can't find the container with id c6dbaeaa66386f945ed871594b411956bc5b4cf15ac6140fc21664a4172d7635 Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.147689 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.163223 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.167623 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:11:42 crc kubenswrapper[4904]: W1205 20:11:42.168024 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-db04568b81da30e63f6dac7c310ae7baf5dd66e26cc2c3c9625ef6a8d786c5d8 WatchSource:0}: Error finding container db04568b81da30e63f6dac7c310ae7baf5dd66e26cc2c3c9625ef6a8d786c5d8: Status 404 returned error can't find the container with id db04568b81da30e63f6dac7c310ae7baf5dd66e26cc2c3c9625ef6a8d786c5d8 Dec 05 20:11:42 crc kubenswrapper[4904]: E1205 20:11:42.179207 4904 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.166:6443: connect: connection refused" interval="800ms" Dec 05 20:11:42 crc kubenswrapper[4904]: W1205 20:11:42.181654 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-76cb1a8b5bb09bd84f82f8982797034b56daf30b63b79d856ada85e059539ec1 WatchSource:0}: Error finding container 76cb1a8b5bb09bd84f82f8982797034b56daf30b63b79d856ada85e059539ec1: Status 404 returned error can't find the container with id 76cb1a8b5bb09bd84f82f8982797034b56daf30b63b79d856ada85e059539ec1 Dec 05 20:11:42 crc kubenswrapper[4904]: W1205 20:11:42.199204 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-aed7294c14bbef5707da9905d421efb09c8787fdbc37efceb8af96445305fe97 WatchSource:0}: Error finding container aed7294c14bbef5707da9905d421efb09c8787fdbc37efceb8af96445305fe97: Status 404 returned error can't find the container with id aed7294c14bbef5707da9905d421efb09c8787fdbc37efceb8af96445305fe97 Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.411587 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.413700 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.413749 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.413765 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.413796 4904 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 20:11:42 crc kubenswrapper[4904]: E1205 20:11:42.414246 4904 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.166:6443: connect: connection refused" node="crc" Dec 05 20:11:42 crc kubenswrapper[4904]: W1205 20:11:42.468661 4904 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.166:6443: connect: connection refused Dec 05 20:11:42 crc kubenswrapper[4904]: E1205 20:11:42.468780 4904 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.166:6443: connect: connection refused" logger="UnhandledError" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.572349 4904 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.166:6443: connect: connection refused Dec 05 20:11:42 crc kubenswrapper[4904]: W1205 20:11:42.629911 4904 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.166:6443: connect: connection refused Dec 05 20:11:42 crc kubenswrapper[4904]: E1205 20:11:42.630004 4904 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.166:6443: connect: connection refused" logger="UnhandledError" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.685713 4904 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="7af781963f87b1e5868f7add030bcb0ca2a049a69f84f8c137f90ab266abc504" exitCode=0 Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.685777 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"7af781963f87b1e5868f7add030bcb0ca2a049a69f84f8c137f90ab266abc504"} Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.685954 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"db04568b81da30e63f6dac7c310ae7baf5dd66e26cc2c3c9625ef6a8d786c5d8"} Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.686236 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.687467 4904 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="edab3d954e07cd7d8c0a3ad643b636d5d1382b32f805b370cd79c8462d9f279e" exitCode=0 Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.687498 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"edab3d954e07cd7d8c0a3ad643b636d5d1382b32f805b370cd79c8462d9f279e"} Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.687536 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"97e6da60798c69924c7efdb9bfc88a39cba8f74783543bab3358c72f1e386e2a"} Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.687625 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.688387 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.688414 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.688426 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.688702 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.688764 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.688790 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.690706 4904 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="7fce601ec5b7e54b14668b59d90721a107f392ed41a54f1a5b0446d48fbdb50f" exitCode=0 Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.690765 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"7fce601ec5b7e54b14668b59d90721a107f392ed41a54f1a5b0446d48fbdb50f"} Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.690785 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c6dbaeaa66386f945ed871594b411956bc5b4cf15ac6140fc21664a4172d7635"} Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.690854 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.692132 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.692196 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.692222 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.694126 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"08fb5cbfee6c3da72b94b14ed486036e65308444c961b583e691cc89e604a11a"} Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.694162 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"aed7294c14bbef5707da9905d421efb09c8787fdbc37efceb8af96445305fe97"} Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.696438 4904 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969" exitCode=0 Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.696486 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969"} Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.696508 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"76cb1a8b5bb09bd84f82f8982797034b56daf30b63b79d856ada85e059539ec1"} Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.696599 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.713535 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.713580 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.713592 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.715902 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.717129 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.717223 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:42 crc kubenswrapper[4904]: I1205 20:11:42.717250 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:42 crc kubenswrapper[4904]: W1205 20:11:42.849583 4904 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.166:6443: connect: connection refused Dec 05 20:11:42 crc kubenswrapper[4904]: E1205 20:11:42.849659 4904 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.166:6443: connect: connection refused" logger="UnhandledError" Dec 05 20:11:42 crc kubenswrapper[4904]: E1205 20:11:42.980256 4904 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.166:6443: connect: connection refused" interval="1.6s" Dec 05 20:11:43 crc kubenswrapper[4904]: W1205 20:11:43.125344 4904 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.166:6443: connect: connection refused Dec 05 20:11:43 crc kubenswrapper[4904]: E1205 20:11:43.125434 4904 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get 
\"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.166:6443: connect: connection refused" logger="UnhandledError" Dec 05 20:11:43 crc kubenswrapper[4904]: I1205 20:11:43.214891 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:43 crc kubenswrapper[4904]: I1205 20:11:43.216986 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:43 crc kubenswrapper[4904]: I1205 20:11:43.217025 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:43 crc kubenswrapper[4904]: I1205 20:11:43.217037 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:43 crc kubenswrapper[4904]: I1205 20:11:43.217083 4904 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 20:11:43 crc kubenswrapper[4904]: E1205 20:11:43.217595 4904 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.166:6443: connect: connection refused" node="crc" Dec 05 20:11:43 crc kubenswrapper[4904]: I1205 20:11:43.573043 4904 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.166:6443: connect: connection refused Dec 05 20:11:43 crc kubenswrapper[4904]: I1205 20:11:43.729421 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9"} Dec 05 20:11:43 crc kubenswrapper[4904]: I1205 20:11:43.729467 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173"} Dec 05 20:11:43 crc kubenswrapper[4904]: I1205 20:11:43.729480 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46"} Dec 05 20:11:43 crc kubenswrapper[4904]: I1205 20:11:43.731508 4904 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="e741278dba59b30f47060d3e62e32c8a756299a7b2167ecbc638de8774d19ca2" exitCode=0 Dec 05 20:11:43 crc kubenswrapper[4904]: I1205 20:11:43.731563 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"e741278dba59b30f47060d3e62e32c8a756299a7b2167ecbc638de8774d19ca2"} Dec 05 20:11:43 crc kubenswrapper[4904]: I1205 20:11:43.731685 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:43 crc kubenswrapper[4904]: I1205 20:11:43.732481 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:43 crc kubenswrapper[4904]: I1205 20:11:43.732505 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:11:43 crc kubenswrapper[4904]: I1205 20:11:43.732515 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:43 crc kubenswrapper[4904]: I1205 20:11:43.734734 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"36e445d3f09be60c5ad19cef1b1690e64bafa28df150dc57f95c520d57840e54"} Dec 05 20:11:43 crc kubenswrapper[4904]: I1205 20:11:43.734884 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:43 crc kubenswrapper[4904]: I1205 20:11:43.738859 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:43 crc kubenswrapper[4904]: I1205 20:11:43.738879 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:43 crc kubenswrapper[4904]: I1205 20:11:43.738888 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:43 crc kubenswrapper[4904]: I1205 20:11:43.740757 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b22659d9c71fc60ad0348f1031cb52b72af9f93c69c11c04c140aece5e90efb8"} Dec 05 20:11:43 crc kubenswrapper[4904]: I1205 20:11:43.740911 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"dbeab9029a6d82f443afa458697942e9ce20465077f1adefc0123dfab2ea76a2"} Dec 05 20:11:43 crc kubenswrapper[4904]: I1205 20:11:43.741165 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7dc00b34dfd0d80ded26bfcc542e84550c50fe8dfabb2eab1f8b39c0402930ac"} Dec 05 20:11:43 crc kubenswrapper[4904]: I1205 20:11:43.741134 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:43 crc kubenswrapper[4904]: I1205 20:11:43.742587 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:43 crc kubenswrapper[4904]: I1205 20:11:43.742694 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:43 crc kubenswrapper[4904]: I1205 20:11:43.742774 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:43 crc kubenswrapper[4904]: I1205 20:11:43.748259 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"736f67bbeab4ebc7e144d6d4af34487d18c3c8362425e45a9e0969b61d896666"} Dec 05 20:11:43 crc kubenswrapper[4904]: I1205 20:11:43.748317 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ec935cce3c02816ac1f6162ca3a1701cd06dd930d3024ff958a4d92cf1dc0821"} Dec 05 
20:11:43 crc kubenswrapper[4904]: I1205 20:11:43.748333 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"073f99fe786948da31c584468c3562984559623b590ba77c7925b79a1bb6d973"} Dec 05 20:11:43 crc kubenswrapper[4904]: I1205 20:11:43.748480 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:43 crc kubenswrapper[4904]: I1205 20:11:43.749338 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:43 crc kubenswrapper[4904]: I1205 20:11:43.749387 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:43 crc kubenswrapper[4904]: I1205 20:11:43.749404 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:44 crc kubenswrapper[4904]: I1205 20:11:44.572300 4904 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.166:6443: connect: connection refused Dec 05 20:11:44 crc kubenswrapper[4904]: I1205 20:11:44.755892 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b235f3c19ca851a68486ee1fc60d79c1676b86049bdfe492b290dcc150f1d243"} Dec 05 20:11:44 crc kubenswrapper[4904]: I1205 20:11:44.755951 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899"} Dec 05 20:11:44 crc kubenswrapper[4904]: I1205 20:11:44.756137 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:44 crc kubenswrapper[4904]: I1205 20:11:44.757934 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:44 crc kubenswrapper[4904]: I1205 20:11:44.757990 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:44 crc kubenswrapper[4904]: I1205 20:11:44.758074 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:44 crc kubenswrapper[4904]: I1205 20:11:44.760461 4904 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="1a193fae598e80347b4d71f1bd8f42fcf7763ee192e045cf31d8b01b6d4cd9b2" exitCode=0 Dec 05 20:11:44 crc kubenswrapper[4904]: I1205 20:11:44.760556 4904 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 20:11:44 crc kubenswrapper[4904]: I1205 20:11:44.760554 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"1a193fae598e80347b4d71f1bd8f42fcf7763ee192e045cf31d8b01b6d4cd9b2"} Dec 05 20:11:44 crc kubenswrapper[4904]: I1205 20:11:44.760659 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:44 crc kubenswrapper[4904]: I1205 20:11:44.760584 4904 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:44 crc kubenswrapper[4904]: I1205 20:11:44.760776 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:44 crc kubenswrapper[4904]: I1205 20:11:44.761960 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:44 crc kubenswrapper[4904]: I1205 20:11:44.761997 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:44 crc kubenswrapper[4904]: I1205 20:11:44.761960 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:44 crc kubenswrapper[4904]: I1205 20:11:44.762031 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:44 crc kubenswrapper[4904]: I1205 20:11:44.762043 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:44 crc kubenswrapper[4904]: I1205 20:11:44.762015 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:44 crc kubenswrapper[4904]: I1205 20:11:44.762231 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:44 crc kubenswrapper[4904]: I1205 20:11:44.762247 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:44 crc kubenswrapper[4904]: I1205 20:11:44.762258 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:44 crc kubenswrapper[4904]: I1205 20:11:44.818526 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:44 crc kubenswrapper[4904]: I1205 20:11:44.819956 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:44 crc kubenswrapper[4904]: I1205 20:11:44.819996 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:44 crc kubenswrapper[4904]: I1205 20:11:44.820010 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:44 crc kubenswrapper[4904]: I1205 20:11:44.820035 4904 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 20:11:45 crc kubenswrapper[4904]: I1205 20:11:45.766364 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"fbf237124c42d769fc343d82f37582d14835ea484e9ef3bb521722a48418e851"} Dec 05 20:11:45 crc kubenswrapper[4904]: I1205 20:11:45.766427 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e62d7532fa2241853c93ebe569d5ef934a82ef4701d4e516f20352b2e070bf1a"} Dec 05 20:11:45 crc kubenswrapper[4904]: I1205 20:11:45.766438 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:45 crc kubenswrapper[4904]: I1205 20:11:45.766448 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a7b0b0f8f3e9b05993a25e026a181fdb97624631f47b77184d3ba365597dc5bb"} Dec 05 20:11:45 crc kubenswrapper[4904]: I1205 20:11:45.766464 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"130be3b8744e0533d7aeb82742b24da26471330dad1fddb19c7845b56265bf2c"} Dec 05 20:11:45 crc kubenswrapper[4904]: I1205 20:11:45.766505 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:11:45 crc kubenswrapper[4904]: I1205 20:11:45.767134 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:45 crc kubenswrapper[4904]: I1205 20:11:45.767163 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:45 crc kubenswrapper[4904]: I1205 20:11:45.767179 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:46 crc kubenswrapper[4904]: I1205 20:11:46.168771 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:11:46 crc kubenswrapper[4904]: I1205 20:11:46.790233 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"00927a167a3c9cf5e409e536afed4af9633cc66a7b92184d2534ef69199550ae"} Dec 05 20:11:46 crc kubenswrapper[4904]: I1205 20:11:46.790308 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:46 crc kubenswrapper[4904]: I1205 20:11:46.790502 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:46 crc kubenswrapper[4904]: I1205 20:11:46.791342 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:46 crc kubenswrapper[4904]: I1205 20:11:46.791376 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:46 crc kubenswrapper[4904]: I1205 20:11:46.791386 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:46 crc kubenswrapper[4904]: I1205 20:11:46.792197 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:46 crc kubenswrapper[4904]: I1205 20:11:46.792266 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:46 crc kubenswrapper[4904]: I1205 20:11:46.792293 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:47 crc kubenswrapper[4904]: I1205 20:11:47.180675 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 05 20:11:47 crc kubenswrapper[4904]: I1205 20:11:47.396034 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:11:47 crc kubenswrapper[4904]: I1205 20:11:47.793721 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:47 crc kubenswrapper[4904]: I1205 
20:11:47.793797 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:47 crc kubenswrapper[4904]: I1205 20:11:47.795312 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:47 crc kubenswrapper[4904]: I1205 20:11:47.795378 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:47 crc kubenswrapper[4904]: I1205 20:11:47.795423 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:47 crc kubenswrapper[4904]: I1205 20:11:47.795435 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:47 crc kubenswrapper[4904]: I1205 20:11:47.795392 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:47 crc kubenswrapper[4904]: I1205 20:11:47.795534 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:48 crc kubenswrapper[4904]: I1205 20:11:48.089534 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:11:48 crc kubenswrapper[4904]: I1205 20:11:48.089883 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:48 crc kubenswrapper[4904]: I1205 20:11:48.091520 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:48 crc kubenswrapper[4904]: I1205 20:11:48.091571 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:48 crc kubenswrapper[4904]: I1205 20:11:48.091586 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:48 crc kubenswrapper[4904]: I1205 20:11:48.795910 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:48 crc kubenswrapper[4904]: I1205 20:11:48.796328 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:48 crc kubenswrapper[4904]: I1205 20:11:48.797331 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:48 crc kubenswrapper[4904]: I1205 20:11:48.797371 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:48 crc kubenswrapper[4904]: I1205 20:11:48.797381 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:48 crc kubenswrapper[4904]: I1205 20:11:48.797416 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:48 crc kubenswrapper[4904]: I1205 20:11:48.797426 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:48 crc kubenswrapper[4904]: I1205 20:11:48.797383 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:48 crc kubenswrapper[4904]: I1205 20:11:48.911433 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 20:11:48 crc kubenswrapper[4904]: I1205 20:11:48.911685 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:48 crc kubenswrapper[4904]: I1205 20:11:48.913241 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:48 crc kubenswrapper[4904]: I1205 20:11:48.913303 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:48 crc kubenswrapper[4904]: I1205 20:11:48.913327 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:50 crc kubenswrapper[4904]: I1205 20:11:50.374793 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:11:50 crc kubenswrapper[4904]: I1205 20:11:50.375270 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:50 crc kubenswrapper[4904]: I1205 20:11:50.377438 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:50 crc kubenswrapper[4904]: I1205 20:11:50.377501 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:50 crc kubenswrapper[4904]: I1205 20:11:50.377528 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:50 crc kubenswrapper[4904]: I1205 20:11:50.381160 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:11:50 crc kubenswrapper[4904]: I1205 20:11:50.659279 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:11:50 crc kubenswrapper[4904]: I1205 20:11:50.802178 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:50 crc kubenswrapper[4904]: I1205 20:11:50.803485 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:50 crc kubenswrapper[4904]: I1205 20:11:50.803545 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:50 crc kubenswrapper[4904]: I1205 20:11:50.803557 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:51 crc kubenswrapper[4904]: I1205 20:11:51.090277 4904 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 05 20:11:51 crc kubenswrapper[4904]: I1205 20:11:51.090596 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 20:11:51 crc 
kubenswrapper[4904]: I1205 20:11:51.346436 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 05 20:11:51 crc kubenswrapper[4904]: I1205 20:11:51.346711 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:51 crc kubenswrapper[4904]: I1205 20:11:51.348364 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:51 crc kubenswrapper[4904]: I1205 20:11:51.348410 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:51 crc kubenswrapper[4904]: I1205 20:11:51.348423 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:51 crc kubenswrapper[4904]: E1205 20:11:51.713417 4904 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 05 20:11:51 crc kubenswrapper[4904]: I1205 20:11:51.804365 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:51 crc kubenswrapper[4904]: I1205 20:11:51.805565 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:51 crc kubenswrapper[4904]: I1205 20:11:51.805605 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:51 crc kubenswrapper[4904]: I1205 20:11:51.805619 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:51 crc kubenswrapper[4904]: I1205 20:11:51.817190 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:11:52 crc kubenswrapper[4904]: I1205 20:11:52.806316 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:52 crc kubenswrapper[4904]: I1205 20:11:52.807890 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:52 crc kubenswrapper[4904]: I1205 20:11:52.807925 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:52 crc kubenswrapper[4904]: I1205 20:11:52.807935 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:52 crc kubenswrapper[4904]: I1205 20:11:52.810111 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:11:53 crc kubenswrapper[4904]: I1205 20:11:53.810205 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:53 crc kubenswrapper[4904]: I1205 20:11:53.811602 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:53 crc kubenswrapper[4904]: I1205 20:11:53.811644 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:53 crc kubenswrapper[4904]: I1205 20:11:53.811654 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:54 crc kubenswrapper[4904]: E1205 20:11:54.581039 4904 controller.go:145] 
"Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Dec 05 20:11:54 crc kubenswrapper[4904]: E1205 20:11:54.820994 4904 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": net/http: TLS handshake timeout" node="crc" Dec 05 20:11:55 crc kubenswrapper[4904]: W1205 20:11:55.013181 4904 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 05 20:11:55 crc kubenswrapper[4904]: I1205 20:11:55.013336 4904 trace.go:236] Trace[1097255667]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Dec-2025 20:11:45.012) (total time: 10001ms): Dec 05 20:11:55 crc kubenswrapper[4904]: Trace[1097255667]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (20:11:55.013) Dec 05 20:11:55 crc kubenswrapper[4904]: Trace[1097255667]: [10.001188616s] [10.001188616s] END Dec 05 20:11:55 crc kubenswrapper[4904]: E1205 20:11:55.013370 4904 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 05 20:11:55 crc kubenswrapper[4904]: W1205 20:11:55.096917 4904 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 05 20:11:55 crc kubenswrapper[4904]: I1205 20:11:55.097093 4904 trace.go:236] Trace[1232926169]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Dec-2025 20:11:45.095) (total time: 10001ms): Dec 05 20:11:55 crc kubenswrapper[4904]: Trace[1232926169]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (20:11:55.096) Dec 05 20:11:55 crc kubenswrapper[4904]: Trace[1232926169]: [10.001321002s] [10.001321002s] END Dec 05 20:11:55 crc kubenswrapper[4904]: E1205 20:11:55.097133 4904 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 05 20:11:55 crc kubenswrapper[4904]: I1205 20:11:55.121293 4904 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:56640->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 05 20:11:55 crc kubenswrapper[4904]: I1205 20:11:55.121371 4904 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:56640->192.168.126.11:17697: read: connection reset by peer" Dec 05 20:11:55 crc kubenswrapper[4904]: W1205 20:11:55.459830 4904 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 05 20:11:55 crc kubenswrapper[4904]: I1205 20:11:55.459977 4904 trace.go:236] Trace[1289853941]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Dec-2025 20:11:45.458) (total time: 10001ms): Dec 05 20:11:55 crc kubenswrapper[4904]: Trace[1289853941]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (20:11:55.459) Dec 05 20:11:55 crc kubenswrapper[4904]: Trace[1289853941]: [10.001332544s] [10.001332544s] END Dec 05 20:11:55 crc kubenswrapper[4904]: E1205 20:11:55.460010 4904 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 05 20:11:55 crc kubenswrapper[4904]: I1205 20:11:55.573899 4904 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 05 20:11:55 crc kubenswrapper[4904]: I1205 20:11:55.817872 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 05 20:11:55 crc kubenswrapper[4904]: I1205 20:11:55.821389 4904 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b235f3c19ca851a68486ee1fc60d79c1676b86049bdfe492b290dcc150f1d243" exitCode=255 Dec 05 20:11:55 crc kubenswrapper[4904]: I1205 20:11:55.821445 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b235f3c19ca851a68486ee1fc60d79c1676b86049bdfe492b290dcc150f1d243"} Dec 05 20:11:55 crc kubenswrapper[4904]: I1205 20:11:55.821687 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:55 crc kubenswrapper[4904]: I1205 20:11:55.822852 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:55 crc kubenswrapper[4904]: I1205 20:11:55.822925 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:55 crc kubenswrapper[4904]: I1205 20:11:55.822948 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:55 crc kubenswrapper[4904]: I1205 20:11:55.823848 4904 scope.go:117] "RemoveContainer" containerID="b235f3c19ca851a68486ee1fc60d79c1676b86049bdfe492b290dcc150f1d243" Dec 05 20:11:55 crc 
kubenswrapper[4904]: I1205 20:11:55.979754 4904 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 05 20:11:55 crc kubenswrapper[4904]: I1205 20:11:55.979856 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 05 20:11:55 crc kubenswrapper[4904]: I1205 20:11:55.990212 4904 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 05 20:11:55 crc kubenswrapper[4904]: I1205 20:11:55.990301 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 05 20:11:56 crc kubenswrapper[4904]: I1205 20:11:56.826843 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 05 20:11:56 crc kubenswrapper[4904]: I1205 20:11:56.829004 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe"} Dec 05 20:11:56 crc kubenswrapper[4904]: I1205 20:11:56.829195 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:56 crc kubenswrapper[4904]: I1205 20:11:56.830231 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:56 crc kubenswrapper[4904]: I1205 20:11:56.830272 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:56 crc kubenswrapper[4904]: I1205 20:11:56.830286 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:57 crc kubenswrapper[4904]: I1205 20:11:57.202092 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 05 20:11:57 crc kubenswrapper[4904]: I1205 20:11:57.202291 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:57 crc kubenswrapper[4904]: I1205 20:11:57.203480 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:57 crc kubenswrapper[4904]: I1205 20:11:57.203529 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:57 crc kubenswrapper[4904]: I1205 20:11:57.203542 4904 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:57 crc kubenswrapper[4904]: I1205 20:11:57.215652 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 05 20:11:57 crc kubenswrapper[4904]: I1205 20:11:57.402021 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:11:57 crc kubenswrapper[4904]: I1205 20:11:57.831717 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:57 crc kubenswrapper[4904]: I1205 20:11:57.831742 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:57 crc kubenswrapper[4904]: I1205 20:11:57.831806 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:11:57 crc kubenswrapper[4904]: I1205 20:11:57.832799 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:57 crc kubenswrapper[4904]: I1205 20:11:57.832833 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:57 crc kubenswrapper[4904]: I1205 20:11:57.832845 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:57 crc kubenswrapper[4904]: I1205 20:11:57.833696 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:57 crc kubenswrapper[4904]: I1205 20:11:57.833726 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:57 crc kubenswrapper[4904]: I1205 20:11:57.833736 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:57 crc kubenswrapper[4904]: I1205 20:11:57.838390 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:11:58 crc kubenswrapper[4904]: I1205 20:11:58.021412 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:58 crc kubenswrapper[4904]: I1205 20:11:58.022684 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:58 crc kubenswrapper[4904]: I1205 20:11:58.022769 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:58 crc kubenswrapper[4904]: I1205 20:11:58.022790 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:58 crc kubenswrapper[4904]: I1205 20:11:58.022824 4904 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 20:11:58 crc kubenswrapper[4904]: E1205 20:11:58.026918 4904 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 05 20:11:58 crc kubenswrapper[4904]: I1205 20:11:58.834097 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:58 crc kubenswrapper[4904]: I1205 20:11:58.835050 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 05 20:11:58 crc kubenswrapper[4904]: I1205 20:11:58.835096 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:58 crc kubenswrapper[4904]: I1205 20:11:58.835106 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:59 crc kubenswrapper[4904]: I1205 20:11:59.836974 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:11:59 crc kubenswrapper[4904]: I1205 20:11:59.838213 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:59 crc kubenswrapper[4904]: I1205 20:11:59.838271 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:59 crc kubenswrapper[4904]: I1205 20:11:59.838287 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:00 crc kubenswrapper[4904]: I1205 20:12:00.184246 4904 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 05 20:12:00 crc kubenswrapper[4904]: I1205 20:12:00.272892 4904 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 05 20:12:00 crc kubenswrapper[4904]: I1205 20:12:00.543605 4904 apiserver.go:52] "Watching apiserver" Dec 05 20:12:00 crc kubenswrapper[4904]: I1205 20:12:00.548667 4904 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 05 20:12:00 crc kubenswrapper[4904]: I1205 20:12:00.549290 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h"] Dec 05 20:12:00 crc kubenswrapper[4904]: I1205 20:12:00.549782 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 20:12:00 crc kubenswrapper[4904]: I1205 20:12:00.550097 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:00 crc kubenswrapper[4904]: I1205 20:12:00.550260 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 20:12:00 crc kubenswrapper[4904]: E1205 20:12:00.550375 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:12:00 crc kubenswrapper[4904]: I1205 20:12:00.550425 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:00 crc kubenswrapper[4904]: I1205 20:12:00.550485 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 20:12:00 crc kubenswrapper[4904]: E1205 20:12:00.550671 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:12:00 crc kubenswrapper[4904]: I1205 20:12:00.551014 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:00 crc kubenswrapper[4904]: E1205 20:12:00.551190 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:12:00 crc kubenswrapper[4904]: I1205 20:12:00.552927 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 05 20:12:00 crc kubenswrapper[4904]: I1205 20:12:00.553133 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 05 20:12:00 crc kubenswrapper[4904]: I1205 20:12:00.553482 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 05 20:12:00 crc kubenswrapper[4904]: I1205 20:12:00.553716 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 05 20:12:00 crc kubenswrapper[4904]: I1205 20:12:00.553557 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 05 20:12:00 crc kubenswrapper[4904]: I1205 20:12:00.554129 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 05 20:12:00 crc kubenswrapper[4904]: I1205 20:12:00.554197 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 05 20:12:00 crc kubenswrapper[4904]: I1205 20:12:00.554246 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 05 20:12:00 crc kubenswrapper[4904]: I1205 20:12:00.554662 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 05 20:12:00 crc kubenswrapper[4904]: I1205 20:12:00.575479 4904 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 05 20:12:00 crc kubenswrapper[4904]: I1205 20:12:00.591307 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:12:00 crc kubenswrapper[4904]: I1205 20:12:00.609561 4904 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 05 20:12:00 crc kubenswrapper[4904]: I1205 20:12:00.609700 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:12:00 crc kubenswrapper[4904]: I1205 20:12:00.619923 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:12:00 crc kubenswrapper[4904]: I1205 20:12:00.632036 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:12:00 crc kubenswrapper[4904]: I1205 20:12:00.643026 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:12:00 crc kubenswrapper[4904]: I1205 20:12:00.652376 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:12:00 crc kubenswrapper[4904]: I1205 20:12:00.662576 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:12:00 crc kubenswrapper[4904]: I1205 20:12:00.675962 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:12:00 crc kubenswrapper[4904]: I1205 20:12:00.994635 4904 trace.go:236] Trace[505590370]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Dec-2025 20:11:46.130) (total time: 14863ms): Dec 05 20:12:00 crc kubenswrapper[4904]: Trace[505590370]: ---"Objects listed" error: 14863ms (20:12:00.994) Dec 05 20:12:00 crc kubenswrapper[4904]: Trace[505590370]: [14.863928259s] [14.863928259s] END Dec 05 20:12:00 crc kubenswrapper[4904]: I1205 20:12:00.994669 4904 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 05 20:12:00 crc kubenswrapper[4904]: I1205 20:12:00.996294 4904 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.090129 4904 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.090202 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.097219 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.097265 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.097290 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.097339 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.097620 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.097673 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.097712 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.097737 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.097698 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.097802 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.097824 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.098513 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.097895 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.097891 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.098048 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.098043 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.098241 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.098261 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.098433 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.098657 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.098682 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.098725 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.098748 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.098858 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.098958 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.098991 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.099118 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.099140 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.099182 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.099203 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.099271 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.099408 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.099461 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.099490 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.099540 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.099460 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.099497 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.099566 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.099510 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.099633 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.099648 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.099654 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.099669 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.099882 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.099812 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.099842 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.099910 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.099989 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.100023 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.100048 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.100091 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.100117 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.100141 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.100165 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.100187 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.100209 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.100232 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.100241 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.100255 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.100260 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.100279 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.100302 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.100326 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.100350 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.100370 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: 
\"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.100393 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.100414 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.100436 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.100460 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.100465 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.100483 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.100506 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.100500 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.100528 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.100555 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.100578 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.100601 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.100622 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.100642 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.100665 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.100700 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.100720 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.100741 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.100762 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.100784 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.100806 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.100830 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.100850 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.100893 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.100913 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.100939 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.100960 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.100983 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" 
(UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.101008 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.101031 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.101088 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.101112 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.101134 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.101157 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.101179 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.101199 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.101221 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.101241 4904 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.101262 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.101283 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.101303 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.101324 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.101346 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.101367 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.101387 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.101430 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.101454 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.101478 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.101497 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.101516 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.101542 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.101565 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.101586 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.101607 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.101627 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.101654 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.101676 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 
20:12:01.101696 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.101718 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.101740 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.101761 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.101780 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.101801 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.101824 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.101845 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.101876 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.101898 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 05 20:12:01 crc 
kubenswrapper[4904]: I1205 20:12:01.101920 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.101942 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.101965 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.101987 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102005 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102028 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102049 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102084 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102105 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102126 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 05 
20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102148 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102168 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102191 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102217 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102238 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102258 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102280 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102301 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102321 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102344 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 20:12:01 crc 
kubenswrapper[4904]: I1205 20:12:01.102367 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102388 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102414 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102437 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102454 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102470 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102487 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102503 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102519 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102536 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod 
\"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102551 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102566 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102582 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102598 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102614 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102630 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102645 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102660 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102679 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102703 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102726 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102749 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102773 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102884 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102914 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102937 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102958 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102980 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103003 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103046 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: 
\"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103091 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103116 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103139 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103161 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103184 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103202 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103221 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103240 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103256 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103273 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103291 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103307 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103324 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103340 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103356 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103372 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103391 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103414 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103440 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103462 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103484 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103508 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103532 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103554 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103577 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103601 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103623 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103645 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103669 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103693 4904 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103717 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103742 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103767 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103791 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103815 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103839 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103867 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103895 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103920 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103970 4904 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104001 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104029 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104055 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104099 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104166 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104197 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104223 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104254 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104282 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104307 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104331 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104356 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104381 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104454 4904 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104470 4904 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104486 4904 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104500 4904 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104513 4904 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104526 4904 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104542 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104555 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104569 4904 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104581 4904 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104595 4904 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104608 4904 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104621 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104636 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104650 4904 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104663 4904 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104677 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104689 4904 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104702 4904 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104714 4904 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104728 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104740 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104753 4904 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104768 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104783 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104798 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.100982 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.101012 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.101215 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.101314 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.101316 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.101435 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.101603 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102415 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102716 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102797 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102965 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102965 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.102977 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103034 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103319 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103378 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103748 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103773 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103828 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103841 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.114307 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.103948 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104044 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104100 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104164 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104143 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104551 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104587 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104669 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104833 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104871 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.104973 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.105125 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.105089 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.105232 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.105431 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.105269 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.105530 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.105542 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.105976 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.106103 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.106248 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.106360 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.106637 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.107184 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.107293 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.107492 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.107562 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.107603 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.107635 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.108177 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.108270 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.109010 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.108658 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.110147 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.110161 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.110330 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.111684 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.111707 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.112163 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.112288 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.113875 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.114846 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.115164 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.115205 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.115313 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.115695 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.115805 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.115857 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). 
InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.116392 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.116504 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.116525 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.116643 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.117086 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.117159 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.117880 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.119587 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.119683 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.119881 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.120103 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.120923 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.123492 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.121380 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.120876 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.120945 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.122158 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.122219 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.122271 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.122399 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.122470 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.122525 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.123638 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.122655 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.122865 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.123368 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.124744 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.124803 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.125069 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.125113 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.125307 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.125606 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.125619 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.125785 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.125793 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.125878 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.126098 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.126028 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.126627 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.126878 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.125668 4904 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.127275 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.127319 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.127445 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.128015 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.128084 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.128106 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.128203 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.128215 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.128400 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 20:12:01 crc kubenswrapper[4904]: E1205 20:12:01.128539 4904 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:12:01 crc kubenswrapper[4904]: E1205 20:12:01.128817 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:12:01.628578851 +0000 UTC m=+20.439794960 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.128917 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: E1205 20:12:01.129001 4904 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.129543 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.130528 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.130697 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.129596 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.131115 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.131447 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.131496 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.136598 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.136607 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.136933 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.137083 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 20:12:01 crc kubenswrapper[4904]: E1205 20:12:01.137269 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:12:01.63111269 +0000 UTC m=+20.442328799 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.137291 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.137357 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.137421 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.137425 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.131010 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.137598 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.137706 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.137742 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.137917 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.137914 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). 
InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.138183 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.137802 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.138573 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.138654 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.138807 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.138885 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.139053 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.139211 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.139269 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.139315 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.139466 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.139474 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.139514 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: E1205 20:12:01.139651 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:01.639609303 +0000 UTC m=+20.450825412 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.139671 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.139756 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-8v9t9"] Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.140076 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8v9t9" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.140358 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.141134 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.141544 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.141573 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.141933 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.142047 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.142561 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.143020 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.143184 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.145694 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.147624 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.147676 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.147702 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.150041 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.150285 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.152046 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.152796 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.153916 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.154169 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: E1205 20:12:01.154484 4904 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:12:01 crc kubenswrapper[4904]: E1205 20:12:01.154521 4904 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:12:01 crc kubenswrapper[4904]: E1205 20:12:01.154600 4904 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:12:01 crc kubenswrapper[4904]: E1205 20:12:01.154789 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 20:12:01.654767088 +0000 UTC m=+20.465983197 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:12:01 crc kubenswrapper[4904]: E1205 20:12:01.156291 4904 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:12:01 crc kubenswrapper[4904]: E1205 20:12:01.156330 4904 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:12:01 crc kubenswrapper[4904]: E1205 20:12:01.156342 4904 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:12:01 crc kubenswrapper[4904]: E1205 20:12:01.156380 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 20:12:01.656370622 +0000 UTC m=+20.467586731 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.156762 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.158995 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.159145 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.159788 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.166546 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.171742 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.174156 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.174286 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.184431 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.185518 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:12:01 crc kubenswrapper[4904]: W1205 20:12:01.188964 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-aff4492d9b286cfc37e1b3c5f9c6e6424f667dad175146fd44deb66239a1b31f WatchSource:0}: Error finding container aff4492d9b286cfc37e1b3c5f9c6e6424f667dad175146fd44deb66239a1b31f: Status 404 returned error can't find the container with id aff4492d9b286cfc37e1b3c5f9c6e6424f667dad175146fd44deb66239a1b31f Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.194715 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8v9t9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78ec080e-d24e-458b-8622-465dd74773a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj68j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8v9t9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.209374 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.209445 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/78ec080e-d24e-458b-8622-465dd74773a2-hosts-file\") pod \"node-resolver-8v9t9\" (UID: \"78ec080e-d24e-458b-8622-465dd74773a2\") " pod="openshift-dns/node-resolver-8v9t9" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.209472 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj68j\" (UniqueName: \"kubernetes.io/projected/78ec080e-d24e-458b-8622-465dd74773a2-kube-api-access-fj68j\") pod \"node-resolver-8v9t9\" (UID: \"78ec080e-d24e-458b-8622-465dd74773a2\") " pod="openshift-dns/node-resolver-8v9t9" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.209490 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.209493 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.209530 4904 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc 
kubenswrapper[4904]: I1205 20:12:01.209543 4904 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.209572 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.209590 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.209599 4904 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.209624 4904 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.209633 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.209643 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.209652 4904 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.209660 4904 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.209670 4904 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.209679 4904 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.209704 4904 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.209714 4904 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 
20:12:01.209724 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.209734 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.209746 4904 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.209755 4904 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.209680 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.209779 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.209804 4904 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.209813 4904 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.209822 4904 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.209830 4904 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.209838 4904 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.209846 4904 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.209855 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: 
\"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.209863 4904 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.209871 4904 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.209880 4904 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.209888 4904 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.209896 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.209904 4904 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.209914 4904 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.209922 4904 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.209932 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.209940 4904 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.209948 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.209956 4904 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.209964 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" 
(UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.209972 4904 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.209980 4904 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.209987 4904 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.209995 4904 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210003 4904 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210010 4904 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210018 4904 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210030 4904 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210038 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210046 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210071 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210084 4904 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210094 4904 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210102 4904 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210111 4904 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210119 4904 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210127 4904 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210135 4904 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210143 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210152 4904 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210160 4904 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210168 4904 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210185 4904 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210194 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210202 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210210 4904 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210218 4904 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210226 4904 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210235 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210243 4904 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210250 4904 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210258 4904 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210267 4904 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210274 4904 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210283 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210292 4904 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210300 4904 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210309 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210317 4904 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210326 4904 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210335 4904 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210343 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210351 4904 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210358 4904 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210366 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210374 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210382 4904 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210389 4904 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210414 4904 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210423 4904 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210432 4904 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc 
kubenswrapper[4904]: I1205 20:12:01.210440 4904 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210448 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210459 4904 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210467 4904 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210475 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210483 4904 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210490 4904 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210498 4904 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210506 4904 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210514 4904 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210522 4904 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210530 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210538 4904 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 
20:12:01.210546 4904 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210555 4904 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210563 4904 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210572 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210580 4904 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210589 4904 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210597 4904 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210606 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210614 4904 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210622 4904 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210630 4904 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210644 4904 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210652 4904 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc 
kubenswrapper[4904]: I1205 20:12:01.210660 4904 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210670 4904 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210678 4904 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210687 4904 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210694 4904 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210702 4904 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210710 4904 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210720 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210731 4904 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210740 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210750 4904 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210758 4904 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210767 4904 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210775 4904 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210784 4904 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210792 4904 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210800 4904 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210809 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210817 4904 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210947 4904 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210958 4904 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210967 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210975 4904 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210983 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.210991 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.211000 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.211008 4904 reconciler_common.go:293] "Volume 
detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.211018 4904 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.211025 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.211034 4904 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.211041 4904 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.211049 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.211131 4904 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.211140 4904 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.211148 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.211156 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.211164 4904 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.211172 4904 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.211180 4904 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.211189 4904 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.211196 4904 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.211204 4904 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.211212 4904 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.211228 4904 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.211236 4904 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.211245 4904 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.211253 4904 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.211261 4904 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.211270 4904 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.211279 4904 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.211287 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.211294 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.211850 4904 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.221583 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.232666 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.312294 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/78ec080e-d24e-458b-8622-465dd74773a2-hosts-file\") pod \"node-resolver-8v9t9\" (UID: \"78ec080e-d24e-458b-8622-465dd74773a2\") " pod="openshift-dns/node-resolver-8v9t9" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.312416 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fj68j\" (UniqueName: \"kubernetes.io/projected/78ec080e-d24e-458b-8622-465dd74773a2-kube-api-access-fj68j\") pod \"node-resolver-8v9t9\" (UID: \"78ec080e-d24e-458b-8622-465dd74773a2\") " pod="openshift-dns/node-resolver-8v9t9" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.312477 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/78ec080e-d24e-458b-8622-465dd74773a2-hosts-file\") pod \"node-resolver-8v9t9\" (UID: \"78ec080e-d24e-458b-8622-465dd74773a2\") " pod="openshift-dns/node-resolver-8v9t9" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.329991 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj68j\" (UniqueName: \"kubernetes.io/projected/78ec080e-d24e-458b-8622-465dd74773a2-kube-api-access-fj68j\") pod \"node-resolver-8v9t9\" (UID: \"78ec080e-d24e-458b-8622-465dd74773a2\") " pod="openshift-dns/node-resolver-8v9t9" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.469932 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-gfzvv"] Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.470278 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.471658 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-52vmw"] Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.472188 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-67k68"] Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.472360 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-ffd2h"] Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.472580 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.475167 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-67k68" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.475877 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-52vmw" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.476568 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.476885 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.476955 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.477070 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.477684 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.477755 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.477821 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.477885 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.477968 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8v9t9" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.478017 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.479797 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.479826 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.480433 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.489492 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.489835 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.490190 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.490365 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.490565 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.492465 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:12:01 crc kubenswrapper[4904]: W1205 20:12:01.496136 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78ec080e_d24e_458b_8622_465dd74773a2.slice/crio-b9215bdc093edadffa76c9fda6677bb7242b97f6425aead9589a0e639ca9b89b WatchSource:0}: Error finding container b9215bdc093edadffa76c9fda6677bb7242b97f6425aead9589a0e639ca9b89b: Status 404 returned error can't find the container with id b9215bdc093edadffa76c9fda6677bb7242b97f6425aead9589a0e639ca9b89b Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.499078 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.505549 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.516106 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4ef1f1c8-4ced-4af5-80b0-404e1f6f8796-system-cni-dir\") pod \"multus-additional-cni-plugins-52vmw\" (UID: \"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\") " pod="openshift-multus/multus-additional-cni-plugins-52vmw" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.516148 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4ef1f1c8-4ced-4af5-80b0-404e1f6f8796-cnibin\") pod \"multus-additional-cni-plugins-52vmw\" (UID: \"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\") " pod="openshift-multus/multus-additional-cni-plugins-52vmw" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.516179 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-cnibin\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.516200 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-476kk\" (UniqueName: \"kubernetes.io/projected/1cc24b64-e25f-4b55-9123-295388685e7a-kube-api-access-476kk\") pod \"machine-config-daemon-ffd2h\" (UID: \"1cc24b64-e25f-4b55-9123-295388685e7a\") " pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.516219 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4ef1f1c8-4ced-4af5-80b0-404e1f6f8796-cni-binary-copy\") pod \"multus-additional-cni-plugins-52vmw\" (UID: \"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\") " pod="openshift-multus/multus-additional-cni-plugins-52vmw" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.516236 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4ef1f1c8-4ced-4af5-80b0-404e1f6f8796-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-52vmw\" (UID: \"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\") " pod="openshift-multus/multus-additional-cni-plugins-52vmw" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 
20:12:01.516253 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-hostroot\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.516271 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-multus-daemon-config\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.516288 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzqtj\" (UniqueName: \"kubernetes.io/projected/37ff8a0c-1191-4afd-8bc7-1b18fac7e568-kube-api-access-nzqtj\") pod \"node-ca-67k68\" (UID: \"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\") " pod="openshift-image-registry/node-ca-67k68" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.516308 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-host-run-k8s-cni-cncf-io\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.516324 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4ef1f1c8-4ced-4af5-80b0-404e1f6f8796-tuning-conf-dir\") pod \"multus-additional-cni-plugins-52vmw\" (UID: \"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\") " pod="openshift-multus/multus-additional-cni-plugins-52vmw" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.516344 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-host-var-lib-cni-bin\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.516375 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1cc24b64-e25f-4b55-9123-295388685e7a-mcd-auth-proxy-config\") pod \"machine-config-daemon-ffd2h\" (UID: \"1cc24b64-e25f-4b55-9123-295388685e7a\") " pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.516392 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49m2v\" (UniqueName: \"kubernetes.io/projected/4ef1f1c8-4ced-4af5-80b0-404e1f6f8796-kube-api-access-49m2v\") pod \"multus-additional-cni-plugins-52vmw\" (UID: \"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\") " pod="openshift-multus/multus-additional-cni-plugins-52vmw" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.516413 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-multus-cni-dir\") pod \"multus-gfzvv\" (UID: 
\"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.516430 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-host-var-lib-kubelet\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.516446 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1cc24b64-e25f-4b55-9123-295388685e7a-proxy-tls\") pod \"machine-config-daemon-ffd2h\" (UID: \"1cc24b64-e25f-4b55-9123-295388685e7a\") " pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.516478 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4ef1f1c8-4ced-4af5-80b0-404e1f6f8796-os-release\") pod \"multus-additional-cni-plugins-52vmw\" (UID: \"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\") " pod="openshift-multus/multus-additional-cni-plugins-52vmw" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.516495 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/1cc24b64-e25f-4b55-9123-295388685e7a-rootfs\") pod \"machine-config-daemon-ffd2h\" (UID: \"1cc24b64-e25f-4b55-9123-295388685e7a\") " pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.516513 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-cni-binary-copy\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.516532 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-host-var-lib-cni-multus\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.516549 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-host-run-netns\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.516570 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-host-run-multus-certs\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.516588 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/37ff8a0c-1191-4afd-8bc7-1b18fac7e568-host\") pod 
\"node-ca-67k68\" (UID: \"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\") " pod="openshift-image-registry/node-ca-67k68" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.516604 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/37ff8a0c-1191-4afd-8bc7-1b18fac7e568-serviceca\") pod \"node-ca-67k68\" (UID: \"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\") " pod="openshift-image-registry/node-ca-67k68" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.516628 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-multus-socket-dir-parent\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.516663 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6gph\" (UniqueName: \"kubernetes.io/projected/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-kube-api-access-s6gph\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.516832 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-multus-conf-dir\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.516854 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-etc-kubernetes\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.516880 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-os-release\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.516908 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-system-cni-dir\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.524840 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gfzvv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6gph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gfzvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.541033 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.553208 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.561817 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.571589 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8v9t9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78ec080e-d24e-458b-8622-465dd74773a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj68j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8v9t9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.586664 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.596714 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.604506 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8v9t9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78ec080e-d24e-458b-8622-465dd74773a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj68j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8v9t9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.615859 4904 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cc24b64-e25f-4b55-9123-295388685e7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffd2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.618841 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-cni-binary-copy\") pod \"multus-gfzvv\" (UID: 
\"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.618910 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-host-var-lib-cni-multus\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.618957 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/1cc24b64-e25f-4b55-9123-295388685e7a-rootfs\") pod \"machine-config-daemon-ffd2h\" (UID: \"1cc24b64-e25f-4b55-9123-295388685e7a\") " pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.619035 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-multus-socket-dir-parent\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.619087 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-host-run-netns\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.619113 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-host-run-multus-certs\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.619132 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/37ff8a0c-1191-4afd-8bc7-1b18fac7e568-host\") pod \"node-ca-67k68\" (UID: \"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\") " pod="openshift-image-registry/node-ca-67k68" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.619169 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/37ff8a0c-1191-4afd-8bc7-1b18fac7e568-serviceca\") pod \"node-ca-67k68\" (UID: \"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\") " pod="openshift-image-registry/node-ca-67k68" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.619197 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6gph\" (UniqueName: \"kubernetes.io/projected/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-kube-api-access-s6gph\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.619233 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-multus-conf-dir\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.619251 4904 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-etc-kubernetes\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.619276 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-os-release\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.619320 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-system-cni-dir\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.619559 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-multus-conf-dir\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.620424 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/37ff8a0c-1191-4afd-8bc7-1b18fac7e568-serviceca\") pod \"node-ca-67k68\" (UID: \"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\") " pod="openshift-image-registry/node-ca-67k68" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.621220 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-cni-binary-copy\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.621257 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-host-var-lib-cni-multus\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.621400 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-os-release\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.621434 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-system-cni-dir\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.621456 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-host-run-netns\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 
20:12:01.621450 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-etc-kubernetes\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.621492 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-multus-socket-dir-parent\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.621515 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-host-run-multus-certs\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.621549 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/37ff8a0c-1191-4afd-8bc7-1b18fac7e568-host\") pod \"node-ca-67k68\" (UID: \"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\") " pod="openshift-image-registry/node-ca-67k68" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.622566 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4ef1f1c8-4ced-4af5-80b0-404e1f6f8796-system-cni-dir\") pod \"multus-additional-cni-plugins-52vmw\" (UID: \"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\") " pod="openshift-multus/multus-additional-cni-plugins-52vmw" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.622598 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/1cc24b64-e25f-4b55-9123-295388685e7a-rootfs\") pod \"machine-config-daemon-ffd2h\" (UID: \"1cc24b64-e25f-4b55-9123-295388685e7a\") " pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.619371 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4ef1f1c8-4ced-4af5-80b0-404e1f6f8796-system-cni-dir\") pod \"multus-additional-cni-plugins-52vmw\" (UID: \"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\") " pod="openshift-multus/multus-additional-cni-plugins-52vmw" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.626312 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4ef1f1c8-4ced-4af5-80b0-404e1f6f8796-cnibin\") pod \"multus-additional-cni-plugins-52vmw\" (UID: \"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\") " pod="openshift-multus/multus-additional-cni-plugins-52vmw" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.626355 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-cnibin\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.626407 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/4ef1f1c8-4ced-4af5-80b0-404e1f6f8796-cni-binary-copy\") pod \"multus-additional-cni-plugins-52vmw\" (UID: \"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\") " pod="openshift-multus/multus-additional-cni-plugins-52vmw" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.626445 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4ef1f1c8-4ced-4af5-80b0-404e1f6f8796-cnibin\") pod \"multus-additional-cni-plugins-52vmw\" (UID: \"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\") " pod="openshift-multus/multus-additional-cni-plugins-52vmw" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.626462 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4ef1f1c8-4ced-4af5-80b0-404e1f6f8796-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-52vmw\" (UID: \"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\") " pod="openshift-multus/multus-additional-cni-plugins-52vmw" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.626530 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-476kk\" (UniqueName: \"kubernetes.io/projected/1cc24b64-e25f-4b55-9123-295388685e7a-kube-api-access-476kk\") pod \"machine-config-daemon-ffd2h\" (UID: \"1cc24b64-e25f-4b55-9123-295388685e7a\") " pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.626561 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-hostroot\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.626587 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-multus-daemon-config\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.626613 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzqtj\" (UniqueName: \"kubernetes.io/projected/37ff8a0c-1191-4afd-8bc7-1b18fac7e568-kube-api-access-nzqtj\") pod \"node-ca-67k68\" (UID: \"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\") " pod="openshift-image-registry/node-ca-67k68" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.626636 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-host-run-k8s-cni-cncf-io\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.626662 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4ef1f1c8-4ced-4af5-80b0-404e1f6f8796-tuning-conf-dir\") pod \"multus-additional-cni-plugins-52vmw\" (UID: \"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\") " pod="openshift-multus/multus-additional-cni-plugins-52vmw" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.626687 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" 
(UniqueName: \"kubernetes.io/host-path/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-host-var-lib-cni-bin\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.626728 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1cc24b64-e25f-4b55-9123-295388685e7a-mcd-auth-proxy-config\") pod \"machine-config-daemon-ffd2h\" (UID: \"1cc24b64-e25f-4b55-9123-295388685e7a\") " pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.626775 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4ef1f1c8-4ced-4af5-80b0-404e1f6f8796-os-release\") pod \"multus-additional-cni-plugins-52vmw\" (UID: \"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\") " pod="openshift-multus/multus-additional-cni-plugins-52vmw" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.626830 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-cnibin\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.626803 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49m2v\" (UniqueName: \"kubernetes.io/projected/4ef1f1c8-4ced-4af5-80b0-404e1f6f8796-kube-api-access-49m2v\") pod \"multus-additional-cni-plugins-52vmw\" (UID: \"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\") " pod="openshift-multus/multus-additional-cni-plugins-52vmw" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.626873 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-multus-cni-dir\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.626896 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-host-var-lib-kubelet\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.626917 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1cc24b64-e25f-4b55-9123-295388685e7a-proxy-tls\") pod \"machine-config-daemon-ffd2h\" (UID: \"1cc24b64-e25f-4b55-9123-295388685e7a\") " pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.627296 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-host-run-k8s-cni-cncf-io\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.627387 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-hostroot\") pod 
\"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.627439 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-host-var-lib-cni-bin\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.627527 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4ef1f1c8-4ced-4af5-80b0-404e1f6f8796-tuning-conf-dir\") pod \"multus-additional-cni-plugins-52vmw\" (UID: \"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\") " pod="openshift-multus/multus-additional-cni-plugins-52vmw" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.627587 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4ef1f1c8-4ced-4af5-80b0-404e1f6f8796-os-release\") pod \"multus-additional-cni-plugins-52vmw\" (UID: \"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\") " pod="openshift-multus/multus-additional-cni-plugins-52vmw" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.627701 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-multus-cni-dir\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.627768 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-host-var-lib-kubelet\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.627971 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4ef1f1c8-4ced-4af5-80b0-404e1f6f8796-cni-binary-copy\") pod \"multus-additional-cni-plugins-52vmw\" (UID: \"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\") " pod="openshift-multus/multus-additional-cni-plugins-52vmw" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.628079 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4ef1f1c8-4ced-4af5-80b0-404e1f6f8796-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-52vmw\" (UID: \"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\") " pod="openshift-multus/multus-additional-cni-plugins-52vmw" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.628273 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-multus-daemon-config\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.628339 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1cc24b64-e25f-4b55-9123-295388685e7a-mcd-auth-proxy-config\") pod \"machine-config-daemon-ffd2h\" (UID: \"1cc24b64-e25f-4b55-9123-295388685e7a\") " 
pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.633348 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1cc24b64-e25f-4b55-9123-295388685e7a-proxy-tls\") pod \"machine-config-daemon-ffd2h\" (UID: \"1cc24b64-e25f-4b55-9123-295388685e7a\") " pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.645089 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-52vmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-52vmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.646582 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6gph\" (UniqueName: \"kubernetes.io/projected/5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea-kube-api-access-s6gph\") pod \"multus-gfzvv\" (UID: \"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\") " pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.661588 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-476kk\" (UniqueName: 
\"kubernetes.io/projected/1cc24b64-e25f-4b55-9123-295388685e7a-kube-api-access-476kk\") pod \"machine-config-daemon-ffd2h\" (UID: \"1cc24b64-e25f-4b55-9123-295388685e7a\") " pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.662200 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzqtj\" (UniqueName: \"kubernetes.io/projected/37ff8a0c-1191-4afd-8bc7-1b18fac7e568-kube-api-access-nzqtj\") pod \"node-ca-67k68\" (UID: \"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\") " pod="openshift-image-registry/node-ca-67k68" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.662880 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49m2v\" (UniqueName: \"kubernetes.io/projected/4ef1f1c8-4ced-4af5-80b0-404e1f6f8796-kube-api-access-49m2v\") pod \"multus-additional-cni-plugins-52vmw\" (UID: \"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\") " pod="openshift-multus/multus-additional-cni-plugins-52vmw" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.685891 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.686980 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.688519 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.693882 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.696643 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.697464 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.698777 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.699798 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.700555 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.702032 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.702748 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.703914 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.705461 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.706856 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.708485 4904 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.709435 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.710682 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.711593 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.712517 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.716737 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.716923 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.717692 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.718691 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.722294 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.722925 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.724597 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.725851 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.726661 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.727450 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.727528 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.727558 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.727585 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:01 crc kubenswrapper[4904]: E1205 20:12:01.727643 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:02.727624221 +0000 UTC m=+21.538840330 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.727670 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:01 crc kubenswrapper[4904]: E1205 20:12:01.727692 4904 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:12:01 crc kubenswrapper[4904]: E1205 20:12:01.727750 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:12:02.727731624 +0000 UTC m=+21.538947733 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:12:01 crc kubenswrapper[4904]: E1205 20:12:01.727777 4904 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:12:01 crc kubenswrapper[4904]: E1205 20:12:01.727793 4904 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:12:01 crc kubenswrapper[4904]: E1205 20:12:01.727804 4904 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:12:01 crc kubenswrapper[4904]: E1205 20:12:01.727835 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 20:12:02.727829147 +0000 UTC m=+21.539045256 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:12:01 crc kubenswrapper[4904]: E1205 20:12:01.727868 4904 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:12:01 crc kubenswrapper[4904]: E1205 20:12:01.727886 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:12:02.727881058 +0000 UTC m=+21.539097167 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:12:01 crc kubenswrapper[4904]: E1205 20:12:01.727922 4904 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:12:01 crc kubenswrapper[4904]: E1205 20:12:01.727931 4904 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:12:01 crc kubenswrapper[4904]: E1205 20:12:01.727939 4904 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:12:01 crc kubenswrapper[4904]: E1205 20:12:01.727959 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 20:12:02.72795388 +0000 UTC m=+21.539169989 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.727976 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.728633 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.729383 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.729867 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.730541 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.731169 4904 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.732158 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" 
Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.736996 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.737731 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.738876 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.741401 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.742199 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.743156 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.743829 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.744773 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.745378 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.745353 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.746894 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.748313 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.748926 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.749819 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.750381 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.751376 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.752374 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.752750 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-67k68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzqtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-67k68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.752885 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.753799 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.754391 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.754886 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.755839 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.756324 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.770341 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.778999 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gfzvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6gph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gfzvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.787969 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.795555 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.799077 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-gfzvv" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.800974 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-67k68" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzqtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-67k68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.810473 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:12:01 crc kubenswrapper[4904]: W1205 20:12:01.811210 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fcfb250_f7e5_4ae1_9c49_43a68e8de9ea.slice/crio-9a886bf8322dc7b9281c41030ae506e281770fb1d47d7af55bc6a2f2e105e897 WatchSource:0}: Error finding container 9a886bf8322dc7b9281c41030ae506e281770fb1d47d7af55bc6a2f2e105e897: Status 404 returned error can't find the container with id 9a886bf8322dc7b9281c41030ae506e281770fb1d47d7af55bc6a2f2e105e897 Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.815283 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.825581 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-dsvd6"] Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.826457 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.827532 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.829498 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.829774 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.829773 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.829947 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.829893 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.830178 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.831245 4904 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.838943 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-52vmw" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.839481 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-67k68" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.844021 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gfzvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\
"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6gph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gfzvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.853110 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8v9t9" event={"ID":"78ec080e-d24e-458b-8622-465dd74773a2","Type":"ContainerStarted","Data":"b9215bdc093edadffa76c9fda6677bb7242b97f6425aead9589a0e639ca9b89b"} Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.855373 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"1aa73309acebd7dcba36d2990a8cf1f6ef1fbf78086a1455b158945063e1da3e"} Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.855415 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"22b0deb0769ed0ae3221e161c0aab68192a5d5d1f511ccfaf8c6ba2f01334de7"} Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.855426 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"aff4492d9b286cfc37e1b3c5f9c6e6424f667dad175146fd44deb66239a1b31f"} Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.857642 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.857912 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8v9t9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78ec080e-d24e-458b-8622-465dd74773a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj68j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8v9t9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.858115 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.861752 4904 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe" exitCode=255 Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.861834 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe"} Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.861889 4904 scope.go:117] "RemoveContainer" containerID="b235f3c19ca851a68486ee1fc60d79c1676b86049bdfe492b290dcc150f1d243" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.864359 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" event={"ID":"1cc24b64-e25f-4b55-9123-295388685e7a","Type":"ContainerStarted","Data":"174a6a8ceadadc2a7c38654adc11382d08e6ad77435b359ccb921bf4d884c6d4"} Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.867849 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gfzvv" event={"ID":"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea","Type":"ContainerStarted","Data":"9a886bf8322dc7b9281c41030ae506e281770fb1d47d7af55bc6a2f2e105e897"} Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.869081 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cc24b64-e25f-4b55-9123-295388685e7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffd2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.869878 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.869903 4904 scope.go:117] "RemoveContainer" 
containerID="eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe" Dec 05 20:12:01 crc kubenswrapper[4904]: E1205 20:12:01.870139 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.872713 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"2476a4fc54ec72811c1f0bf5e880ad5ede67b90d0298cbc8294be0ac58175a6e"} Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.879182 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"7de842e60202f3a833e47386d70c6c5b61a6cdb130109fb127a00408549fa5e3"} Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.882083 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-52vmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-52vmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.895439 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:01 crc kubenswrapper[4904]: W1205 20:12:01.896706 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ef1f1c8_4ced_4af5_80b0_404e1f6f8796.slice/crio-066cf01faa25e7615dd44c2115556fb44771faad146fd0383f3e755dc486178a WatchSource:0}: Error finding container 066cf01faa25e7615dd44c2115556fb44771faad146fd0383f3e755dc486178a: Status 404 returned error can't find the container with id 066cf01faa25e7615dd44c2115556fb44771faad146fd0383f3e755dc486178a Dec 05 20:12:01 crc kubenswrapper[4904]: W1205 20:12:01.897587 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37ff8a0c_1191_4afd_8bc7_1b18fac7e568.slice/crio-2a2e4092337d9bc655f81009bd91943f26b1e06e368814c74b6a1fde015d5f88 WatchSource:0}: Error finding container 2a2e4092337d9bc655f81009bd91943f26b1e06e368814c74b6a1fde015d5f88: Status 404 returned error can't find the container with id 2a2e4092337d9bc655f81009bd91943f26b1e06e368814c74b6a1fde015d5f88 Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.914662 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.928505 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.928576 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-host-slash\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.928597 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-var-lib-openvswitch\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.928619 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-etc-openvswitch\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.928641 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/55fbdf03-712c-4abc-9847-225fe63052e3-ovnkube-config\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.928664 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/55fbdf03-712c-4abc-9847-225fe63052e3-env-overrides\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.928718 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d7kb\" (UniqueName: \"kubernetes.io/projected/55fbdf03-712c-4abc-9847-225fe63052e3-kube-api-access-2d7kb\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.928740 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-host-cni-bin\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.928767 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/55fbdf03-712c-4abc-9847-225fe63052e3-ovnkube-script-lib\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.928862 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-host-cni-netd\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.929360 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-host-kubelet\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.929393 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-run-systemd\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.929429 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-host-run-netns\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.929454 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-host-run-ovn-kubernetes\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.929475 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/55fbdf03-712c-4abc-9847-225fe63052e3-ovn-node-metrics-cert\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.929920 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-log-socket\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.929976 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-run-ovn\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.930004 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-systemd-units\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.930045 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-run-openvswitch\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.930119 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-node-log\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:01 crc kubenswrapper[4904]: I1205 20:12:01.972358 4904 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-52vmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\
\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-52vmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.003781 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.015459 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8v9t9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78ec080e-d24e-458b-8622-465dd74773a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj68j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8v9t9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:02Z is 
after 2025-08-24T17:21:41Z" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.030853 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d7kb\" (UniqueName: \"kubernetes.io/projected/55fbdf03-712c-4abc-9847-225fe63052e3-kube-api-access-2d7kb\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.030901 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-host-cni-bin\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.030926 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/55fbdf03-712c-4abc-9847-225fe63052e3-ovnkube-script-lib\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.030948 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-host-cni-netd\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.030968 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-host-kubelet\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.030991 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-run-systemd\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.031038 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-host-run-netns\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.031079 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-host-run-ovn-kubernetes\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.031101 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/55fbdf03-712c-4abc-9847-225fe63052e3-ovn-node-metrics-cert\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:02 crc 
kubenswrapper[4904]: I1205 20:12:02.031124 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-log-socket\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.031145 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-run-ovn\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.031165 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-systemd-units\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.031184 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-run-openvswitch\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.031202 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-node-log\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.031239 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.031281 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-host-slash\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.031302 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-var-lib-openvswitch\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.031321 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-etc-openvswitch\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.031390 4904 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/55fbdf03-712c-4abc-9847-225fe63052e3-ovnkube-config\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.031417 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/55fbdf03-712c-4abc-9847-225fe63052e3-env-overrides\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.032095 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/55fbdf03-712c-4abc-9847-225fe63052e3-env-overrides\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.032163 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-log-socket\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.032449 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-host-cni-bin\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.032488 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-run-systemd\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.032519 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-host-run-ovn-kubernetes\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.032494 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-run-ovn\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.032532 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-run-openvswitch\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.032552 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-node-log\") pod \"ovnkube-node-dsvd6\" 
(UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.032589 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-var-lib-openvswitch\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.032494 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-host-run-netns\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.032613 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-etc-openvswitch\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.032618 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-host-cni-netd\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.032534 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-systemd-units\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.032647 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-host-slash\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.032559 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-host-kubelet\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.032736 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.033125 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/55fbdf03-712c-4abc-9847-225fe63052e3-ovnkube-script-lib\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 
20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.033164 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/55fbdf03-712c-4abc-9847-225fe63052e3-ovnkube-config\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.037088 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/55fbdf03-712c-4abc-9847-225fe63052e3-ovn-node-metrics-cert\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.038626 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cc24b64-e25f-4b55-9123-295388685e7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffd2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.054998 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d7kb\" (UniqueName: \"kubernetes.io/projected/55fbdf03-712c-4abc-9847-225fe63052e3-kube-api-access-2d7kb\") pod \"ovnkube-node-dsvd6\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.057181 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.079127 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"519e4f29-5762-4a86-9f31-eae681598fd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b235f3c19ca851a68486ee1fc60d79c1676b86049bdfe492b290dcc150f1d243\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:11:55Z\\\",\\\"message\\\":\\\"W1205 20:11:44.368769 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1205 
20:11:44.369123 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764965504 cert, and key in /tmp/serving-cert-677171729/serving-signer.crt, /tmp/serving-cert-677171729/serving-signer.key\\\\nI1205 20:11:44.566485 1 observer_polling.go:159] Starting file observer\\\\nW1205 20:11:44.569157 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1205 20:11:44.569489 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:11:44.570475 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-677171729/tls.crt::/tmp/serving-cert-677171729/tls.key\\\\\\\"\\\\nF1205 20:11:55.115359 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"(now=2025-12-05 20:12:01.575139975 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575358 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764965521\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764965521\\\\\\\\\\\\\\\" (2025-12-05 19:12:01 +0000 UTC to 2026-12-05 19:12:01 +0000 UTC (now=2025-12-05 20:12:01.57533306 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575383 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 20:12:01.575413 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 20:12:01.575451 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575485 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575526 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-341598907/tls.crt::/tmp/serving-cert-341598907/tls.key\\\\\\\"\\\\nI1205 20:12:01.575666 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1205 20:12:01.575882 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 20:12:01.575895 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 20:12:01.576170 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 20:12:01.576169 1 envvar.go:172] \\\\\\\"Feature gate default 
state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 20:12:01.576203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 20:12:01.576188 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.093711 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-67k68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzqtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-67k68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.117015 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55fbdf03-712c-4abc-9847-225fe63052e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574532
65a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-node-dsvd6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.133591 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.142121 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.153777 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aa73309acebd7dcba36d2990a8cf1f6ef1fbf78086a1455b158945063e1da3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b0deb0769ed0ae3221e161c0aab68192a5d5d1f511ccfaf8c6ba2f01334de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:02 crc kubenswrapper[4904]: W1205 20:12:02.157123 4904 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55fbdf03_712c_4abc_9847_225fe63052e3.slice/crio-3c9a83d11af8ca2c49877b8fd19d374d3069afd010660d8308d1f3279747c330 WatchSource:0}: Error finding container 3c9a83d11af8ca2c49877b8fd19d374d3069afd010660d8308d1f3279747c330: Status 404 returned error can't find the container with id 3c9a83d11af8ca2c49877b8fd19d374d3069afd010660d8308d1f3279747c330 Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.172250 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.187185 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.200726 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gfzvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6gph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gfzvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.680888 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.680951 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.681072 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:02 crc kubenswrapper[4904]: E1205 20:12:02.681163 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:12:02 crc kubenswrapper[4904]: E1205 20:12:02.681223 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:12:02 crc kubenswrapper[4904]: E1205 20:12:02.681364 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.738660 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.738791 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:02 crc kubenswrapper[4904]: E1205 20:12:02.738807 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:04.738784131 +0000 UTC m=+23.550000240 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.738840 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.738865 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.738890 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:02 crc kubenswrapper[4904]: E1205 20:12:02.739036 4904 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:12:02 crc kubenswrapper[4904]: E1205 20:12:02.739077 4904 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:12:02 crc kubenswrapper[4904]: E1205 20:12:02.739093 4904 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:12:02 crc kubenswrapper[4904]: E1205 20:12:02.739139 4904 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:12:02 crc kubenswrapper[4904]: E1205 20:12:02.739156 4904 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:12:02 crc kubenswrapper[4904]: E1205 20:12:02.739192 4904 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:12:02 crc kubenswrapper[4904]: E1205 20:12:02.739036 4904 
configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:12:02 crc kubenswrapper[4904]: E1205 20:12:02.739192 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 20:12:04.739152941 +0000 UTC m=+23.550369050 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:12:02 crc kubenswrapper[4904]: E1205 20:12:02.739234 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:12:04.739225363 +0000 UTC m=+23.550441472 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:12:02 crc kubenswrapper[4904]: E1205 20:12:02.739248 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 20:12:04.739241634 +0000 UTC m=+23.550457743 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:12:02 crc kubenswrapper[4904]: E1205 20:12:02.739252 4904 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:12:02 crc kubenswrapper[4904]: E1205 20:12:02.739438 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:12:04.739385888 +0000 UTC m=+23.550602177 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.883888 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.887096 4904 scope.go:117] "RemoveContainer" containerID="eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe" Dec 05 20:12:02 crc kubenswrapper[4904]: E1205 20:12:02.887272 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.888012 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" event={"ID":"1cc24b64-e25f-4b55-9123-295388685e7a","Type":"ContainerStarted","Data":"aa5a898deb51018e4a1196537853e1d8fda8121a2b87d6d3ebb4ab04ad6a0507"} Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.888048 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" event={"ID":"1cc24b64-e25f-4b55-9123-295388685e7a","Type":"ContainerStarted","Data":"08368507f1a5c9ee1b93c9a2b111939cbe36a99f87758d0de78b146309871438"} Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.889335 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gfzvv" event={"ID":"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea","Type":"ContainerStarted","Data":"6e5e16893370ed8543ecf33c4b2373bda50a424810470178390a553f2755323d"} Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.891142 4904 generic.go:334] "Generic (PLEG): container finished" podID="4ef1f1c8-4ced-4af5-80b0-404e1f6f8796" containerID="91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440" exitCode=0 Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.891220 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-52vmw" event={"ID":"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796","Type":"ContainerDied","Data":"91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440"} Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.891244 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-52vmw" event={"ID":"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796","Type":"ContainerStarted","Data":"066cf01faa25e7615dd44c2115556fb44771faad146fd0383f3e755dc486178a"} Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.892228 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8v9t9" event={"ID":"78ec080e-d24e-458b-8622-465dd74773a2","Type":"ContainerStarted","Data":"b2c927c7ee8c52a2f1bcca785070514c72b8eafa9cd7d4f58d77fd80f1dee6c5"} Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.893824 4904 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"89575271b3bd2ad2c040d9e6b00f221610af97e8dd8b120a2956b75f8bbdd237"} Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.895047 4904 generic.go:334] "Generic (PLEG): container finished" podID="55fbdf03-712c-4abc-9847-225fe63052e3" containerID="f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d" exitCode=0 Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.895090 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" event={"ID":"55fbdf03-712c-4abc-9847-225fe63052e3","Type":"ContainerDied","Data":"f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d"} Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.895127 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" event={"ID":"55fbdf03-712c-4abc-9847-225fe63052e3","Type":"ContainerStarted","Data":"3c9a83d11af8ca2c49877b8fd19d374d3069afd010660d8308d1f3279747c330"} Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.896773 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-67k68" event={"ID":"37ff8a0c-1191-4afd-8bc7-1b18fac7e568","Type":"ContainerStarted","Data":"4cf77c02028027dc6d07f225e1f2352cd58a2a96280315e23f43b50e7c719cfc"} Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.896799 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-67k68" event={"ID":"37ff8a0c-1191-4afd-8bc7-1b18fac7e568","Type":"ContainerStarted","Data":"2a2e4092337d9bc655f81009bd91943f26b1e06e368814c74b6a1fde015d5f88"} Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.903104 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cc24b64-e25f-4b55-9123-295388685e7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffd2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.920710 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-52vmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-52vmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.940486 4904 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.952418 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8v9t9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78ec080e-d24e-458b-8622-465dd74773a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj68j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8v9t9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.966748 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"519e4f29-5762-4a86-9f31-eae681598fd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"(now=2025-12-05 20:12:01.575139975 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575358 1 
named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764965521\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764965521\\\\\\\\\\\\\\\" (2025-12-05 19:12:01 +0000 UTC to 2026-12-05 19:12:01 +0000 UTC (now=2025-12-05 20:12:01.57533306 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575383 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 20:12:01.575413 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 20:12:01.575451 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575485 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575526 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-341598907/tls.crt::/tmp/serving-cert-341598907/tls.key\\\\\\\"\\\\nI1205 20:12:01.575666 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1205 20:12:01.575882 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 20:12:01.575895 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 20:12:01.576170 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 20:12:01.576169 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 20:12:01.576203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 20:12:01.576188 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.981745 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:02 crc kubenswrapper[4904]: I1205 20:12:02.998679 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:03 crc kubenswrapper[4904]: I1205 20:12:03.046956 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-67k68" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzqtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-67k68\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:03 crc kubenswrapper[4904]: I1205 20:12:03.063450 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55fbdf03-712c-4abc-9847-225fe63052e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dsvd6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:03 crc kubenswrapper[4904]: I1205 20:12:03.089132 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:03 crc kubenswrapper[4904]: I1205 20:12:03.110724 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aa73309acebd7dcba36d2990a8cf1f6ef1fbf78086a1455b158945063e1da3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b0deb0769ed0ae3221e161c0aab68192a5d5d1f511ccfaf8c6ba2f01334de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:03 crc kubenswrapper[4904]: I1205 20:12:03.125521 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gfzvv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6gph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gfzvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:03 crc kubenswrapper[4904]: I1205 20:12:03.137193 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:03 crc kubenswrapper[4904]: I1205 20:12:03.150342 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89575271b3bd2ad2c040d9e6b00f221610af97e8dd8b120a2956b75f8bbdd237\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:03 crc kubenswrapper[4904]: I1205 20:12:03.171444 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aa73309acebd7dcba36d2990a8cf1f6ef1fbf78086a1455b158945063e1da3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b0deb0769ed0ae3221e161c0aab68192a5d5d1f511ccfaf8c6ba2f01334de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:03 crc kubenswrapper[4904]: I1205 20:12:03.185451 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:03 crc kubenswrapper[4904]: I1205 20:12:03.202394 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-67k68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cf77c02028027dc6d07f225e1f2352cd58a2a96280315e23f43b50e7c719cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzqtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-67k68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:03 crc kubenswrapper[4904]: I1205 20:12:03.232785 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55fbdf03-712c-4abc-9847-225fe63052e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-li
b\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\
\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dsvd6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:03 crc kubenswrapper[4904]: I1205 20:12:03.252145 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:03 crc kubenswrapper[4904]: I1205 20:12:03.266977 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gfzvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5e16893370ed8543ecf33c4b2373bda50a424810470178390a553f2755323d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6gph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gfzvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:03 crc kubenswrapper[4904]: I1205 20:12:03.281805 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:03 crc kubenswrapper[4904]: I1205 20:12:03.293823 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8v9t9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78ec080e-d24e-458b-8622-465dd74773a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c927c7ee8c52a2f1bcca785070514c72b8eafa9cd7d4f58d77fd80f1dee6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj68j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8v9t9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-05T20:12:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:03 crc kubenswrapper[4904]: I1205 20:12:03.307367 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cc24b64-e25f-4b55-9123-295388685e7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa5a898deb51018e4a1196537853e1d8fda8121a2b87d6d3ebb4ab04ad6a0507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08368507f1a5c9ee1b93c9a2b111939cbe36a99f87758d0de78b146309871438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffd2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:03 crc kubenswrapper[4904]: I1205 20:12:03.325196 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-52vmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\
\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-52vmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:03 crc kubenswrapper[4904]: I1205 20:12:03.348983 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"519e4f29-5762-4a86-9f31-eae681598fd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"(now=2025-12-05 20:12:01.575139975 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575358 1 
named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764965521\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764965521\\\\\\\\\\\\\\\" (2025-12-05 19:12:01 +0000 UTC to 2026-12-05 19:12:01 +0000 UTC (now=2025-12-05 20:12:01.57533306 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575383 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 20:12:01.575413 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 20:12:01.575451 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575485 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575526 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-341598907/tls.crt::/tmp/serving-cert-341598907/tls.key\\\\\\\"\\\\nI1205 20:12:01.575666 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1205 20:12:01.575882 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 20:12:01.575895 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 20:12:01.576170 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 20:12:01.576169 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 20:12:01.576203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 20:12:01.576188 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:03 crc kubenswrapper[4904]: I1205 20:12:03.364403 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:03 crc kubenswrapper[4904]: I1205 20:12:03.902530 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" event={"ID":"55fbdf03-712c-4abc-9847-225fe63052e3","Type":"ContainerStarted","Data":"42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93"} Dec 05 20:12:03 crc kubenswrapper[4904]: I1205 20:12:03.902857 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" event={"ID":"55fbdf03-712c-4abc-9847-225fe63052e3","Type":"ContainerStarted","Data":"a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02"} Dec 05 20:12:03 crc kubenswrapper[4904]: I1205 20:12:03.902868 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" event={"ID":"55fbdf03-712c-4abc-9847-225fe63052e3","Type":"ContainerStarted","Data":"d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51"} Dec 05 20:12:03 crc kubenswrapper[4904]: I1205 20:12:03.902876 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" event={"ID":"55fbdf03-712c-4abc-9847-225fe63052e3","Type":"ContainerStarted","Data":"d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf"} Dec 05 20:12:03 crc kubenswrapper[4904]: I1205 20:12:03.902885 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" event={"ID":"55fbdf03-712c-4abc-9847-225fe63052e3","Type":"ContainerStarted","Data":"a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0"} Dec 05 20:12:03 crc kubenswrapper[4904]: I1205 20:12:03.902894 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" event={"ID":"55fbdf03-712c-4abc-9847-225fe63052e3","Type":"ContainerStarted","Data":"2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e"} Dec 05 20:12:03 crc kubenswrapper[4904]: I1205 20:12:03.904195 
4904 generic.go:334] "Generic (PLEG): container finished" podID="4ef1f1c8-4ced-4af5-80b0-404e1f6f8796" containerID="6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44" exitCode=0 Dec 05 20:12:03 crc kubenswrapper[4904]: I1205 20:12:03.904291 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-52vmw" event={"ID":"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796","Type":"ContainerDied","Data":"6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44"} Dec 05 20:12:03 crc kubenswrapper[4904]: I1205 20:12:03.923513 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89575271b3bd2ad2c040d9e6b00f221610af97e8dd8b120a2956b75f8bbdd237\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:03 crc kubenswrapper[4904]: I1205 20:12:03.939079 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aa73309acebd7dcba36d2990a8cf1f6ef1fbf78086a1455b158945063e1da3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b0deb0769ed0ae3221e161c0aab68192a5d5d1f511ccfaf8c6ba2f01334de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:03 crc kubenswrapper[4904]: I1205 20:12:03.963887 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:03 crc kubenswrapper[4904]: I1205 20:12:03.978799 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-67k68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cf77c02028027dc6d07f225e1f2352cd58a2a96280315e23f43b50e7c719cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzqtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-67k68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:03 crc kubenswrapper[4904]: I1205 20:12:03.998643 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55fbdf03-712c-4abc-9847-225fe63052e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-li
b\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\
\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dsvd6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.014457 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:04Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.029386 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gfzvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5e16893370ed8543ecf33c4b2373bda50a424810470178390a553f2755323d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6gph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gfzvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:04Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.042810 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:04Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.053220 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8v9t9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78ec080e-d24e-458b-8622-465dd74773a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c927c7ee8c52a2f1bcca785070514c72b8eafa9cd7d4f58d77fd80f1dee6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj68j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8v9t9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-05T20:12:04Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.068242 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cc24b64-e25f-4b55-9123-295388685e7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa5a898deb51018e4a1196537853e1d8fda8121a2b87d6d3ebb4ab04ad6a0507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08368507f1a5c9ee1b93c9a2b111939cbe36a99f87758d0de78b146309871438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffd2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:04Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.083920 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-52vmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rel
ease\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-52vmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:04Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.100670 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"519e4f29-5762-4a86-9f31-eae681598fd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"(now=2025-12-05 20:12:01.575139975 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575358 1 
named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764965521\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764965521\\\\\\\\\\\\\\\" (2025-12-05 19:12:01 +0000 UTC to 2026-12-05 19:12:01 +0000 UTC (now=2025-12-05 20:12:01.57533306 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575383 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 20:12:01.575413 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 20:12:01.575451 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575485 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575526 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-341598907/tls.crt::/tmp/serving-cert-341598907/tls.key\\\\\\\"\\\\nI1205 20:12:01.575666 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1205 20:12:01.575882 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 20:12:01.575895 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 20:12:01.576170 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 20:12:01.576169 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 20:12:01.576203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 20:12:01.576188 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:04Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.116812 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:04Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.428036 4904 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.430682 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.430746 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.430767 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.430926 4904 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.441378 4904 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.441742 4904 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.443367 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.443431 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.443458 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.443487 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.443510 4904 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:04Z","lastTransitionTime":"2025-12-05T20:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:04 crc kubenswrapper[4904]: E1205 20:12:04.466100 4904 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"54b04eb7-ddef-4ce3-9daa-6d051611390c\\\",\\\"systemUUID\\\":\\\"cfb4a8d1-bf0d-4f7a-95e4-63a42b6fb559\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:04Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.470540 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.470607 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.470626 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.470651 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.470668 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:04Z","lastTransitionTime":"2025-12-05T20:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:04 crc kubenswrapper[4904]: E1205 20:12:04.492603 4904 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"54b04eb7-ddef-4ce3-9daa-6d051611390c\\\",\\\"systemUUID\\\":\\\"cfb4a8d1-bf0d-4f7a-95e4-63a42b6fb559\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:04Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.497721 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.497782 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.497799 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.497822 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.497840 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:04Z","lastTransitionTime":"2025-12-05T20:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:04 crc kubenswrapper[4904]: E1205 20:12:04.519829 4904 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"54b04eb7-ddef-4ce3-9daa-6d051611390c\\\",\\\"systemUUID\\\":\\\"cfb4a8d1-bf0d-4f7a-95e4-63a42b6fb559\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:04Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.524858 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.524911 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.524928 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.524953 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.524970 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:04Z","lastTransitionTime":"2025-12-05T20:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:04 crc kubenswrapper[4904]: E1205 20:12:04.544986 4904 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"54b04eb7-ddef-4ce3-9daa-6d051611390c\\\",\\\"systemUUID\\\":\\\"cfb4a8d1-bf0d-4f7a-95e4-63a42b6fb559\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:04Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.549470 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.549552 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.549570 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.549596 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.549619 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:04Z","lastTransitionTime":"2025-12-05T20:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:04 crc kubenswrapper[4904]: E1205 20:12:04.569194 4904 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"54b04eb7-ddef-4ce3-9daa-6d051611390c\\\",\\\"systemUUID\\\":\\\"cfb4a8d1-bf0d-4f7a-95e4-63a42b6fb559\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:04Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:04 crc kubenswrapper[4904]: E1205 20:12:04.569359 4904 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.571051 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.571135 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.571155 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.571180 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.571200 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:04Z","lastTransitionTime":"2025-12-05T20:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.674597 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.674699 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.674717 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.674741 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.674757 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:04Z","lastTransitionTime":"2025-12-05T20:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.681053 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.681143 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.681105 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:04 crc kubenswrapper[4904]: E1205 20:12:04.681263 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:12:04 crc kubenswrapper[4904]: E1205 20:12:04.681409 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:12:04 crc kubenswrapper[4904]: E1205 20:12:04.681505 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.758987 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.759199 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:04 crc kubenswrapper[4904]: E1205 20:12:04.759252 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:08.759222079 +0000 UTC m=+27.570438198 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.759311 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.759358 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:04 crc kubenswrapper[4904]: E1205 20:12:04.759379 4904 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.759409 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:04 crc kubenswrapper[4904]: E1205 20:12:04.759469 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:12:08.759441625 +0000 UTC m=+27.570657774 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:12:04 crc kubenswrapper[4904]: E1205 20:12:04.759490 4904 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:12:04 crc kubenswrapper[4904]: E1205 20:12:04.759555 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:12:08.759535577 +0000 UTC m=+27.570751746 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:12:04 crc kubenswrapper[4904]: E1205 20:12:04.759570 4904 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:12:04 crc kubenswrapper[4904]: E1205 20:12:04.759617 4904 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:12:04 crc kubenswrapper[4904]: E1205 20:12:04.759632 4904 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:12:04 crc kubenswrapper[4904]: E1205 20:12:04.759651 4904 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:12:04 crc kubenswrapper[4904]: E1205 20:12:04.759686 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 20:12:08.759675641 +0000 UTC m=+27.570891870 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:12:04 crc kubenswrapper[4904]: E1205 20:12:04.759700 4904 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:12:04 crc kubenswrapper[4904]: E1205 20:12:04.759718 4904 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:12:04 crc kubenswrapper[4904]: E1205 20:12:04.759809 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 20:12:08.759783444 +0000 UTC m=+27.570999573 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.777566 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.777598 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.777609 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.777624 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.777635 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:04Z","lastTransitionTime":"2025-12-05T20:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.843938 4904 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.844740 4904 scope.go:117] "RemoveContainer" containerID="eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe" Dec 05 20:12:04 crc kubenswrapper[4904]: E1205 20:12:04.844964 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.880434 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.880467 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.880476 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.880491 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.880501 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:04Z","lastTransitionTime":"2025-12-05T20:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.910443 4904 generic.go:334] "Generic (PLEG): container finished" podID="4ef1f1c8-4ced-4af5-80b0-404e1f6f8796" containerID="36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a" exitCode=0 Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.910492 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-52vmw" event={"ID":"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796","Type":"ContainerDied","Data":"36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a"} Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.912930 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"b0577f397a9f6d93347cbc4c8693307a1df70636365bb481d910f29aa70a3147"} Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.940941 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-52vmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-52vmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:04Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.957423 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:04Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.970737 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8v9t9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78ec080e-d24e-458b-8622-465dd74773a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c927c7ee8c52a2f1bcca785070514c72b8eafa9cd7d4f58d77fd80f1dee6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj68j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8v9t9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-05T20:12:04Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.982902 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.982975 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.982999 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.983029 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.983054 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:04Z","lastTransitionTime":"2025-12-05T20:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.983911 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cc24b64-e25f-4b55-9123-295388685e7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa5a898deb51018e4a1196537853e1d8fda8121a2b87d6d3ebb4ab04ad6a0507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08368507f1a5c9ee1b93c9a2b111939cbe36a99f87758d0de78b146309871438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffd2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:04Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:04 crc kubenswrapper[4904]: I1205 20:12:04.997926 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:04Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.018403 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"519e4f29-5762-4a86-9f31-eae681598fd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"(now=2025-12-05 20:12:01.575139975 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575358 1 
named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764965521\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764965521\\\\\\\\\\\\\\\" (2025-12-05 19:12:01 +0000 UTC to 2026-12-05 19:12:01 +0000 UTC (now=2025-12-05 20:12:01.57533306 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575383 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 20:12:01.575413 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 20:12:01.575451 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575485 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575526 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-341598907/tls.crt::/tmp/serving-cert-341598907/tls.key\\\\\\\"\\\\nI1205 20:12:01.575666 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1205 20:12:01.575882 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 20:12:01.575895 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 20:12:01.576170 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 20:12:01.576169 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 20:12:01.576203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 20:12:01.576188 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.030969 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-67k68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cf77c02028027dc6d07f225e1f2352cd58a2a96280315e23f43b50e7c719cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzqtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-67k68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.055920 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55fbdf03-712c-4abc-9847-225fe63052e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-li
b\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\
\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dsvd6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.069187 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89575271b3bd2ad2c040d9e6b00f221610af97e8dd8b120a2956b75f8bbdd237\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.082215 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aa73309acebd7dcba36d2990a8cf1f6ef1fbf78086a1455b158945063e1da3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b0deb0769ed0ae3221e161c0aab68192a5d5d1f511ccfaf8c6ba2f01334de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.086629 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.086719 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.086739 4904 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.086798 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.086818 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:05Z","lastTransitionTime":"2025-12-05T20:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.098278 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.112872 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.130389 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gfzvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5e16893370ed8543ecf33c4b2373bda50a424810470178390a553f2755323d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6gph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gfzvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.148844 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"519e4f29-5762-4a86-9f31-eae681598fd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"(now=2025-12-05 20:12:01.575139975 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575358 1 
named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764965521\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764965521\\\\\\\\\\\\\\\" (2025-12-05 19:12:01 +0000 UTC to 2026-12-05 19:12:01 +0000 UTC (now=2025-12-05 20:12:01.57533306 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575383 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 20:12:01.575413 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 20:12:01.575451 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575485 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575526 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-341598907/tls.crt::/tmp/serving-cert-341598907/tls.key\\\\\\\"\\\\nI1205 20:12:01.575666 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1205 20:12:01.575882 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 20:12:01.575895 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 20:12:01.576170 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 20:12:01.576169 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 20:12:01.576203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 20:12:01.576188 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.167805 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.184771 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89575271b3bd2ad2c040d9e6b00f221610af97e8dd8b120a2956b75f8bbdd237\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.189383 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.189455 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.189474 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.189505 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.189523 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:05Z","lastTransitionTime":"2025-12-05T20:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.198929 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aa73309acebd7dcba36d2990a8cf1f6ef1fbf78086a1455b158945063e1da3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b0deb0769ed0ae3221e161c0aab68192a5d5d1f511ccfaf8c6ba2f01334de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.210954 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0577f397a9f6d93347cbc4c8693307a1df70636365bb481d910f29aa70a3147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.222595 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-67k68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cf77c02028027dc6d07f225e1f2352cd58a2a96280315e23f43b50e7c719cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzqtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-67k68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.249771 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55fbdf03-712c-4abc-9847-225fe63052e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-li
b\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\
\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dsvd6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.266745 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.286956 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gfzvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5e16893370ed8543ecf33c4b2373bda50a424810470178390a553f2755323d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6gph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gfzvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.291873 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.291919 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.291932 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.291954 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.291969 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:05Z","lastTransitionTime":"2025-12-05T20:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.303609 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.317248 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8v9t9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78ec080e-d24e-458b-8622-465dd74773a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c927c7ee8c52a2f1bcca785070514c72b8eafa9cd7d4f58d77fd80f1dee6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj68j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8v9t9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.332603 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cc24b64-e25f-4b55-9123-295388685e7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa5a898deb51018e4a1196537853e1d8fda8121a2b87d6d3ebb4ab04ad6a0507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08368507f1a5c9ee1b93c9a2b111939cbe36a99f87758d0de78b146309871438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffd2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.350983 4904 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-52vmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-52vmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.394417 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.394468 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.394478 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.394495 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.394508 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:05Z","lastTransitionTime":"2025-12-05T20:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.497220 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.497266 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.497275 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.497290 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.497300 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:05Z","lastTransitionTime":"2025-12-05T20:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.600947 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.601217 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.601228 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.601246 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.601258 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:05Z","lastTransitionTime":"2025-12-05T20:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.703979 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.704015 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.704024 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.704035 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.704047 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:05Z","lastTransitionTime":"2025-12-05T20:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.807467 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.807522 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.807540 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.807563 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.807579 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:05Z","lastTransitionTime":"2025-12-05T20:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.911101 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.911157 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.911168 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.911185 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.911197 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:05Z","lastTransitionTime":"2025-12-05T20:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.927144 4904 generic.go:334] "Generic (PLEG): container finished" podID="4ef1f1c8-4ced-4af5-80b0-404e1f6f8796" containerID="a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072" exitCode=0 Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.927234 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-52vmw" event={"ID":"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796","Type":"ContainerDied","Data":"a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072"} Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.946383 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0577f397a9f6d93347cbc4c8693307a1df70636365bb481d910f29aa70a3147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.972075 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-67k68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cf77c02028027dc6d07f225e1f2352cd58a2a96280315e23f43b50e7c719cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzqtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-67k68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:05 crc kubenswrapper[4904]: I1205 20:12:05.991916 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55fbdf03-712c-4abc-9847-225fe63052e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-li
b\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\
\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dsvd6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.005007 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89575271b3bd2ad2c040d9e6b00f221610af97e8dd8b120a2956b75f8bbdd237\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.014040 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.014110 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.014123 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.014138 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.015258 4904 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:06Z","lastTransitionTime":"2025-12-05T20:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.016331 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aa73309acebd7dcba36d2990a8cf1f6ef1fbf78086a1455b158945063e1da3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b0deb0769ed0ae3221e161c0aab68192a5d5d1f511ccfaf8c6ba2f01334de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.028671 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gfzvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5e16893370ed8543ecf33c4b2373bda50a424810470178390a553f2755323d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6gph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gfzvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.042277 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.053366 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cc24b64-e25f-4b55-9123-295388685e7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa5a898deb51018e4a1196537853e1d8fda8121a2b87d6d3ebb4ab04ad6a0507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08368507f1a5c9ee1b93c9a2b111939cbe36a99f87758d0de78b146309871438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffd2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.069698 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-52vmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-52vmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.083853 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.096830 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8v9t9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78ec080e-d24e-458b-8622-465dd74773a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c927c7ee8c52a2f1bcca785070514c72b8eafa9cd7d4f58d77fd80f1dee6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj68j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8v9t9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-05T20:12:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.113799 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"519e4f29-5762-4a86-9f31-eae681598fd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"(now=2025-12-05 20:12:01.575139975 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575358 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764965521\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764965521\\\\\\\\\\\\\\\" (2025-12-05 19:12:01 +0000 UTC to 2026-12-05 19:12:01 +0000 UTC (now=2025-12-05 20:12:01.57533306 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575383 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 20:12:01.575413 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 20:12:01.575451 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575485 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575526 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-341598907/tls.crt::/tmp/serving-cert-341598907/tls.key\\\\\\\"\\\\nI1205 20:12:01.575666 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1205 20:12:01.575882 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 20:12:01.575895 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 20:12:01.576170 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 20:12:01.576169 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 20:12:01.576203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 20:12:01.576188 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.117644 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.117679 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.117688 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.117709 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.117720 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:06Z","lastTransitionTime":"2025-12-05T20:12:06Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.127034 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.220360 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.220401 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.220413 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.220430 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.220441 4904 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:06Z","lastTransitionTime":"2025-12-05T20:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.322826 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.322881 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.322898 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.322922 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.322940 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:06Z","lastTransitionTime":"2025-12-05T20:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.425921 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.425974 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.425992 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.426014 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.426030 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:06Z","lastTransitionTime":"2025-12-05T20:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.529214 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.529255 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.529264 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.529279 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.529292 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:06Z","lastTransitionTime":"2025-12-05T20:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.632640 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.632687 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.632698 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.632715 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.632726 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:06Z","lastTransitionTime":"2025-12-05T20:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.680286 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.680340 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:06 crc kubenswrapper[4904]: E1205 20:12:06.680459 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.680834 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:06 crc kubenswrapper[4904]: E1205 20:12:06.680911 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:12:06 crc kubenswrapper[4904]: E1205 20:12:06.680975 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.735419 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.735471 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.735488 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.735509 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.735524 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:06Z","lastTransitionTime":"2025-12-05T20:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.837847 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.837875 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.837883 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.837895 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.837903 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:06Z","lastTransitionTime":"2025-12-05T20:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.932715 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" event={"ID":"55fbdf03-712c-4abc-9847-225fe63052e3","Type":"ContainerStarted","Data":"3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6"} Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.935463 4904 generic.go:334] "Generic (PLEG): container finished" podID="4ef1f1c8-4ced-4af5-80b0-404e1f6f8796" containerID="70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d" exitCode=0 Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.935505 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-52vmw" event={"ID":"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796","Type":"ContainerDied","Data":"70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d"} Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.940436 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.940505 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.940524 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.940562 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.940580 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:06Z","lastTransitionTime":"2025-12-05T20:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.954165 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.971508 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gfzvv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5e16893370ed8543ecf33c4b2373bda50a424810470178390a553f2755323d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6gph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gfzvv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:06 crc kubenswrapper[4904]: I1205 20:12:06.989711 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-52vmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-52vmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.004312 4904 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:07Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.013802 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8v9t9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78ec080e-d24e-458b-8622-465dd74773a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c927c7ee8c52a2f1bcca785070514c72b8eafa9cd7d4f58d77fd80f1dee6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj68j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8v9t9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:07Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.030444 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cc24b64-e25f-4b55-9123-295388685e7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa5a898deb51018e4a1196537853e1d8fda8121a2b87d6d3ebb4ab04ad6a0507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08368507f1a5c9ee1b93c9a2b111939cbe36a99f87758d0de78b146309871438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffd2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:07Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.043475 4904 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.043508 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.043516 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.043531 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.043539 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:07Z","lastTransitionTime":"2025-12-05T20:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.049643 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:07Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.065471 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"519e4f29-5762-4a86-9f31-eae681598fd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"(now=2025-12-05 20:12:01.575139975 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575358 1 
named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764965521\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764965521\\\\\\\\\\\\\\\" (2025-12-05 19:12:01 +0000 UTC to 2026-12-05 19:12:01 +0000 UTC (now=2025-12-05 20:12:01.57533306 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575383 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 20:12:01.575413 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 20:12:01.575451 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575485 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575526 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-341598907/tls.crt::/tmp/serving-cert-341598907/tls.key\\\\\\\"\\\\nI1205 20:12:01.575666 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1205 20:12:01.575882 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 20:12:01.575895 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 20:12:01.576170 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 20:12:01.576169 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 20:12:01.576203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 20:12:01.576188 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:07Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.078175 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-67k68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cf77c02028027dc6d07f225e1f2352cd58a2a96280315e23f43b50e7c719cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzqtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-67k68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:07Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.104230 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55fbdf03-712c-4abc-9847-225fe63052e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-li
b\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\
\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dsvd6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:07Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.120840 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89575271b3bd2ad2c040d9e6b00f221610af97e8dd8b120a2956b75f8bbdd237\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:07Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.138171 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aa73309acebd7dcba36d2990a8cf1f6ef1fbf78086a1455b158945063e1da3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b0deb0769ed0ae3221e161c0aab68192a5d5d1f511ccfaf8c6ba2f01334de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:07Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.147096 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.147139 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.147149 4904 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.147167 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.147177 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:07Z","lastTransitionTime":"2025-12-05T20:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.152798 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0577f397a9f6d93347cbc4c8693307a1df70636365bb481d910f29aa70a3147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:07Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.250467 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.250511 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.250521 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:07 crc 
kubenswrapper[4904]: I1205 20:12:07.250536 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.250546 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:07Z","lastTransitionTime":"2025-12-05T20:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.353430 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.353466 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.353490 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.353507 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.353518 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:07Z","lastTransitionTime":"2025-12-05T20:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.455920 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.455965 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.455980 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.456001 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.456016 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:07Z","lastTransitionTime":"2025-12-05T20:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.558437 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.559190 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.559231 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.559251 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.559264 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:07Z","lastTransitionTime":"2025-12-05T20:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.661926 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.661974 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.661986 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.662004 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.662017 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:07Z","lastTransitionTime":"2025-12-05T20:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.764838 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.764923 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.764965 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.764998 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.765021 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:07Z","lastTransitionTime":"2025-12-05T20:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.867510 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.867591 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.867628 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.867656 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.867676 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:07Z","lastTransitionTime":"2025-12-05T20:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.943978 4904 generic.go:334] "Generic (PLEG): container finished" podID="4ef1f1c8-4ced-4af5-80b0-404e1f6f8796" containerID="2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4" exitCode=0 Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.944117 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-52vmw" event={"ID":"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796","Type":"ContainerDied","Data":"2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4"} Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.970855 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.970908 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.970920 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.970941 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.970841 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"519e4f29-5762-4a86-9f31-eae681598fd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"(now=2025-12-05 20:12:01.575139975 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575358 1 
named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764965521\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764965521\\\\\\\\\\\\\\\" (2025-12-05 19:12:01 +0000 UTC to 2026-12-05 19:12:01 +0000 UTC (now=2025-12-05 20:12:01.57533306 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575383 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 20:12:01.575413 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 20:12:01.575451 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575485 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575526 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-341598907/tls.crt::/tmp/serving-cert-341598907/tls.key\\\\\\\"\\\\nI1205 20:12:01.575666 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1205 20:12:01.575882 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 20:12:01.575895 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 20:12:01.576170 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 20:12:01.576169 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 20:12:01.576203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 20:12:01.576188 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:07Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.970956 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:07Z","lastTransitionTime":"2025-12-05T20:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.984629 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:07Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:07 crc kubenswrapper[4904]: I1205 20:12:07.999203 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0577f397a9f6d93347cbc4c8693307a1df70636365bb481d910f29aa70a3147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:07Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.010628 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-67k68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cf77c02028027dc6d07f225e1f2352cd58a2a96280315e23f43b50e7c719cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzqtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-67k68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:08Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.029683 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55fbdf03-712c-4abc-9847-225fe63052e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-li
b\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\
\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dsvd6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:08Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.046934 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89575271b3bd2ad2c040d9e6b00f221610af97e8dd8b120a2956b75f8bbdd237\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:08Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.067013 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aa73309acebd7dcba36d2990a8cf1f6ef1fbf78086a1455b158945063e1da3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b0deb0769ed0ae3221e161c0aab68192a5d5d1f511ccfaf8c6ba2f01334de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:08Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.073735 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.073764 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.073772 4904 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.073785 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.073795 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:08Z","lastTransitionTime":"2025-12-05T20:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.084161 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gfzvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5e16893370ed8543ecf33c4b2373bda50a424810470178390a553f2755323d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"
},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6gph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gfzvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:08Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.094409 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.094846 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:08Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.099308 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.102574 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.106509 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cc24b64-e25f-4b55-9123-295388685e7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa5a898deb51018e4a1196537853e1d8fda8121a2b87d6d3ebb4ab04ad6a0507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08368507f1a5c9ee1b93c9a2b111939cbe36a99f87758
d0de78b146309871438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffd2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:08Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.122676 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-52vmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-52vmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:08Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.137045 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:08Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.147932 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8v9t9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78ec080e-d24e-458b-8622-465dd74773a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c927c7ee8c52a2f1bcca785070514c72b8eafa9cd7d4f58d77fd80f1dee6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj68j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8v9t9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-05T20:12:08Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.165878 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-52vmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\
\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-52vmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:08Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.176620 4904 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.176650 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.176658 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.176671 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.176680 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:08Z","lastTransitionTime":"2025-12-05T20:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.177368 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:08Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.189607 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8v9t9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78ec080e-d24e-458b-8622-465dd74773a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c927c7ee8c52a2f1bcca785070514c72b8eafa9cd7d4f58d77fd80f1dee6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj68j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8v9t9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-05T20:12:08Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.203585 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cc24b64-e25f-4b55-9123-295388685e7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa5a898deb51018e4a1196537853e1d8fda8121a2b87d6d3ebb4ab04ad6a0507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08368507f1a5c9ee1b93c9a2b111939cbe36a99f87758d0de78b146309871438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffd2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:08Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.218811 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ba4034f-d40d-4c5d-a52b-c9895a8a7f0b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://073f99fe786948da31c584468c3562984559623b590ba77c7925b79a1bb6d973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fb5cbfee6c3da72b94b14ed486036e65308444c961b583e691cc89e604a11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec935cce3c02816ac1f6162ca3a1701cd06dd930d3024ff958a4d92cf1dc0821\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025
-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736f67bbeab4ebc7e144d6d4af34487d18c3c8362425e45a9e0969b61d896666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:08Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.230579 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:08Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.241680 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"519e4f29-5762-4a86-9f31-eae681598fd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"(now=2025-12-05 20:12:01.575139975 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575358 1 
named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764965521\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764965521\\\\\\\\\\\\\\\" (2025-12-05 19:12:01 +0000 UTC to 2026-12-05 19:12:01 +0000 UTC (now=2025-12-05 20:12:01.57533306 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575383 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 20:12:01.575413 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 20:12:01.575451 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575485 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575526 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-341598907/tls.crt::/tmp/serving-cert-341598907/tls.key\\\\\\\"\\\\nI1205 20:12:01.575666 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1205 20:12:01.575882 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 20:12:01.575895 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 20:12:01.576170 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 20:12:01.576169 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 20:12:01.576203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 20:12:01.576188 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:08Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.251419 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-67k68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cf77c02028027dc6d07f225e1f2352cd58a2a96280315e23f43b50e7c719cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzqtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-67k68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:08Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.270012 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55fbdf03-712c-4abc-9847-225fe63052e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-li
b\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\
\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dsvd6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:08Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.279376 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.279416 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.279426 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.279443 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.279455 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:08Z","lastTransitionTime":"2025-12-05T20:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.282763 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89575271b3bd2ad2c040d9e6b00f221610af97e8dd8b120a2956b75f8bbdd237\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:08Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.295039 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aa73309acebd7dcba36d2990a8cf1f6ef1fbf78086a1455b158945063e1da3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b0deb0769ed0ae3221e161c0aab68192a5d5d1f511ccfaf8c6ba2f01334de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:08Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.308135 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0577f397a9f6d93347cbc4c8693307a1df70636365bb481d910f29aa70a3147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:08Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.323881 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:08Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.339229 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gfzvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5e16893370ed8543ecf33c4b2373bda50a424810470178390a553f2755323d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6gph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gfzvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:08Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.382879 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.382922 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.382935 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.382952 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.382965 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:08Z","lastTransitionTime":"2025-12-05T20:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.486326 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.486390 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.486405 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.486429 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.486445 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:08Z","lastTransitionTime":"2025-12-05T20:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.589309 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.589350 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.589360 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.589376 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.589386 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:08Z","lastTransitionTime":"2025-12-05T20:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.680584 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.680753 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.680821 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:08 crc kubenswrapper[4904]: E1205 20:12:08.680922 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:12:08 crc kubenswrapper[4904]: E1205 20:12:08.681204 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:12:08 crc kubenswrapper[4904]: E1205 20:12:08.681332 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.691773 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.691817 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.691829 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.691853 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.691881 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:08Z","lastTransitionTime":"2025-12-05T20:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.794529 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.794571 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.794584 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.794602 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.794614 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:08Z","lastTransitionTime":"2025-12-05T20:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.800134 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.800245 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.800286 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.800313 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.800346 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:08 crc kubenswrapper[4904]: E1205 20:12:08.800447 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:16.800427194 +0000 UTC m=+35.611643303 (durationBeforeRetry 8s). 
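The recurring NodeNotReady condition bottoms out in a runtime check that the CNI configuration directory contains at least one network config. A hedged sketch of that check for the directory named in the log message; the accepted extensions follow the common libcni convention and are an assumption here, not something the log states.

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	confDir := "/etc/kubernetes/cni/net.d" // directory taken from the log message
	entries, err := os.ReadDir(confDir)
	if err != nil {
		fmt.Printf("cannot read %s: %v\n", confDir, err)
		return
	}
	var found []string
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			found = append(found, e.Name())
		}
	}
	if len(found) == 0 {
		// The runtime reports exactly this situation as NetworkReady=false.
		fmt.Printf("no CNI configuration file in %s\n", confDir)
		return
	}
	fmt.Printf("CNI configs: %v\n", found)
}

On this node the directory is empty until ovnkube-controller writes its config, so the condition clears on its own once the OVN pods above leave PodInitializing.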
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:08 crc kubenswrapper[4904]: E1205 20:12:08.800456 4904 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:12:08 crc kubenswrapper[4904]: E1205 20:12:08.800483 4904 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:12:08 crc kubenswrapper[4904]: E1205 20:12:08.800527 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:12:16.800510767 +0000 UTC m=+35.611726876 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:12:08 crc kubenswrapper[4904]: E1205 20:12:08.800537 4904 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:12:08 crc kubenswrapper[4904]: E1205 20:12:08.800558 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:12:16.800536297 +0000 UTC m=+35.611752416 (durationBeforeRetry 8s). 
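The TearDownAt failure says the kubelet's CSI registry has no driver named kubevirt.io.hostpath-provisioner yet; the volume plugin retries every 8s until the driver's node plugin re-registers. A sketch, assuming client-go and illustrative values for the node name ("crc", from the log) and kubeconfig path, that lists the drivers currently registered for the node via its CSINode object:

package main

import (
	"context"
	"fmt"
	"log"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/var/lib/kubelet/kubeconfig")
	if err != nil {
		log.Fatal(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		log.Fatal(err)
	}
	// CSINode mirrors the set of CSI drivers the kubelet has registered for this node.
	csiNode, err := cs.StorageV1().CSINodes().Get(context.TODO(), "crc", metav1.GetOptions{})
	if err != nil {
		log.Fatal(err)
	}
	for _, d := range csiNode.Spec.Drivers {
		fmt.Println("registered CSI driver:", d.Name)
	}
	// If kubevirt.io.hostpath-provisioner is missing from this list,
	// Unmounter.TearDownAt fails exactly as the log shows.
}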
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:12:08 crc kubenswrapper[4904]: E1205 20:12:08.800566 4904 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:12:08 crc kubenswrapper[4904]: E1205 20:12:08.800567 4904 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:12:08 crc kubenswrapper[4904]: E1205 20:12:08.800583 4904 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:12:08 crc kubenswrapper[4904]: E1205 20:12:08.800593 4904 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:12:08 crc kubenswrapper[4904]: E1205 20:12:08.800599 4904 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:12:08 crc kubenswrapper[4904]: E1205 20:12:08.800645 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 20:12:16.80063419 +0000 UTC m=+35.611850429 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:12:08 crc kubenswrapper[4904]: E1205 20:12:08.800674 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 20:12:16.800654871 +0000 UTC m=+35.611871020 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.896697 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.896743 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.896756 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.896773 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.896785 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:08Z","lastTransitionTime":"2025-12-05T20:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.954013 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-52vmw" event={"ID":"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796","Type":"ContainerStarted","Data":"140e572b1d15ec9aec68ea9d5b436e7ec2a1e886ed70b12c5ed011a12aa7646a"} Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.967869 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" event={"ID":"55fbdf03-712c-4abc-9847-225fe63052e3","Type":"ContainerStarted","Data":"7dda21ae586da86cd92ac5775c3cfe70a9bdcb3c568311f285327c27a845fe2a"} Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.968176 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.968222 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.975744 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:08Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.990653 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8v9t9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78ec080e-d24e-458b-8622-465dd74773a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c927c7ee8c52a2f1bcca785070514c72b8eafa9cd7d4f58d77fd80f1dee6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj68j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\
\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8v9t9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:08Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.994500 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.997840 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.998837 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.998875 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.998887 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.998903 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:08 crc kubenswrapper[4904]: I1205 20:12:08.998917 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:08Z","lastTransitionTime":"2025-12-05T20:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.009025 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cc24b64-e25f-4b55-9123-295388685e7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa5a898deb51018e4a1196537853e1d8fda8121a2b87d6d3ebb4ab04ad6a0507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08368507f1a5c9ee1b93c9a2b111939cbe36a99f87758d0de78b146309871438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffd2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.026676 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-52vmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://140e572b1d15ec9aec68ea9d5b436e7ec2a1e886ed70b12c5ed011a12aa7646a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:05Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-52vmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:09Z is after 
2025-08-24T17:21:41Z" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.049260 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"519e4f29-5762-4a86-9f31-eae681598fd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"(now=2025-12-05 20:12:01.575139975 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575358 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764965521\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764965521\\\\\\\\\\\\\\\" (2025-12-05 19:12:01 +0000 UTC to 2026-12-05 19:12:01 +0000 UTC (now=2025-12-05 20:12:01.57533306 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575383 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 20:12:01.575413 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 20:12:01.575451 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575485 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575526 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-341598907/tls.crt::/tmp/serving-cert-341598907/tls.key\\\\\\\"\\\\nI1205 20:12:01.575666 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1205 20:12:01.575882 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 20:12:01.575895 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 20:12:01.576170 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 20:12:01.576169 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 20:12:01.576203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 20:12:01.576188 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.065001 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ba4034f-d40d-4c5d-a52b-c9895a8a7f0b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://073f99fe786948da31c584468c3562984559623b590ba77c7925b79a1bb6d973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fb5cbfee6c3da72b94b14ed486036e65308444c961b583e691cc89e604a11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec935cce3c02816ac1f6162ca3a1701cd06dd930d3024ff958a4d92cf1dc0821\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736f67bbeab4ebc7e144d6d4af34487d18c3c8362425e45a9e0969b61d896666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.076987 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.093166 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89575271b3bd2ad2c040d9e6b00f221610af97e8dd8b120a2956b75f8bbdd237\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.101593 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.101646 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.101660 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.101681 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.101693 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:09Z","lastTransitionTime":"2025-12-05T20:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.110661 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aa73309acebd7dcba36d2990a8cf1f6ef1fbf78086a1455b158945063e1da3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b0deb0769ed0ae3221e161c0aab68192a5d5d1f511ccfaf8c6ba2f01334de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"nam
e\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.124840 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0577f397a9f6d93347cbc4c8693307a1df70636365bb481d910f29aa70a3147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.138138 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-67k68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cf77c02028027dc6d07f225e1f2352cd58a2a96280315e23f43b50e7c719cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzqtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-67k68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.161190 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55fbdf03-712c-4abc-9847-225fe63052e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-li
b\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\
\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dsvd6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.174808 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.190307 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gfzvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5e16893370ed8543ecf33c4b2373bda50a424810470178390a553f2755323d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6gph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gfzvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.204680 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.204745 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.204758 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.204781 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.204817 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:09Z","lastTransitionTime":"2025-12-05T20:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.205499 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.217643 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8v9t9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78ec080e-d24e-458b-8622-465dd74773a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c927c7ee8c52a2f1bcca785070514c72b8eafa9cd7d4f58d77fd80f1dee6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj68j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8v9t9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.231808 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cc24b64-e25f-4b55-9123-295388685e7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa5a898deb51018e4a1196537853e1d8fda8121a2b87d6d3ebb4ab04ad6a0507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08368507f1a5c9ee1b93c9a2b111939cbe36a99f87758d0de78b146309871438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffd2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.247141 4904 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-52vmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://140e572b1d15ec9aec68ea9d5b436e7ec2a1e886ed70b12c5ed011a12aa7646a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-52vmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.262545 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"519e4f29-5762-4a86-9f31-eae681598fd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"(now=2025-12-05 20:12:01.575139975 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575358 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764965521\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764965521\\\\\\\\\\\\\\\" (2025-12-05 19:12:01 +0000 UTC to 2026-12-05 19:12:01 +0000 UTC (now=2025-12-05 20:12:01.57533306 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575383 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 20:12:01.575413 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 20:12:01.575451 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575485 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575526 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-341598907/tls.crt::/tmp/serving-cert-341598907/tls.key\\\\\\\"\\\\nI1205 20:12:01.575666 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1205 20:12:01.575882 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 20:12:01.575895 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 20:12:01.576170 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 20:12:01.576169 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 20:12:01.576203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 20:12:01.576188 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.281043 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ba4034f-d40d-4c5d-a52b-c9895a8a7f0b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://073f99fe786948da31c584468c3562984559623b590ba77c7925b79a1bb6d973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fb5cbfee6c3da72b94b14ed486036e65308444c961b583e691cc89e604a11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec935cce3c02816ac1f6162ca3a1701cd06dd930d3024ff958a4d92cf1dc0821\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736f67bbeab4ebc7e144d6d4af34487d18c3c8362425e45a9e0969b61d896666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.296109 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.307046 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.307129 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.307142 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.307166 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.307189 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:09Z","lastTransitionTime":"2025-12-05T20:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.311847 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89575271b3bd2ad2c040d9e6b00f221610af97e8dd8b120a2956b75f8bbdd237\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.326896 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aa73309acebd7dcba36d2990a8cf1f6ef1fbf78086a1455b158945063e1da3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b0deb0769ed0ae3221e161c0aab68192a5d5d1f511ccfaf8c6ba2f01334de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.341141 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0577f397a9f6d93347cbc4c8693307a1df70636365bb481d910f29aa70a3147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.354890 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-67k68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cf77c02028027dc6d07f225e1f2352cd58a2a96280315e23f43b50e7c719cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzqtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-67k68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.374266 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55fbdf03-712c-4abc-9847-225fe63052e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dda21ae586da86cd92ac5775c3cfe70a9bdcb3c568311f285327c27a845fe2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPat
h\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dsvd6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.391318 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.409583 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gfzvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5e16893370ed8543ecf33c4b2373bda50a424810470178390a553f2755323d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"ho
st-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6gph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gfzvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.410050 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.410086 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.410095 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.410107 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.410115 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:09Z","lastTransitionTime":"2025-12-05T20:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.513673 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.513719 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.513727 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.513743 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.513753 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:09Z","lastTransitionTime":"2025-12-05T20:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.616966 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.617045 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.617073 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.617098 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.617115 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:09Z","lastTransitionTime":"2025-12-05T20:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.720329 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.720404 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.720413 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.720430 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.720440 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:09Z","lastTransitionTime":"2025-12-05T20:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.822656 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.822756 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.822771 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.822796 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.822811 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:09Z","lastTransitionTime":"2025-12-05T20:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.926521 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.926579 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.926591 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.926613 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.926630 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:09Z","lastTransitionTime":"2025-12-05T20:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:09 crc kubenswrapper[4904]: I1205 20:12:09.971781 4904 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.030047 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.030113 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.030126 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.030145 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.030160 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:10Z","lastTransitionTime":"2025-12-05T20:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.133124 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.133155 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.133163 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.133176 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.133185 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:10Z","lastTransitionTime":"2025-12-05T20:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.235466 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.235531 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.235554 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.235635 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.235662 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:10Z","lastTransitionTime":"2025-12-05T20:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.338488 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.338565 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.338579 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.338597 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.338611 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:10Z","lastTransitionTime":"2025-12-05T20:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.447979 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.448035 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.448046 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.448082 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.448095 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:10Z","lastTransitionTime":"2025-12-05T20:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.551478 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.551537 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.551547 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.551568 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.551580 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:10Z","lastTransitionTime":"2025-12-05T20:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.654226 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.654265 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.654277 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.654292 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.654302 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:10Z","lastTransitionTime":"2025-12-05T20:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.681003 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.681086 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.681093 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:10 crc kubenswrapper[4904]: E1205 20:12:10.681176 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:12:10 crc kubenswrapper[4904]: E1205 20:12:10.681289 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:12:10 crc kubenswrapper[4904]: E1205 20:12:10.681375 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.757085 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.757175 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.757211 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.757246 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.757274 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:10Z","lastTransitionTime":"2025-12-05T20:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.861292 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.861343 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.861355 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.861375 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.861388 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:10Z","lastTransitionTime":"2025-12-05T20:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.965862 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.965931 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.965955 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.965983 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.966007 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:10Z","lastTransitionTime":"2025-12-05T20:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:10 crc kubenswrapper[4904]: I1205 20:12:10.975506 4904 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.069295 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.069347 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.069359 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.069378 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.069393 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:11Z","lastTransitionTime":"2025-12-05T20:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.171862 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.171925 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.171942 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.171967 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.171984 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:11Z","lastTransitionTime":"2025-12-05T20:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.274627 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.274717 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.274740 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.274766 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.274785 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:11Z","lastTransitionTime":"2025-12-05T20:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.377739 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.377783 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.377797 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.377816 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.377828 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:11Z","lastTransitionTime":"2025-12-05T20:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.481408 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.481470 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.481488 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.481513 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.481531 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:11Z","lastTransitionTime":"2025-12-05T20:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.584295 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.584356 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.584379 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.584406 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.584423 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:11Z","lastTransitionTime":"2025-12-05T20:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.687346 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.687423 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.687440 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.687465 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.687478 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:11Z","lastTransitionTime":"2025-12-05T20:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.695955 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"519e4f29-5762-4a86-9f31-eae681598fd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"(now=2025-12-05 20:12:01.575139975 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575358 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764965521\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764965521\\\\\\\\\\\\\\\" (2025-12-05 19:12:01 +0000 UTC to 2026-12-05 19:12:01 +0000 UTC (now=2025-12-05 20:12:01.57533306 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575383 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 20:12:01.575413 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 20:12:01.575451 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575485 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575526 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-341598907/tls.crt::/tmp/serving-cert-341598907/tls.key\\\\\\\"\\\\nI1205 20:12:01.575666 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1205 20:12:01.575882 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 20:12:01.575895 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 20:12:01.576170 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 20:12:01.576169 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 20:12:01.576203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 20:12:01.576188 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.715146 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ba4034f-d40d-4c5d-a52b-c9895a8a7f0b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://073f99fe786948da31c584468c3562984559623b590ba77c7925b79a1bb6d973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fb5cbfee6c3da72b94b14ed486036e65308444c961b583e691cc89e604a11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec935cce3c02816ac1f6162ca3a1701cd06dd930d3024ff958a4d92cf1dc0821\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736f67bbeab4ebc7e144d6d4af34487d18c3c8362425e45a9e0969b61d896666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.733351 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.747965 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aa73309acebd7dcba36d2990a8cf1f6ef1fbf78086a1455b158945063e1da3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b0deb0769ed0ae3221e161c0aab68192a5d5d1f511ccfaf8c6ba2f01334de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.760179 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0577f397a9f6d93347cbc4c8693307a1df70636365bb481d910f29aa70a3147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.775476 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-67k68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cf77c02028027dc6d07f225e1f2352cd58a2a96280315e23f43b50e7c719cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzqtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-67k68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.790413 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.790450 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.790461 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.790477 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.790489 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:11Z","lastTransitionTime":"2025-12-05T20:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.795903 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55fbdf03-712c-4abc-9847-225fe63052e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servic
eaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dda21ae586da86cd92ac5775c3cfe70a9bdcb3c568311f285327c27a845fe2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/l
ib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dsvd6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.809102 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89575271b3bd2ad2c040d9e6b00f221610af97e8dd8b120a2956b75f8bbdd237\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.823393 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.836437 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gfzvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5e16893370ed8543ecf33c4b2373bda50a424810470178390a553f2755323d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-l
ib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6gph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gfzvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.847052 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8v9t9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78ec080e-d24e-458b-8622-465dd74773a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c927c7ee8c52a2f1bcca785070514c72b8eafa9cd7d4f58d77fd80f1dee6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj68j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\
"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8v9t9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.857671 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cc24b64-e25f-4b55-9123-295388685e7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa5a898deb51018e4a1196537853e1d8fda8121a2b87d6d3ebb4ab04ad6a0507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08368507f1a5c9ee1b93c9a2b111939cbe36a99f87758d0de78b146309871438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\"
,\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffd2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.872644 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-52vmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://140e572b1d15ec9aec68ea9d5b436e7ec2a1e886ed70b12c5ed011a12aa7646a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\
\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126
.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-52vmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.887704 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.892680 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.892724 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.892736 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.892754 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.892769 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:11Z","lastTransitionTime":"2025-12-05T20:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.980127 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dsvd6_55fbdf03-712c-4abc-9847-225fe63052e3/ovnkube-controller/0.log" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.982961 4904 generic.go:334] "Generic (PLEG): container finished" podID="55fbdf03-712c-4abc-9847-225fe63052e3" containerID="7dda21ae586da86cd92ac5775c3cfe70a9bdcb3c568311f285327c27a845fe2a" exitCode=1 Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.983004 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" event={"ID":"55fbdf03-712c-4abc-9847-225fe63052e3","Type":"ContainerDied","Data":"7dda21ae586da86cd92ac5775c3cfe70a9bdcb3c568311f285327c27a845fe2a"} Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.983611 4904 scope.go:117] "RemoveContainer" containerID="7dda21ae586da86cd92ac5775c3cfe70a9bdcb3c568311f285327c27a845fe2a" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.997861 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.997889 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.997897 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.997910 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.997921 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:11Z","lastTransitionTime":"2025-12-05T20:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:11 crc kubenswrapper[4904]: I1205 20:12:11.998905 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.014122 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gfzvv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5e16893370ed8543ecf33c4b2373bda50a424810470178390a553f2755323d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6gph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gfzvv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:12Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.027335 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:12Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.040827 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8v9t9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78ec080e-d24e-458b-8622-465dd74773a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c927c7ee8c52a2f1bcca785070514c72b8eafa9cd7d4f58d77fd80f1dee6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj68j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8v9t9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:12Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.055327 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cc24b64-e25f-4b55-9123-295388685e7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa5a898deb51018e4a1196537853e1d8fda8121a2b87d6d3ebb4ab04ad6a0507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08368507f1a5c9ee1b93c9a2b111939cbe36a99f87758d0de78b146309871438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffd2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:12Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.070493 4904 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-52vmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://140e572b1d15ec9aec68ea9d5b436e7ec2a1e886ed70b12c5ed011a12aa7646a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-52vmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:12Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.083933 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"519e4f29-5762-4a86-9f31-eae681598fd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"(now=2025-12-05 20:12:01.575139975 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575358 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764965521\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764965521\\\\\\\\\\\\\\\" (2025-12-05 19:12:01 +0000 UTC to 2026-12-05 19:12:01 +0000 UTC (now=2025-12-05 20:12:01.57533306 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575383 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 20:12:01.575413 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 20:12:01.575451 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575485 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575526 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-341598907/tls.crt::/tmp/serving-cert-341598907/tls.key\\\\\\\"\\\\nI1205 20:12:01.575666 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1205 20:12:01.575882 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 20:12:01.575895 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 20:12:01.576170 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 20:12:01.576169 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 20:12:01.576203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 20:12:01.576188 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:12Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.095975 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ba4034f-d40d-4c5d-a52b-c9895a8a7f0b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://073f99fe786948da31c584468c3562984559623b590ba77c7925b79a1bb6d973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fb5cbfee6c3da72b94b14ed486036e65308444c961b583e691cc89e604a11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec935cce3c02816ac1f6162ca3a1701cd06dd930d3024ff958a4d92cf1dc0821\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736f67bbeab4ebc7e144d6d4af34487d18c3c8362425e45a9e0969b61d896666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:12Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.099423 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.099454 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.099482 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.099500 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.099511 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:12Z","lastTransitionTime":"2025-12-05T20:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.107143 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:12Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.118656 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89575271b3bd2ad2c040d9e6b00f221610af97e8dd8b120a2956b75f8bbdd237\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:12Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.129428 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aa73309acebd7dcba36d2990a8cf1f6ef1fbf78086a1455b158945063e1da3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b0deb0769ed0ae3221e161c0aab68192a5d5d1f511ccfaf8c6ba2f01334de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:12Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.144291 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0577f397a9f6d93347cbc4c8693307a1df70636365bb481d910f29aa70a3147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:12Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.155080 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-67k68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cf77c02028027dc6d07f225e1f2352cd58a2a96280315e23f43b50e7c719cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzqtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-67k68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:12Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.176630 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55fbdf03-712c-4abc-9847-225fe63052e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dda21ae586da86cd92ac5775c3cfe70a9bdcb3c568311f285327c27a845fe2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dda21ae586da86cd92ac5775c3cfe70a9bdcb3c568311f285327c27a845fe2a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:12:11Z\\\",\\\"message\\\":\\\".go:208] Removed *v1.Pod event handler 3\\\\nI1205 20:12:11.008936 6194 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 20:12:11.009181 6194 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 20:12:11.009389 6194 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 20:12:11.009536 6194 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 20:12:11.009565 6194 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1205 20:12:11.009752 6194 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 20:12:11.010052 6194 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 20:12:11.010115 6194 factory.go:656] Stopping watch factory\\\\nI1205 20:12:11.010141 6194 ovnkube.go:599] Stopped ovnkube\\\\nI1205 
2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209
9482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dsvd6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:12Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.201399 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.201443 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.201452 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.201467 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.201477 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:12Z","lastTransitionTime":"2025-12-05T20:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.303613 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.303649 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.303657 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.303678 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.303688 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:12Z","lastTransitionTime":"2025-12-05T20:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.406437 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.406490 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.406508 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.406531 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.406547 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:12Z","lastTransitionTime":"2025-12-05T20:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.508983 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.509029 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.509041 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.509169 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.509196 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:12Z","lastTransitionTime":"2025-12-05T20:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.611623 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.611655 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.611663 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.611675 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.611683 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:12Z","lastTransitionTime":"2025-12-05T20:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.680239 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.680281 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.680310 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:12 crc kubenswrapper[4904]: E1205 20:12:12.680364 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:12:12 crc kubenswrapper[4904]: E1205 20:12:12.680530 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:12:12 crc kubenswrapper[4904]: E1205 20:12:12.680615 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.714742 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.714797 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.714808 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.714828 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.714840 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:12Z","lastTransitionTime":"2025-12-05T20:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.817267 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.817304 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.817317 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.817330 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.817339 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:12Z","lastTransitionTime":"2025-12-05T20:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.919418 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.919467 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.919488 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.919509 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.919522 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:12Z","lastTransitionTime":"2025-12-05T20:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.988644 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dsvd6_55fbdf03-712c-4abc-9847-225fe63052e3/ovnkube-controller/0.log" Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.992935 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" event={"ID":"55fbdf03-712c-4abc-9847-225fe63052e3","Type":"ContainerStarted","Data":"754660b165d5db2e99f7a7f53bb46148e13677fd0e632c71885bcc93f8c01864"} Dec 05 20:12:12 crc kubenswrapper[4904]: I1205 20:12:12.993202 4904 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.008526 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:13Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.022320 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.022359 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.022368 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.022384 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.022394 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:13Z","lastTransitionTime":"2025-12-05T20:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.030991 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gfzvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5e16893370ed8543ecf33c4b2373bda50a424810470178390a553f2755323d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6gph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gfzvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:13Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.049036 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:13Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.060764 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8v9t9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78ec080e-d24e-458b-8622-465dd74773a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c927c7ee8c52a2f1bcca785070514c72b8eafa9cd7d4f58d77fd80f1dee6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj68j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8v9t9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-05T20:12:13Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.072525 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cc24b64-e25f-4b55-9123-295388685e7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa5a898deb51018e4a1196537853e1d8fda8121a2b87d6d3ebb4ab04ad6a0507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08368507f1a5c9ee1b93c9a2b111939cbe36a99f87758d0de78b146309871438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffd2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:13Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.086888 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-52vmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://140e572b1d15ec9aec68ea9d5b436e7ec2a1e886ed70b12c5ed011a12aa7646a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:05Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-52vmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:13Z is after 
2025-08-24T17:21:41Z" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.106665 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"519e4f29-5762-4a86-9f31-eae681598fd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"(now=2025-12-05 20:12:01.575139975 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575358 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764965521\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764965521\\\\\\\\\\\\\\\" (2025-12-05 19:12:01 +0000 UTC to 2026-12-05 19:12:01 +0000 UTC (now=2025-12-05 20:12:01.57533306 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575383 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 20:12:01.575413 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 20:12:01.575451 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575485 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575526 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-341598907/tls.crt::/tmp/serving-cert-341598907/tls.key\\\\\\\"\\\\nI1205 20:12:01.575666 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1205 20:12:01.575882 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 20:12:01.575895 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 20:12:01.576170 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 20:12:01.576169 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 20:12:01.576203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 20:12:01.576188 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:13Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.120515 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ba4034f-d40d-4c5d-a52b-c9895a8a7f0b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://073f99fe786948da31c584468c3562984559623b590ba77c7925b79a1bb6d973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fb5cbfee6c3da72b94b14ed486036e65308444c961b583e691cc89e604a11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec935cce3c02816ac1f6162ca3a1701cd06dd930d3024ff958a4d92cf1dc0821\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736f67bbeab4ebc7e144d6d4af34487d18c3c8362425e45a9e0969b61d896666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:13Z is after 2025-08-24T17:21:41Z"
Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.125360 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.125390 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.125417 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.125432 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.125441 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:13Z","lastTransitionTime":"2025-12-05T20:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.134593 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:13Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.151080 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89575271b3bd2ad2c040d9e6b00f221610af97e8dd8b120a2956b75f8bbdd237\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:13Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.169150 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aa73309acebd7dcba36d2990a8cf1f6ef1fbf78086a1455b158945063e1da3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b0deb0769ed0ae3221e161c0aab68192a5d5d1f511ccfaf8c6ba2f01334de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:13Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.192161 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0577f397a9f6d93347cbc4c8693307a1df70636365bb481d910f29aa70a3147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:13Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.208561 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-67k68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cf77c02028027dc6d07f225e1f2352cd58a2a96280315e23f43b50e7c719cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzqtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-67k68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:13Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.227298 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.227333 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.227341 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.227355 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.227366 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:13Z","lastTransitionTime":"2025-12-05T20:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.232817 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55fbdf03-712c-4abc-9847-225fe63052e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servic
eaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://754660b165d5db2e99f7a7f53bb46148e13677fd0e632c71885bcc93f8c01864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dda21ae586da86cd92ac5775c3cfe70a9bdcb3c568311f285327c27a845fe2a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:12:11Z\\\",\\\"message\\\":\\\".go:208] Removed *v1.Pod event handler 3\\\\nI1205 20:12:11.008936 6194 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 20:12:11.009181 6194 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 20:12:11.009389 6194 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 20:12:11.009536 6194 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 20:12:11.009565 6194 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1205 20:12:11.009752 6194 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 20:12:11.010052 6194 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 20:12:11.010115 6194 factory.go:656] Stopping watch factory\\\\nI1205 20:12:11.010141 6194 ovnkube.go:599] Stopped ovnkube\\\\nI1205 
2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"cont
ainerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dsvd6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:13Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.330008 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.330368 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.330446 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.330534 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.330610 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:13Z","lastTransitionTime":"2025-12-05T20:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.433800 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.433832 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.433842 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.433856 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.433865 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:13Z","lastTransitionTime":"2025-12-05T20:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.536131 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.536520 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.536689 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.536836 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.536984 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:13Z","lastTransitionTime":"2025-12-05T20:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.640359 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.640440 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.640454 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.640472 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.640485 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:13Z","lastTransitionTime":"2025-12-05T20:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.743891 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.743963 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.743977 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.744005 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.744022 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:13Z","lastTransitionTime":"2025-12-05T20:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.847313 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.847396 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.847408 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.847442 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.847460 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:13Z","lastTransitionTime":"2025-12-05T20:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.950085 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.950120 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.950130 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.950144 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.950157 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:13Z","lastTransitionTime":"2025-12-05T20:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:13 crc kubenswrapper[4904]: I1205 20:12:13.999084 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dsvd6_55fbdf03-712c-4abc-9847-225fe63052e3/ovnkube-controller/1.log" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.000044 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dsvd6_55fbdf03-712c-4abc-9847-225fe63052e3/ovnkube-controller/0.log" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.003690 4904 generic.go:334] "Generic (PLEG): container finished" podID="55fbdf03-712c-4abc-9847-225fe63052e3" containerID="754660b165d5db2e99f7a7f53bb46148e13677fd0e632c71885bcc93f8c01864" exitCode=1 Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.003743 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" event={"ID":"55fbdf03-712c-4abc-9847-225fe63052e3","Type":"ContainerDied","Data":"754660b165d5db2e99f7a7f53bb46148e13677fd0e632c71885bcc93f8c01864"} Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.003791 4904 scope.go:117] "RemoveContainer" containerID="7dda21ae586da86cd92ac5775c3cfe70a9bdcb3c568311f285327c27a845fe2a" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.004955 4904 scope.go:117] "RemoveContainer" containerID="754660b165d5db2e99f7a7f53bb46148e13677fd0e632c71885bcc93f8c01864" Dec 05 20:12:14 crc kubenswrapper[4904]: E1205 20:12:14.005238 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-dsvd6_openshift-ovn-kubernetes(55fbdf03-712c-4abc-9847-225fe63052e3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.022516 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0577f397a9f6d93347cbc4c8693307a1df70636365bb481d910f29aa70a3147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:14Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.037837 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-67k68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cf77c02028027dc6d07f225e1f2352cd58a2a96280315e23f43b50e7c719cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzqtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-67k68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:14Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.052767 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.052814 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.052827 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.052849 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.052865 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:14Z","lastTransitionTime":"2025-12-05T20:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.063838 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55fbdf03-712c-4abc-9847-225fe63052e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servic
eaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://754660b165d5db2e99f7a7f53bb46148e13677fd0e632c71885bcc93f8c01864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dda21ae586da86cd92ac5775c3cfe70a9bdcb3c568311f285327c27a845fe2a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:12:11Z\\\",\\\"message\\\":\\\".go:208] Removed *v1.Pod event handler 3\\\\nI1205 20:12:11.008936 6194 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 20:12:11.009181 6194 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 20:12:11.009389 6194 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 20:12:11.009536 6194 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 20:12:11.009565 6194 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1205 20:12:11.009752 6194 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 20:12:11.010052 6194 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 20:12:11.010115 6194 factory.go:656] Stopping watch factory\\\\nI1205 20:12:11.010141 6194 ovnkube.go:599] Stopped ovnkube\\\\nI1205 
2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://754660b165d5db2e99f7a7f53bb46148e13677fd0e632c71885bcc93f8c01864\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:12:13Z\\\",\\\"message\\\":\\\"r/kube-controller-manager-crc in node crc\\\\nI1205 20:12:13.100315 6323 services_controller.go:452] Built service openshift-service-ca-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1205 20:12:13.100853 6323 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1205 20:12:13.100859 6323 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1205 20:12:13.100847 6323 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-8v9t9 after 0 failed attempt(s)\\\\nI1205 20:12:13.100867 6323 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-8v9t9\\\\nI1205 20:12:13.100858 6323 services_controller.go:453] Built service openshift-service-ca-operator/metrics template LB for network=default: []services.LB{}\\\\nI1205 20:12:13.100879 6323 services_controller.go:454] Service openshift-service-ca-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1205 20:12:13.100887 6323 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\
":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dsvd6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:14Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.082652 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89575271b3bd2ad2c040d9e6b00f221610af97e8dd8b120a2956b75f8bbdd237\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:14Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.109562 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aa73309acebd7dcba36d2990a8cf1f6ef1fbf78086a1455b158945063e1da3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b0deb0769ed0ae3221e161c0aab68192a5d5d1f511ccfaf8c6ba2f01334de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:14Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.122357 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gfzvv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5e16893370ed8543ecf33c4b2373bda50a424810470178390a553f2755323d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6gph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gfzvv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:14Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.136233 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:14Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.148814 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cc24b64-e25f-4b55-9123-295388685e7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa5a898deb51018e4a1196537853e1d8fda8121a2b87d6d3ebb4ab04ad6a0507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08368507f1a5c9ee1b93c9a2b111939cbe36a99f87758d0de78b146309871438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffd2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:14Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.154844 4904 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.154873 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.154882 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.154896 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.154905 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:14Z","lastTransitionTime":"2025-12-05T20:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.165779 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-52vmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://140e572b1d15ec9aec68ea9d5b436e7ec2a1e886ed70b12c5ed011a12aa7646a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2
c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/
secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:07Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-52vmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:14Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.181081 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:14Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.194933 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8v9t9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78ec080e-d24e-458b-8622-465dd74773a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c927c7ee8c52a2f1bcca785070514c72b8eafa9cd7d4f58d77fd80f1dee6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj68j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8v9t9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-05T20:12:14Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.212643 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"519e4f29-5762-4a86-9f31-eae681598fd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"(now=2025-12-05 20:12:01.575139975 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575358 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764965521\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764965521\\\\\\\\\\\\\\\" (2025-12-05 19:12:01 +0000 UTC to 2026-12-05 19:12:01 +0000 UTC (now=2025-12-05 20:12:01.57533306 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575383 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 20:12:01.575413 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 20:12:01.575451 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575485 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575526 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-341598907/tls.crt::/tmp/serving-cert-341598907/tls.key\\\\\\\"\\\\nI1205 20:12:01.575666 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1205 20:12:01.575882 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 20:12:01.575895 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 20:12:01.576170 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 20:12:01.576169 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 20:12:01.576203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 20:12:01.576188 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:14Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.224113 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ba4034f-d40d-4c5d-a52b-c9895a8a7f0b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://073f99fe786948da31c584468c3562984559623b590ba77c7925b79a1bb6d973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fb5cbfee6c3da72b94b14ed486036e65308444c961b583e691cc89e604a11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec935cce3c02816ac1f6162ca3a1701cd06dd930d3024ff958a4d92cf1dc0821\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736f67bbeab4ebc7e144d6d4af34487d18c3c8362425e45a9e0969b61d896666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:14Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.236862 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:14Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.257753 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.258014 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.258227 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.258368 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.258496 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:14Z","lastTransitionTime":"2025-12-05T20:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.332674 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t96qs"] Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.333452 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t96qs" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.336014 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.336510 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.361347 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:14Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.361445 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.362038 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.362113 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.362151 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.362173 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:14Z","lastTransitionTime":"2025-12-05T20:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.375484 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8v9t9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78ec080e-d24e-458b-8622-465dd74773a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c927c7ee8c52a2f1bcca785070514c72b8eafa9cd7d4f58d77fd80f1dee6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj68j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8v9t9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:14Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.393800 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cc24b64-e25f-4b55-9123-295388685e7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa5a898deb51018e4a1196537853e1d8fda8121a2b87d6d3ebb4ab04ad6a0507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08368507f1a5c9ee1b93c9a2b111939cbe36a99f87758d0de78b146309871438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffd2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:14Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.409954 4904 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-52vmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://140e572b1d15ec9aec68ea9d5b436e7ec2a1e886ed70b12c5ed011a12aa7646a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-52vmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:14Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.426936 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"519e4f29-5762-4a86-9f31-eae681598fd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"(now=2025-12-05 20:12:01.575139975 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575358 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764965521\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764965521\\\\\\\\\\\\\\\" (2025-12-05 19:12:01 +0000 UTC to 2026-12-05 19:12:01 +0000 UTC (now=2025-12-05 20:12:01.57533306 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575383 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 20:12:01.575413 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 20:12:01.575451 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575485 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575526 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-341598907/tls.crt::/tmp/serving-cert-341598907/tls.key\\\\\\\"\\\\nI1205 20:12:01.575666 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1205 20:12:01.575882 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 20:12:01.575895 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 20:12:01.576170 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 20:12:01.576169 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 20:12:01.576203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 20:12:01.576188 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:14Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.440682 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ba4034f-d40d-4c5d-a52b-c9895a8a7f0b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://073f99fe786948da31c584468c3562984559623b590ba77c7925b79a1bb6d973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fb5cbfee6c3da72b94b14ed486036e65308444c961b583e691cc89e604a11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec935cce3c02816ac1f6162ca3a1701cd06dd930d3024ff958a4d92cf1dc0821\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736f67bbeab4ebc7e144d6d4af34487d18c3c8362425e45a9e0969b61d896666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:14Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.453973 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:14Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.458426 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b5f4b5ef-d8d0-4591-a19c-a8347f74e833-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-t96qs\" (UID: \"b5f4b5ef-d8d0-4591-a19c-a8347f74e833\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t96qs" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.458465 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b5f4b5ef-d8d0-4591-a19c-a8347f74e833-env-overrides\") pod \"ovnkube-control-plane-749d76644c-t96qs\" (UID: \"b5f4b5ef-d8d0-4591-a19c-a8347f74e833\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t96qs" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.458530 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9wkw\" (UniqueName: \"kubernetes.io/projected/b5f4b5ef-d8d0-4591-a19c-a8347f74e833-kube-api-access-r9wkw\") pod \"ovnkube-control-plane-749d76644c-t96qs\" (UID: \"b5f4b5ef-d8d0-4591-a19c-a8347f74e833\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t96qs" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.458609 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b5f4b5ef-d8d0-4591-a19c-a8347f74e833-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-t96qs\" (UID: \"b5f4b5ef-d8d0-4591-a19c-a8347f74e833\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t96qs" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.465119 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.465117 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t96qs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5f4b5ef-d8d0-4591-a19c-a8347f74e833\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9wkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9wkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t96qs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:14Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.465161 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:14 crc 
kubenswrapper[4904]: I1205 20:12:14.465265 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.465285 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.465297 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:14Z","lastTransitionTime":"2025-12-05T20:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.478651 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89575271b3bd2ad2c040d9e6b00f221610af97e8dd8b120a2956b75f8bbdd237\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:14Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.491823 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aa73309acebd7dcba36d2990a8cf1f6ef1fbf78086a1455b158945063e1da3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b0deb0769ed0ae3221e161c0aab68192a5d5d1f511ccfaf8c6ba2f01334de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:14Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.506087 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0577f397a9f6d93347cbc4c8693307a1df70636365bb481d910f29aa70a3147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:14Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.517776 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-67k68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cf77c02028027dc6d07f225e1f2352cd58a2a96280315e23f43b50e7c719cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzqtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-67k68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:14Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.551004 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55fbdf03-712c-4abc-9847-225fe63052e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://754660b165d5db2e99f7a7f53bb46148e13677fd0e632c71885bcc93f8c01864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dda21ae586da86cd92ac5775c3cfe70a9bdcb3c568311f285327c27a845fe2a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:12:11Z\\\",\\\"message\\\":\\\".go:208] Removed *v1.Pod event handler 3\\\\nI1205 20:12:11.008936 6194 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 20:12:11.009181 6194 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 20:12:11.009389 6194 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 20:12:11.009536 6194 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 20:12:11.009565 6194 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1205 20:12:11.009752 6194 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 20:12:11.010052 6194 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 20:12:11.010115 6194 factory.go:656] Stopping watch factory\\\\nI1205 20:12:11.010141 6194 ovnkube.go:599] Stopped ovnkube\\\\nI1205 2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://754660b165d5db2e99f7a7f53bb46148e13677fd0e632c71885bcc93f8c01864\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:12:13Z\\\",\\\"message\\\":\\\"r/kube-controller-manager-crc in node crc\\\\nI1205 20:12:13.100315 6323 services_controller.go:452] Built service openshift-service-ca-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1205 20:12:13.100853 6323 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1205 20:12:13.100859 6323 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1205 20:12:13.100847 6323 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-8v9t9 after 0 failed attempt(s)\\\\nI1205 20:12:13.100867 6323 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-8v9t9\\\\nI1205 20:12:13.100858 6323 services_controller.go:453] Built service openshift-service-ca-operator/metrics template LB for network=default: []services.LB{}\\\\nI1205 
20:12:13.100879 6323 services_controller.go:454] Service openshift-service-ca-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1205 20:12:13.100887 6323 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-ap
i-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dsvd6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:14Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.559603 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9wkw\" (UniqueName: \"kubernetes.io/projected/b5f4b5ef-d8d0-4591-a19c-a8347f74e833-kube-api-access-r9wkw\") pod \"ovnkube-control-plane-749d76644c-t96qs\" (UID: \"b5f4b5ef-d8d0-4591-a19c-a8347f74e833\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t96qs" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.559722 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b5f4b5ef-d8d0-4591-a19c-a8347f74e833-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-t96qs\" (UID: \"b5f4b5ef-d8d0-4591-a19c-a8347f74e833\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t96qs" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.559768 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b5f4b5ef-d8d0-4591-a19c-a8347f74e833-env-overrides\") pod \"ovnkube-control-plane-749d76644c-t96qs\" (UID: \"b5f4b5ef-d8d0-4591-a19c-a8347f74e833\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t96qs" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.559817 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b5f4b5ef-d8d0-4591-a19c-a8347f74e833-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-t96qs\" (UID: \"b5f4b5ef-d8d0-4591-a19c-a8347f74e833\") " 
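
[Editor's note] Every "Failed to update status for pod" entry above fails the same way: the kubelet's POST to the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743/pod is rejected during the TLS handshake because the webhook's serving certificate expired on 2025-08-24T17:21:41Z, long before the current clock time of 2025-12-05T20:12:14Z. The exact wording ("x509: certificate has expired or is not yet valid: current time ... is after ...") matches the message Go's crypto/x509 produces when the verification time falls outside the certificate's validity window. A minimal sketch of that check, assuming a placeholder file name webhook-cert.pem for the webhook's serving certificate (illustrative editor-added code, not kubelet or ovn-kubernetes source):

    package main

    import (
        "crypto/x509"
        "encoding/pem"
        "fmt"
        "os"
        "time"
    )

    func main() {
        // Placeholder path; substitute the webhook's actual serving cert.
        pemBytes, err := os.ReadFile("webhook-cert.pem")
        if err != nil {
            fmt.Fprintln(os.Stderr, err)
            os.Exit(1)
        }
        block, _ := pem.Decode(pemBytes)
        if block == nil {
            fmt.Fprintln(os.Stderr, "no PEM block found")
            os.Exit(1)
        }
        cert, err := x509.ParseCertificate(block.Bytes)
        if err != nil {
            fmt.Fprintln(os.Stderr, err)
            os.Exit(1)
        }
        now := time.Now()
        // crypto/x509 returns CertificateInvalidError{Reason: x509.Expired}
        // when now is outside [NotBefore, NotAfter]; the TLS handshake then
        // surfaces it as "certificate has expired or is not yet valid".
        switch {
        case now.After(cert.NotAfter):
            fmt.Printf("expired: current time %s is after %s\n",
                now.Format(time.RFC3339), cert.NotAfter.Format(time.RFC3339))
        case now.Before(cert.NotBefore):
            fmt.Printf("not yet valid: current time %s is before %s\n",
                now.Format(time.RFC3339), cert.NotBefore.Format(time.RFC3339))
        default:
            fmt.Println("certificate is within its validity window")
        }
    }

As a follow-on diagnostic, one could connect to 127.0.0.1:9743 with a tls.Config{InsecureSkipVerify: true} client and inspect ConnectionState().PeerCertificates[0].NotAfter to confirm what the webhook is actually serving.
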
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t96qs" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.561008 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b5f4b5ef-d8d0-4591-a19c-a8347f74e833-env-overrides\") pod \"ovnkube-control-plane-749d76644c-t96qs\" (UID: \"b5f4b5ef-d8d0-4591-a19c-a8347f74e833\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t96qs" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.561468 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b5f4b5ef-d8d0-4591-a19c-a8347f74e833-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-t96qs\" (UID: \"b5f4b5ef-d8d0-4591-a19c-a8347f74e833\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t96qs" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.566962 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b5f4b5ef-d8d0-4591-a19c-a8347f74e833-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-t96qs\" (UID: \"b5f4b5ef-d8d0-4591-a19c-a8347f74e833\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t96qs" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.567652 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:14Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.568238 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.568297 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.568322 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.568352 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.568372 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:14Z","lastTransitionTime":"2025-12-05T20:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.583337 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gfzvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5e16893370ed8543ecf33c4b2373bda50a424810470178390a553f2755323d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6gph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gfzvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:14Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.601289 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9wkw\" (UniqueName: \"kubernetes.io/projected/b5f4b5ef-d8d0-4591-a19c-a8347f74e833-kube-api-access-r9wkw\") pod \"ovnkube-control-plane-749d76644c-t96qs\" (UID: \"b5f4b5ef-d8d0-4591-a19c-a8347f74e833\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t96qs" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.659732 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t96qs" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.671363 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.671435 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.671458 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.671488 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.671510 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:14Z","lastTransitionTime":"2025-12-05T20:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:14 crc kubenswrapper[4904]: W1205 20:12:14.676007 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5f4b5ef_d8d0_4591_a19c_a8347f74e833.slice/crio-840a8a06d69b8283217176d38382da53141fbcf85cff7bd89b6593c5661fd02d WatchSource:0}: Error finding container 840a8a06d69b8283217176d38382da53141fbcf85cff7bd89b6593c5661fd02d: Status 404 returned error can't find the container with id 840a8a06d69b8283217176d38382da53141fbcf85cff7bd89b6593c5661fd02d Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.680372 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.680411 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.680475 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:14 crc kubenswrapper[4904]: E1205 20:12:14.680527 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:12:14 crc kubenswrapper[4904]: E1205 20:12:14.680668 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:12:14 crc kubenswrapper[4904]: E1205 20:12:14.680856 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.731312 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.731385 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.731397 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.731415 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.731427 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:14Z","lastTransitionTime":"2025-12-05T20:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:14 crc kubenswrapper[4904]: E1205 20:12:14.746333 4904 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"54b04eb7-ddef-4ce3-9daa-6d051611390c\\\",\\\"systemUUID\\\":\\\"cfb4a8d1-bf0d-4f7a-95e4-63a42b6fb559\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:14Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.751043 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.751099 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
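
[Editor's note] The patch payloads in these "failed to patch status" entries are hard to read because they are quoted twice: klog quotes the whole err value, and the JSON patch inside it is quoted again, which is why every JSON quote appears as \\\" in the journal. Each level of quoting can be undone with strconv.Unquote; the sketch below undoes the innermost level on a trimmed-down sample assembled from the ovnkube-node-dsvd6 entry above (a hypothetical excerpt — any full payload from the log decodes the same way) and pretty-prints the result:

    package main

    import (
        "bytes"
        "encoding/json"
        "fmt"
        "os"
        "strconv"
    )

    func main() {
        // One remaining level of quoting, as it appears once klog's outer
        // err quoting has already been removed (trimmed sample).
        quoted := `"{\"metadata\":{\"uid\":\"55fbdf03-712c-4abc-9847-225fe63052e3\"},\"status\":{\"phase\":\"Running\"}}"`
        raw, err := strconv.Unquote(quoted)
        if err != nil {
            fmt.Fprintln(os.Stderr, err)
            os.Exit(1)
        }
        // Indent the now-plain JSON for readability.
        var out bytes.Buffer
        if err := json.Indent(&out, []byte(raw), "", "  "); err != nil {
            fmt.Fprintln(os.Stderr, err)
            os.Exit(1)
        }
        fmt.Println(out.String())
    }
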
event="NodeHasNoDiskPressure" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.751110 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.751126 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.751137 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:14Z","lastTransitionTime":"2025-12-05T20:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:14 crc kubenswrapper[4904]: E1205 20:12:14.765159 4904 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"54b04eb7-ddef-4ce3-9daa-6d051611390c\\\",\\\"systemUUID\\\":\\\"cfb4a8d1-bf0d-4f7a-95e4-63a42b6fb559\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:14Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.768935 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.768986 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.768999 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.769017 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.769028 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:14Z","lastTransitionTime":"2025-12-05T20:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:14 crc kubenswrapper[4904]: E1205 20:12:14.780499 4904 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"54b04eb7-ddef-4ce3-9daa-6d051611390c\\\",\\\"systemUUID\\\":\\\"cfb4a8d1-bf0d-4f7a-95e4-63a42b6fb559\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:14Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.784247 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.784313 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.784336 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.784367 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.784389 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:14Z","lastTransitionTime":"2025-12-05T20:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:14 crc kubenswrapper[4904]: E1205 20:12:14.797011 4904 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"54b04eb7-ddef-4ce3-9daa-6d051611390c\\\",\\\"systemUUID\\\":\\\"cfb4a8d1-bf0d-4f7a-95e4-63a42b6fb559\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:14Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.800857 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.800893 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.800905 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.800923 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.800932 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:14Z","lastTransitionTime":"2025-12-05T20:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:14 crc kubenswrapper[4904]: E1205 20:12:14.815828 4904 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"54b04eb7-ddef-4ce3-9daa-6d051611390c\\\",\\\"systemUUID\\\":\\\"cfb4a8d1-bf0d-4f7a-95e4-63a42b6fb559\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:14Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:14 crc kubenswrapper[4904]: E1205 20:12:14.815989 4904 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.817974 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.818026 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.818040 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.818076 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.818088 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:14Z","lastTransitionTime":"2025-12-05T20:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.920947 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.920997 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.921011 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.921029 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:14 crc kubenswrapper[4904]: I1205 20:12:14.921040 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:14Z","lastTransitionTime":"2025-12-05T20:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.009398 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dsvd6_55fbdf03-712c-4abc-9847-225fe63052e3/ovnkube-controller/1.log" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.016585 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t96qs" event={"ID":"b5f4b5ef-d8d0-4591-a19c-a8347f74e833","Type":"ContainerStarted","Data":"840a8a06d69b8283217176d38382da53141fbcf85cff7bd89b6593c5661fd02d"} Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.023770 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.023807 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.023819 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.023836 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.023851 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:15Z","lastTransitionTime":"2025-12-05T20:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.125852 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.125909 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.125926 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.125950 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.125967 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:15Z","lastTransitionTime":"2025-12-05T20:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.229014 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.229115 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.229134 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.229159 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.229177 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:15Z","lastTransitionTime":"2025-12-05T20:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.331834 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.331889 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.331909 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.331933 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.331950 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:15Z","lastTransitionTime":"2025-12-05T20:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.434481 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.434539 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.434555 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.434579 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.434597 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:15Z","lastTransitionTime":"2025-12-05T20:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.454458 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-d8xkk"] Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.455238 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:12:15 crc kubenswrapper[4904]: E1205 20:12:15.455521 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.473901 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8v9t9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78ec080e-d24e-458b-8622-465dd74773a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c927c7ee8c52a2f1bcca785070514c72b8eafa9cd7d4f58d77fd80f1dee6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj68j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8v9t9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:15Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:15 crc kubenswrapper[4904]: 
I1205 20:12:15.493624 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cc24b64-e25f-4b55-9123-295388685e7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa5a898deb51018e4a1196537853e1d8fda8121a2b87d6d3ebb4ab04ad6a0507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08368507f1a5c9ee1b93c9a2b111939cbe36a99f87758d0de78b146309871438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffd2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not 
yet valid: current time 2025-12-05T20:12:15Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.551938 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.551980 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.551992 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.552009 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.552020 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:15Z","lastTransitionTime":"2025-12-05T20:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.567962 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7fjl\" (UniqueName: \"kubernetes.io/projected/fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91-kube-api-access-m7fjl\") pod \"network-metrics-daemon-d8xkk\" (UID: \"fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91\") " pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.568208 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91-metrics-certs\") pod \"network-metrics-daemon-d8xkk\" (UID: \"fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91\") " pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.574025 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-52vmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://140e572b1d15ec9aec68ea9d5b436e7ec2a1e886ed70b12c5ed011a12aa7646a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-52vmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:15Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.585117 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d8xkk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7fjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7fjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d8xkk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:15Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.597260 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:15Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.618929 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"519e4f29-5762-4a86-9f31-eae681598fd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"(now=2025-12-05 20:12:01.575139975 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575358 1 
named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764965521\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764965521\\\\\\\\\\\\\\\" (2025-12-05 19:12:01 +0000 UTC to 2026-12-05 19:12:01 +0000 UTC (now=2025-12-05 20:12:01.57533306 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575383 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 20:12:01.575413 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 20:12:01.575451 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575485 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575526 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-341598907/tls.crt::/tmp/serving-cert-341598907/tls.key\\\\\\\"\\\\nI1205 20:12:01.575666 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1205 20:12:01.575882 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 20:12:01.575895 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 20:12:01.576170 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 20:12:01.576169 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 20:12:01.576203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 20:12:01.576188 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:15Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.637563 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ba4034f-d40d-4c5d-a52b-c9895a8a7f0b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://073f99fe786948da31c584468c3562984559623b590ba77c7925b79a1bb6d973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fb5cbfee6c3da72b94b14ed486036e65308444c961b583e691cc89e604a11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec935cce3c02816ac1f6162ca3a1701cd06dd930d3024ff958a4d92cf1dc0821\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736f67bbeab4ebc7e144d6d4af34487d18c3c8362425e45a9e0969b61d896666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:15Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.651554 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:15Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.655825 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.655848 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.655857 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.655871 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.655881 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:15Z","lastTransitionTime":"2025-12-05T20:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.666289 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t96qs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5f4b5ef-d8d0-4591-a19c-a8347f74e833\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9wkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9wkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t96qs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-05T20:12:15Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.668904 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91-metrics-certs\") pod \"network-metrics-daemon-d8xkk\" (UID: \"fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91\") " pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.668948 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7fjl\" (UniqueName: \"kubernetes.io/projected/fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91-kube-api-access-m7fjl\") pod \"network-metrics-daemon-d8xkk\" (UID: \"fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91\") " pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:12:15 crc kubenswrapper[4904]: E1205 20:12:15.669137 4904 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:12:15 crc kubenswrapper[4904]: E1205 20:12:15.669223 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91-metrics-certs podName:fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91 nodeName:}" failed. No retries permitted until 2025-12-05 20:12:16.169203169 +0000 UTC m=+34.980419368 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91-metrics-certs") pod "network-metrics-daemon-d8xkk" (UID: "fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.683567 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aa73309acebd7dcba36d2990a8cf1f6ef1fbf78086a1455b158945063e1da3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b0deb0769ed0ae3221e161c0aab68192a5d5d1f511ccfaf8c6ba2f01334de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:15Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.695723 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7fjl\" (UniqueName: \"kubernetes.io/projected/fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91-kube-api-access-m7fjl\") pod \"network-metrics-daemon-d8xkk\" (UID: \"fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91\") " pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.702655 4904 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0577f397a9f6d93347cbc4c8693307a1df70636365bb481d910f29aa70a3147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:15Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.713983 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-67k68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cf77c02028027dc6d07f225e1f2352cd58a2a96280315e23f43b50e7c719cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzqtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-67k68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:15Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.731897 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55fbdf03-712c-4abc-9847-225fe63052e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://754660b165d5db2e99f7a7f53bb46148e13677fd0e632c71885bcc93f8c01864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dda21ae586da86cd92ac5775c3cfe70a9bdcb3c568311f285327c27a845fe2a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:12:11Z\\\",\\\"message\\\":\\\".go:208] Removed *v1.Pod event handler 3\\\\nI1205 20:12:11.008936 6194 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 20:12:11.009181 6194 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 20:12:11.009389 6194 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 20:12:11.009536 6194 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 20:12:11.009565 6194 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1205 20:12:11.009752 6194 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 20:12:11.010052 6194 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 20:12:11.010115 6194 factory.go:656] Stopping watch factory\\\\nI1205 20:12:11.010141 6194 ovnkube.go:599] Stopped ovnkube\\\\nI1205 2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://754660b165d5db2e99f7a7f53bb46148e13677fd0e632c71885bcc93f8c01864\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:12:13Z\\\",\\\"message\\\":\\\"r/kube-controller-manager-crc in node crc\\\\nI1205 20:12:13.100315 6323 services_controller.go:452] Built service openshift-service-ca-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1205 20:12:13.100853 6323 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1205 20:12:13.100859 6323 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1205 20:12:13.100847 6323 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-8v9t9 after 0 failed attempt(s)\\\\nI1205 20:12:13.100867 6323 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-8v9t9\\\\nI1205 20:12:13.100858 6323 services_controller.go:453] Built service openshift-service-ca-operator/metrics template LB for network=default: []services.LB{}\\\\nI1205 
20:12:13.100879 6323 services_controller.go:454] Service openshift-service-ca-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1205 20:12:13.100887 6323 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-ap
i-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dsvd6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:15Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.748203 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89575271b3bd2ad2c040d9e6b00f221610af97e8dd8b120a2956b75f8bbdd237\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:15Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.758828 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.758868 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.758878 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.758893 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.758904 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:15Z","lastTransitionTime":"2025-12-05T20:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.762798 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:15Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.777176 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gfzvv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5e16893370ed8543ecf33c4b2373bda50a424810470178390a553f2755323d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6gph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gfzvv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:15Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.861473 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.861517 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.861530 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.861549 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.861563 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:15Z","lastTransitionTime":"2025-12-05T20:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.964290 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.964319 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.964329 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.964345 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:15 crc kubenswrapper[4904]: I1205 20:12:15.964357 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:15Z","lastTransitionTime":"2025-12-05T20:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.022264 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t96qs" event={"ID":"b5f4b5ef-d8d0-4591-a19c-a8347f74e833","Type":"ContainerStarted","Data":"7b9a9e4321317cf95ec97bae86da81152ecba3d49e499d83e113d89e8b997247"} Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.022315 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t96qs" event={"ID":"b5f4b5ef-d8d0-4591-a19c-a8347f74e833","Type":"ContainerStarted","Data":"6624b0a035b3190da055f2ec82cb45b59268d035933912d9b87a20fd4c1cf128"} Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.040928 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89575271b3bd2ad2c040d9e6b00f221610af97e8dd8b120a2956b75f8bbdd237\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.057612 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aa73309acebd7dcba36d2990a8cf1f6ef1fbf78086a1455b158945063e1da3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b0deb0769ed0ae3221e161c0aab68192a5d5d1f511ccfaf8c6ba2f01334de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.066761 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.067029 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.067232 4904 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.067600 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.067743 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:16Z","lastTransitionTime":"2025-12-05T20:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.075164 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0577f397a9f6d93347cbc4c8693307a1df70636365bb481d910f29aa70a3147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.088793 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-67k68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cf77c02028027dc6d07f225e1f2352cd58a2a96280315e23f43b50e7c719cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzqtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-67k68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.108275 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55fbdf03-712c-4abc-9847-225fe63052e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://754660b165d5db2e99f7a7f53bb46148e13677fd0e632c71885bcc93f8c01864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dda21ae586da86cd92ac5775c3cfe70a9bdcb3c568311f285327c27a845fe2a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:12:11Z\\\",\\\"message\\\":\\\".go:208] Removed *v1.Pod event handler 3\\\\nI1205 20:12:11.008936 6194 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 20:12:11.009181 6194 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 20:12:11.009389 6194 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 20:12:11.009536 6194 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 20:12:11.009565 6194 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1205 20:12:11.009752 6194 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 20:12:11.010052 6194 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 20:12:11.010115 6194 factory.go:656] Stopping watch factory\\\\nI1205 20:12:11.010141 6194 ovnkube.go:599] Stopped ovnkube\\\\nI1205 2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://754660b165d5db2e99f7a7f53bb46148e13677fd0e632c71885bcc93f8c01864\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:12:13Z\\\",\\\"message\\\":\\\"r/kube-controller-manager-crc in node crc\\\\nI1205 20:12:13.100315 6323 services_controller.go:452] Built service openshift-service-ca-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1205 20:12:13.100853 6323 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1205 20:12:13.100859 6323 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1205 20:12:13.100847 6323 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-8v9t9 after 0 failed attempt(s)\\\\nI1205 20:12:13.100867 6323 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-8v9t9\\\\nI1205 20:12:13.100858 6323 services_controller.go:453] Built service openshift-service-ca-operator/metrics template LB for network=default: []services.LB{}\\\\nI1205 
20:12:13.100879 6323 services_controller.go:454] Service openshift-service-ca-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1205 20:12:13.100887 6323 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-ap
i-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dsvd6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.123381 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.134135 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gfzvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5e16893370ed8543ecf33c4b2373bda50a424810470178390a553f2755323d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6gph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gfzvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.144680 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.153601 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8v9t9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78ec080e-d24e-458b-8622-465dd74773a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c927c7ee8c52a2f1bcca785070514c72b8eafa9cd7d4f58d77fd80f1dee6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj68j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8v9t9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-05T20:12:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.163381 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cc24b64-e25f-4b55-9123-295388685e7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa5a898deb51018e4a1196537853e1d8fda8121a2b87d6d3ebb4ab04ad6a0507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08368507f1a5c9ee1b93c9a2b111939cbe36a99f87758d0de78b146309871438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffd2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.169998 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.170022 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.170031 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.170044 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.170053 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:16Z","lastTransitionTime":"2025-12-05T20:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.172609 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91-metrics-certs\") pod \"network-metrics-daemon-d8xkk\" (UID: \"fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91\") " pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:12:16 crc kubenswrapper[4904]: E1205 20:12:16.172760 4904 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:12:16 crc kubenswrapper[4904]: E1205 20:12:16.172811 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91-metrics-certs podName:fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91 nodeName:}" failed. No retries permitted until 2025-12-05 20:12:17.172797404 +0000 UTC m=+35.984013513 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91-metrics-certs") pod "network-metrics-daemon-d8xkk" (UID: "fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.176753 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-52vmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://140e572b1d15ec9aec68ea9d5b436e7ec2a1e886ed70b12c5ed011a12aa7646a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072
\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-52vmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.185144 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d8xkk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7fjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7fjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d8xkk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.199407 4904 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"519e4f29-5762-4a86-9f31-eae681598fd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"(now=2025-12-05 20:12:01.575139975 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575358 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764965521\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764965521\\\\\\\\\\\\\\\" (2025-12-05 19:12:01 +0000 UTC to 2026-12-05 19:12:01 +0000 UTC (now=2025-12-05 20:12:01.57533306 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575383 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 20:12:01.575413 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 20:12:01.575451 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575485 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575526 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-341598907/tls.crt::/tmp/serving-cert-341598907/tls.key\\\\\\\"\\\\nI1205 20:12:01.575666 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1205 20:12:01.575882 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 20:12:01.575895 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 20:12:01.576170 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 20:12:01.576169 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 20:12:01.576203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 20:12:01.576188 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.213908 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ba4034f-d40d-4c5d-a52b-c9895a8a7f0b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://073f99fe786948da31c584468c3562984559623b590ba77c7925b79a1bb6d973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fb5cbfee6c3da72b94b14ed486036e65308444c961b583e691cc89e604a11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec935cce3c02816ac1f6162ca3a1701cd06dd930d3024ff958a4d92cf1dc0821\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736f67bbeab4ebc7e144d6d4af34487d18c3c8362425e45a9e0969b61d896666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.224542 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.235119 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t96qs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5f4b5ef-d8d0-4591-a19c-a8347f74e833\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6624b0a035b3190da055f2ec82cb45b59268d035933912d9b87a20fd4c1cf128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9wkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b9a9e4321317cf95ec97bae86da81152ecba3d49e499d83e113d89e8b997247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2
099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9wkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t96qs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.272724 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.272812 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.272832 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.272857 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.272874 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:16Z","lastTransitionTime":"2025-12-05T20:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.376139 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.376194 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.376211 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.376235 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.376251 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:16Z","lastTransitionTime":"2025-12-05T20:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.479205 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.479262 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.479278 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.479300 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.479318 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:16Z","lastTransitionTime":"2025-12-05T20:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.582571 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.582632 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.582653 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.582682 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.582744 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:16Z","lastTransitionTime":"2025-12-05T20:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.680289 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.680343 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.680289 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:12:16 crc kubenswrapper[4904]: E1205 20:12:16.680459 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.680970 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:16 crc kubenswrapper[4904]: E1205 20:12:16.681204 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91" Dec 05 20:12:16 crc kubenswrapper[4904]: E1205 20:12:16.681369 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:12:16 crc kubenswrapper[4904]: E1205 20:12:16.681463 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.685218 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.685267 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.685279 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.685295 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.685308 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:16Z","lastTransitionTime":"2025-12-05T20:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.787916 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.787993 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.788014 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.788039 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.788096 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:16Z","lastTransitionTime":"2025-12-05T20:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.879467 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:16 crc kubenswrapper[4904]: E1205 20:12:16.879631 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:32.879597427 +0000 UTC m=+51.690813576 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.879699 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.879810 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.879870 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.879923 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:16 crc kubenswrapper[4904]: E1205 20:12:16.879968 4904 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:12:16 crc kubenswrapper[4904]: E1205 20:12:16.880007 4904 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:12:16 crc kubenswrapper[4904]: E1205 20:12:16.880021 4904 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:12:16 crc kubenswrapper[4904]: E1205 20:12:16.880039 4904 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:12:16 crc kubenswrapper[4904]: E1205 20:12:16.880123 4904 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:12:16 crc 
kubenswrapper[4904]: E1205 20:12:16.880136 4904 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:12:16 crc kubenswrapper[4904]: E1205 20:12:16.880178 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:12:32.880149632 +0000 UTC m=+51.691365771 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:12:16 crc kubenswrapper[4904]: E1205 20:12:16.880178 4904 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:12:16 crc kubenswrapper[4904]: E1205 20:12:16.880207 4904 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:12:16 crc kubenswrapper[4904]: E1205 20:12:16.880212 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:12:32.880189273 +0000 UTC m=+51.691405412 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:12:16 crc kubenswrapper[4904]: E1205 20:12:16.880307 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 20:12:32.880293256 +0000 UTC m=+51.691509405 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:12:16 crc kubenswrapper[4904]: E1205 20:12:16.880331 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 20:12:32.880319576 +0000 UTC m=+51.691535725 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.891450 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.891514 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.891533 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.891567 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.891593 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:16Z","lastTransitionTime":"2025-12-05T20:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.995266 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.995323 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.995339 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.995364 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:16 crc kubenswrapper[4904]: I1205 20:12:16.995382 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:16Z","lastTransitionTime":"2025-12-05T20:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:17 crc kubenswrapper[4904]: I1205 20:12:17.098265 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:17 crc kubenswrapper[4904]: I1205 20:12:17.098330 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:17 crc kubenswrapper[4904]: I1205 20:12:17.098339 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:17 crc kubenswrapper[4904]: I1205 20:12:17.098359 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:17 crc kubenswrapper[4904]: I1205 20:12:17.098379 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:17Z","lastTransitionTime":"2025-12-05T20:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:17 crc kubenswrapper[4904]: I1205 20:12:17.184250 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91-metrics-certs\") pod \"network-metrics-daemon-d8xkk\" (UID: \"fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91\") " pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:12:17 crc kubenswrapper[4904]: E1205 20:12:17.184476 4904 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:12:17 crc kubenswrapper[4904]: E1205 20:12:17.184709 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91-metrics-certs podName:fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91 nodeName:}" failed. No retries permitted until 2025-12-05 20:12:19.184688915 +0000 UTC m=+37.995905024 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91-metrics-certs") pod "network-metrics-daemon-d8xkk" (UID: "fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:12:17 crc kubenswrapper[4904]: I1205 20:12:17.200489 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:17 crc kubenswrapper[4904]: I1205 20:12:17.200531 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:17 crc kubenswrapper[4904]: I1205 20:12:17.200540 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:17 crc kubenswrapper[4904]: I1205 20:12:17.200613 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:17 crc kubenswrapper[4904]: I1205 20:12:17.200630 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:17Z","lastTransitionTime":"2025-12-05T20:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:17 crc kubenswrapper[4904]: I1205 20:12:17.303482 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:17 crc kubenswrapper[4904]: I1205 20:12:17.303529 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:17 crc kubenswrapper[4904]: I1205 20:12:17.303540 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:17 crc kubenswrapper[4904]: I1205 20:12:17.303552 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:17 crc kubenswrapper[4904]: I1205 20:12:17.303561 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:17Z","lastTransitionTime":"2025-12-05T20:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:17 crc kubenswrapper[4904]: I1205 20:12:17.406163 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:17 crc kubenswrapper[4904]: I1205 20:12:17.406228 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:17 crc kubenswrapper[4904]: I1205 20:12:17.406241 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:17 crc kubenswrapper[4904]: I1205 20:12:17.406257 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:17 crc kubenswrapper[4904]: I1205 20:12:17.406267 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:17Z","lastTransitionTime":"2025-12-05T20:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:17 crc kubenswrapper[4904]: I1205 20:12:17.508278 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:17 crc kubenswrapper[4904]: I1205 20:12:17.508326 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:17 crc kubenswrapper[4904]: I1205 20:12:17.508341 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:17 crc kubenswrapper[4904]: I1205 20:12:17.508362 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:17 crc kubenswrapper[4904]: I1205 20:12:17.508376 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:17Z","lastTransitionTime":"2025-12-05T20:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:17 crc kubenswrapper[4904]: I1205 20:12:17.611382 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:17 crc kubenswrapper[4904]: I1205 20:12:17.611428 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:17 crc kubenswrapper[4904]: I1205 20:12:17.611438 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:17 crc kubenswrapper[4904]: I1205 20:12:17.611455 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:17 crc kubenswrapper[4904]: I1205 20:12:17.611466 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:17Z","lastTransitionTime":"2025-12-05T20:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:17 crc kubenswrapper[4904]: I1205 20:12:17.681482 4904 scope.go:117] "RemoveContainer" containerID="eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe" Dec 05 20:12:17 crc kubenswrapper[4904]: I1205 20:12:17.758844 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:17 crc kubenswrapper[4904]: I1205 20:12:17.758934 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:17 crc kubenswrapper[4904]: I1205 20:12:17.758960 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:17 crc kubenswrapper[4904]: I1205 20:12:17.758988 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:17 crc kubenswrapper[4904]: I1205 20:12:17.759014 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:17Z","lastTransitionTime":"2025-12-05T20:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:17 crc kubenswrapper[4904]: I1205 20:12:17.861083 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:17 crc kubenswrapper[4904]: I1205 20:12:17.861112 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:17 crc kubenswrapper[4904]: I1205 20:12:17.861123 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:17 crc kubenswrapper[4904]: I1205 20:12:17.861138 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:17 crc kubenswrapper[4904]: I1205 20:12:17.861149 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:17Z","lastTransitionTime":"2025-12-05T20:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:17 crc kubenswrapper[4904]: I1205 20:12:17.963902 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:17 crc kubenswrapper[4904]: I1205 20:12:17.964144 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:17 crc kubenswrapper[4904]: I1205 20:12:17.964229 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:17 crc kubenswrapper[4904]: I1205 20:12:17.964304 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:17 crc kubenswrapper[4904]: I1205 20:12:17.964367 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:17Z","lastTransitionTime":"2025-12-05T20:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.067366 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.067437 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.067461 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.067491 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.067509 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:18Z","lastTransitionTime":"2025-12-05T20:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.170618 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.170667 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.170686 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.170709 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.170727 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:18Z","lastTransitionTime":"2025-12-05T20:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.272672 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.272717 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.272728 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.272746 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.272759 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:18Z","lastTransitionTime":"2025-12-05T20:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.374668 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.374716 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.374727 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.374750 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.374761 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:18Z","lastTransitionTime":"2025-12-05T20:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.480666 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.481289 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.481310 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.481336 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.481354 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:18Z","lastTransitionTime":"2025-12-05T20:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.583734 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.583790 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.583807 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.583831 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.583848 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:18Z","lastTransitionTime":"2025-12-05T20:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.654726 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.655464 4904 scope.go:117] "RemoveContainer" containerID="754660b165d5db2e99f7a7f53bb46148e13677fd0e632c71885bcc93f8c01864" Dec 05 20:12:18 crc kubenswrapper[4904]: E1205 20:12:18.655605 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-dsvd6_openshift-ovn-kubernetes(55fbdf03-712c-4abc-9847-225fe63052e3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.667702 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8v9t9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78ec080e-d24e-458b-8622-465dd74773a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c927c7ee8c52a2f1bcca785070514c72b8eafa9cd7d4f58d77fd80f1dee6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj68j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8v9t9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:18Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.680554 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.680577 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.680573 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.680632 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:12:18 crc kubenswrapper[4904]: E1205 20:12:18.680693 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:12:18 crc kubenswrapper[4904]: E1205 20:12:18.680810 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:12:18 crc kubenswrapper[4904]: E1205 20:12:18.680975 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:12:18 crc kubenswrapper[4904]: E1205 20:12:18.681030 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.683773 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cc24b64-e25f-4b55-9123-295388685e7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa5a898deb51018e4a1196537853e1d8fda8121a2b87d6d3ebb4ab04ad6a0507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08368507f1a5c9ee1b93c9a2b111939cbe36a99f87758d0de78b146309871438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffd2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:18Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.686110 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.686158 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.686171 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.686193 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.686207 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:18Z","lastTransitionTime":"2025-12-05T20:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.701533 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-52vmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://140e572b1d15ec9aec68ea9d5b436e7ec2a1e886ed70b12c5ed011a12aa7646a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91d3391fbe
15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":
\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-52vmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:18Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.714484 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d8xkk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7fjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7fjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d8xkk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:18Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.729863 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:18Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.747017 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"519e4f29-5762-4a86-9f31-eae681598fd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"(now=2025-12-05 20:12:01.575139975 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575358 1 
named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764965521\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764965521\\\\\\\\\\\\\\\" (2025-12-05 19:12:01 +0000 UTC to 2026-12-05 19:12:01 +0000 UTC (now=2025-12-05 20:12:01.57533306 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575383 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 20:12:01.575413 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 20:12:01.575451 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575485 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575526 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-341598907/tls.crt::/tmp/serving-cert-341598907/tls.key\\\\\\\"\\\\nI1205 20:12:01.575666 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1205 20:12:01.575882 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 20:12:01.575895 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 20:12:01.576170 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 20:12:01.576169 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 20:12:01.576203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 20:12:01.576188 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:18Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.761402 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ba4034f-d40d-4c5d-a52b-c9895a8a7f0b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://073f99fe786948da31c584468c3562984559623b590ba77c7925b79a1bb6d973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fb5cbfee6c3da72b94b14ed486036e65308444c961b583e691cc89e604a11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec935cce3c02816ac1f6162ca3a1701cd06dd930d3024ff958a4d92cf1dc0821\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736f67bbeab4ebc7e144d6d4af34487d18c3c8362425e45a9e0969b61d896666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:18Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.772874 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:18Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.786220 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t96qs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5f4b5ef-d8d0-4591-a19c-a8347f74e833\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6624b0a035b3190da055f2ec82cb45b59268d035933912d9b87a20fd4c1cf128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9wkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b9a9e4321317cf95ec97bae86da81152ecba3d49e499d83e113d89e8b997247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2
099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9wkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t96qs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:18Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.788193 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.788322 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.788414 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.788494 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.788565 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:18Z","lastTransitionTime":"2025-12-05T20:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.801446 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aa73309acebd7dcba36d2990a8cf1f6ef1fbf78086a1455b158945063e1da3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b0deb0769ed0ae3221e161c0aab68192a5d5d1f511ccfaf8c6ba2f01334de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:18Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.814262 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0577f397a9f6d93347cbc4c8693307a1df70636365bb481d910f29aa70a3147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:18Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.824361 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-67k68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cf77c02028027dc6d07f225e1f2352cd58a2a96280315e23f43b50e7c719cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzqtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-67k68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:18Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.844028 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55fbdf03-712c-4abc-9847-225fe63052e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://754660b165d5db2e99f7a7f53bb46148e13677fd0e632c71885bcc93f8c01864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://754660b165d5db2e99f7a7f53bb46148e13677fd0e632c71885bcc93f8c01864\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:12:13Z\\\",\\\"message\\\":\\\"r/kube-controller-manager-crc in node crc\\\\nI1205 20:12:13.100315 6323 services_controller.go:452] Built service openshift-service-ca-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1205 20:12:13.100853 6323 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1205 20:12:13.100859 6323 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1205 20:12:13.100847 6323 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-8v9t9 after 0 failed attempt(s)\\\\nI1205 20:12:13.100867 6323 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-8v9t9\\\\nI1205 20:12:13.100858 6323 services_controller.go:453] Built service openshift-service-ca-operator/metrics template LB for network=default: []services.LB{}\\\\nI1205 20:12:13.100879 6323 services_controller.go:454] Service openshift-service-ca-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1205 20:12:13.100887 6323 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dsvd6_openshift-ovn-kubernetes(55fbdf03-712c-4abc-9847-225fe63052e3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dsvd6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:18Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.857904 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89575271b3bd2ad2c040d9e6b00f221610af97e8dd8b120a2956b75f8bbdd237\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:18Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.869858 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:18Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.883780 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gfzvv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5e16893370ed8543ecf33c4b2373bda50a424810470178390a553f2755323d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6gph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gfzvv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:18Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.891342 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.891382 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.891393 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.891408 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.891419 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:18Z","lastTransitionTime":"2025-12-05T20:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.993768 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.993818 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.993835 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.993857 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:18 crc kubenswrapper[4904]: I1205 20:12:18.993876 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:18Z","lastTransitionTime":"2025-12-05T20:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.035562 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.037630 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a972523cbe1d769fb13c365f030eab38907c5f877d667e034c62c5f3a0e43316"} Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.037971 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.058609 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89575271b3bd2ad2c040d9e6b00f221610af97e8dd8b120a2956b75f8bbdd237\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:19Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.071985 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aa73309acebd7dcba36d2990a8cf1f6ef1fbf78086a1455b158945063e1da3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b0deb0769ed0ae3221e161c0aab68192a5d5d1f511ccfaf8c6ba2f01334de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:19Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.083772 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0577f397a9f6d93347cbc4c8693307a1df70636365bb481d910f29aa70a3147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:19Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.095417 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-67k68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cf77c02028027dc6d07f225e1f2352cd58a2a96280315e23f43b50e7c719cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzqtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-67k68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:19Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.096591 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.096645 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.096662 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.096684 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.096701 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:19Z","lastTransitionTime":"2025-12-05T20:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.121533 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55fbdf03-712c-4abc-9847-225fe63052e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servic
eaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://754660b165d5db2e99f7a7f53bb46148e13677fd0e632c71885bcc93f8c01864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://754660b165d5db2e99f7a7f53bb46148e13677fd0e632c71885bcc93f8c01864\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:12:13Z\\\",\\\"message\\\":\\\"r/kube-controller-manager-crc in node crc\\\\nI1205 20:12:13.100315 6323 services_controller.go:452] Built service openshift-service-ca-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1205 20:12:13.100853 6323 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1205 20:12:13.100859 6323 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1205 20:12:13.100847 6323 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-8v9t9 after 0 failed attempt(s)\\\\nI1205 20:12:13.100867 6323 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-8v9t9\\\\nI1205 20:12:13.100858 6323 services_controller.go:453] Built service openshift-service-ca-operator/metrics template LB for network=default: []services.LB{}\\\\nI1205 20:12:13.100879 6323 services_controller.go:454] Service openshift-service-ca-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1205 20:12:13.100887 6323 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-dsvd6_openshift-ovn-kubernetes(55fbdf03-712c-4abc-9847-225fe63052e3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recurs
iveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dsvd6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:19Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.134618 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:19Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.149636 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gfzvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5e16893370ed8543ecf33c4b2373bda50a424810470178390a553f2755323d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6gph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gfzvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:19Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.162312 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:19Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.174050 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8v9t9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78ec080e-d24e-458b-8622-465dd74773a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c927c7ee8c52a2f1bcca785070514c72b8eafa9cd7d4f58d77fd80f1dee6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj68j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8v9t9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-05T20:12:19Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.189406 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cc24b64-e25f-4b55-9123-295388685e7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa5a898deb51018e4a1196537853e1d8fda8121a2b87d6d3ebb4ab04ad6a0507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08368507f1a5c9ee1b93c9a2b111939cbe36a99f87758d0de78b146309871438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffd2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:19Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.199198 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.199244 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.199259 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.199278 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.199294 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:19Z","lastTransitionTime":"2025-12-05T20:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.204783 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91-metrics-certs\") pod \"network-metrics-daemon-d8xkk\" (UID: \"fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91\") " pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:12:19 crc kubenswrapper[4904]: E1205 20:12:19.204943 4904 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:12:19 crc kubenswrapper[4904]: E1205 20:12:19.204997 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91-metrics-certs podName:fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91 nodeName:}" failed. No retries permitted until 2025-12-05 20:12:23.204980469 +0000 UTC m=+42.016196588 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91-metrics-certs") pod "network-metrics-daemon-d8xkk" (UID: "fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.211770 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-52vmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://140e572b1d15ec9aec68ea9d5b436e7ec2a1e886ed70b12c5ed011a12aa7646a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072
\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-52vmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:19Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.226049 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d8xkk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7fjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7fjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d8xkk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:19Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.242098 4904 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"519e4f29-5762-4a86-9f31-eae681598fd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a972523cbe1d769fb13c365f030eab38907c5f877d667e034c62c5f3a0e43316\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"(now=2025-12-05 20:12:01.575139975 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575358 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764965521\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764965521\\\\\\\\\\\\\\\" (2025-12-05 19:12:01 +0000 UTC to 2026-12-05 19:12:01 +0000 UTC (now=2025-12-05 20:12:01.57533306 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575383 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 20:12:01.575413 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 20:12:01.575451 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575485 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575526 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-341598907/tls.crt::/tmp/serving-cert-341598907/tls.key\\\\\\\"\\\\nI1205 20:12:01.575666 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1205 20:12:01.575882 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 20:12:01.575895 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 20:12:01.576170 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 20:12:01.576169 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 20:12:01.576203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 20:12:01.576188 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:19Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.259406 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ba4034f-d40d-4c5d-a52b-c9895a8a7f0b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://073f99fe786948da31c584468c3562984559623b590ba77c7925b79a1bb6d973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fb5cbfee6c3da72b94b14ed486036e65308444c961b583e691cc89e604a11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec935cce3c02816ac1f6162ca3a1701cd06dd930d3024ff958a4d92cf1dc0821\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736f67bbeab4ebc7e144d6d4af34487d18c3c8362425e45a9e0969b61d896666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:19Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.273830 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
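Every "Failed to update status for pod" entry above fails for the same reason, spelled out at the end of the error text: the serving certificate for the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 expired on 2025-08-24T17:21:41Z, while the node's clock reads 2025-12-05. Below is a minimal Go sketch, not part of the log or of any OpenShift tooling, that dials the endpoint from the log line and prints the presented certificate's validity window so the expiry can be confirmed by hand; everything beyond the address is an illustrative assumption.

```go
// probewebhookcert.go -- inspect the webhook serving certificate the kubelet
// keeps rejecting. InsecureSkipVerify is deliberate: the point is to read an
// expired certificate, not to validate it.
package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	// Address taken verbatim from the kubelet error above.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		log.Fatalf("dial webhook endpoint: %v", err)
	}
	defer conn.Close()

	state := conn.ConnectionState()
	if len(state.PeerCertificates) == 0 {
		log.Fatal("no peer certificate presented")
	}
	cert := state.PeerCertificates[0]
	fmt.Printf("subject:   %s\n", cert.Subject)
	fmt.Printf("notBefore: %s\n", cert.NotBefore.Format(time.RFC3339))
	fmt.Printf("notAfter:  %s\n", cert.NotAfter.Format(time.RFC3339))
	if time.Now().After(cert.NotAfter) {
		fmt.Println("expired; consistent with the kubelet's x509 error")
	}
}
```

Run while the webhook pod is listening and notAfter should print well in the past, matching the "certificate has expired or is not yet valid" verdict repeated in the entries above and below.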
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:19Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.288221 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t96qs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5f4b5ef-d8d0-4591-a19c-a8347f74e833\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6624b0a035b3190da055f2ec82cb45b59268d035933912d9b87a20fd4c1cf128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9wkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b9a9e4321317cf95ec97bae86da81152ecba3d49e499d83e113d89e8b997247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2
099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9wkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t96qs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:19Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.302235 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.302285 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.302302 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.302324 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.302340 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:19Z","lastTransitionTime":"2025-12-05T20:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.405029 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.405644 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.405718 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.405811 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.405913 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:19Z","lastTransitionTime":"2025-12-05T20:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.509206 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.509404 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.509513 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.509655 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.509736 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:19Z","lastTransitionTime":"2025-12-05T20:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.611924 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.612196 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.612330 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.612421 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.612516 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:19Z","lastTransitionTime":"2025-12-05T20:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.714992 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.715316 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.715456 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.715574 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.715694 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:19Z","lastTransitionTime":"2025-12-05T20:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.818966 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.819034 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.819100 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.819133 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.819159 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:19Z","lastTransitionTime":"2025-12-05T20:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.921679 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.921745 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.921767 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.921793 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:19 crc kubenswrapper[4904]: I1205 20:12:19.921808 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:19Z","lastTransitionTime":"2025-12-05T20:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.023994 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.024041 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.024052 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.024095 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.024106 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:20Z","lastTransitionTime":"2025-12-05T20:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.125912 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.125945 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.125954 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.125968 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.125978 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:20Z","lastTransitionTime":"2025-12-05T20:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.228770 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.228829 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.228851 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.228882 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.228905 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:20Z","lastTransitionTime":"2025-12-05T20:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.331729 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.331806 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.331823 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.331853 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.331873 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:20Z","lastTransitionTime":"2025-12-05T20:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.435311 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.435380 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.435405 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.435434 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.435456 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:20Z","lastTransitionTime":"2025-12-05T20:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.537721 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.537778 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.537787 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.537801 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.537811 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:20Z","lastTransitionTime":"2025-12-05T20:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.640778 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.640853 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.640878 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.640911 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.640934 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:20Z","lastTransitionTime":"2025-12-05T20:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.681236 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.681320 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.681334 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.681236 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:20 crc kubenswrapper[4904]: E1205 20:12:20.681501 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:12:20 crc kubenswrapper[4904]: E1205 20:12:20.681619 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:12:20 crc kubenswrapper[4904]: E1205 20:12:20.681787 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
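In parallel with the webhook failures, the kubelet marks the node NotReady because the container runtime reports NetworkReady=false: no CNI configuration file exists under /etc/kubernetes/cni/net.d/ yet. A hypothetical spot check follows, assuming the conventional .conf/.conflist/.json names for CNI network configs; the directory path is the only value taken from the log.

```go
// cnicheck.go -- list the CNI network configs a runtime would load, loosely
// mirroring the check behind the NetworkPluginNotReady message above.
package main

import (
	"fmt"
	"log"
	"os"
	"strings"
)

func main() {
	const cniDir = "/etc/kubernetes/cni/net.d" // path from the kubelet message
	entries, err := os.ReadDir(cniDir)
	if err != nil {
		log.Fatalf("read %s: %v", cniDir, err)
	}
	found := false
	for _, e := range entries {
		name := e.Name()
		// .conf, .conflist and .json are the extensions CNI loaders
		// conventionally accept; treated as an assumption here.
		if strings.HasSuffix(name, ".conf") ||
			strings.HasSuffix(name, ".conflist") ||
			strings.HasSuffix(name, ".json") {
			fmt.Println("found CNI config:", name)
			found = true
		}
	}
	if !found {
		fmt.Println("no CNI configuration file; matches NetworkReady=false")
	}
}
```

An empty result is expected until the network provider (here OVN-Kubernetes) writes its config, at which point the NodeNotReady loop in the surrounding entries should stop.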
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:12:20 crc kubenswrapper[4904]: E1205 20:12:20.681943 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91" Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.743658 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.743722 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.743739 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.743761 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.743777 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:20Z","lastTransitionTime":"2025-12-05T20:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.847767 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.847847 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.847866 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.847896 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.847920 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:20Z","lastTransitionTime":"2025-12-05T20:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.952049 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.952181 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.952206 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.952240 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:20 crc kubenswrapper[4904]: I1205 20:12:20.952269 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:20Z","lastTransitionTime":"2025-12-05T20:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.055113 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.055189 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.055216 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.055303 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.055340 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:21Z","lastTransitionTime":"2025-12-05T20:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.157890 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.157952 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.157969 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.157992 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.158012 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:21Z","lastTransitionTime":"2025-12-05T20:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.261009 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.261119 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.261143 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.261173 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.261195 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:21Z","lastTransitionTime":"2025-12-05T20:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.363839 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.363900 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.363921 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.363949 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.363967 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:21Z","lastTransitionTime":"2025-12-05T20:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.467472 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.467538 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.467548 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.467568 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.467580 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:21Z","lastTransitionTime":"2025-12-05T20:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.570448 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.570513 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.570534 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.570560 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.570581 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:21Z","lastTransitionTime":"2025-12-05T20:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.673681 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.673745 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.673763 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.673790 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.673806 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:21Z","lastTransitionTime":"2025-12-05T20:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.703779 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:21Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.722302 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t96qs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5f4b5ef-d8d0-4591-a19c-a8347f74e833\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6624b0a035b3190da055f2ec82cb45b59268d035933912d9b87a20fd4c1cf128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9wkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b9a9e4321317cf95ec97bae86da81152ecba3d49e499d83e113d89e8b997247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9wkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t96qs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:21Z is after 2025-08-24T17:21:41Z" Dec 05 
20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.747424 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"519e4f29-5762-4a86-9f31-eae681598fd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a972523cbe1d769fb13c365f030eab38907c5f877d667e034c62c5f3a0e43316\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"(now=2025-12-05 20:12:01.575139975 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575358 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764965521\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764965521\\\\\\\\\\\\\\\" (2025-12-05 19:12:01 +0000 UTC to 2026-12-05 19:12:01 +0000 UTC (now=2025-12-05 20:12:01.57533306 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575383 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 20:12:01.575413 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 20:12:01.575451 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575485 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575526 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-341598907/tls.crt::/tmp/serving-cert-341598907/tls.key\\\\\\\"\\\\nI1205 20:12:01.575666 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1205 20:12:01.575882 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 20:12:01.575895 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 20:12:01.576170 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 20:12:01.576169 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 20:12:01.576203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 20:12:01.576188 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:21Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.766326 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ba4034f-d40d-4c5d-a52b-c9895a8a7f0b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://073f99fe786948da31c584468c3562984559623b590ba77c7925b79a1bb6d973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fb5cbfee6c3da72b94b14ed486036e65308444c961b583e691cc89e604a11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec935cce3c02816ac1f6162ca3a1701cd06dd930d3024ff958a4d92cf1dc0821\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736f67bbeab4ebc7e144d6d4af34487d18c3c8362425e45a9e0969b61d896666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:21Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.776479 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.776547 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.776564 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.776589 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.776607 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:21Z","lastTransitionTime":"2025-12-05T20:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.798388 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55fbdf03-712c-4abc-9847-225fe63052e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://754660b165d5db2e99f7a7f53bb46148e13677fd0e632c71885bcc93f8c01864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://754660b165d5db2e99f7a7f53bb46148e13677fd0e632c71885bcc93f8c01864\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:12:13Z\\\",\\\"message\\\":\\\"r/kube-controller-manager-crc in node crc\\\\nI1205 20:12:13.100315 6323 services_controller.go:452] Built service openshift-service-ca-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1205 20:12:13.100853 6323 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1205 20:12:13.100859 6323 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1205 20:12:13.100847 6323 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-8v9t9 after 0 failed attempt(s)\\\\nI1205 20:12:13.100867 6323 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-8v9t9\\\\nI1205 20:12:13.100858 6323 services_controller.go:453] Built service openshift-service-ca-operator/metrics template LB for network=default: []services.LB{}\\\\nI1205 20:12:13.100879 6323 services_controller.go:454] Service openshift-service-ca-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1205 20:12:13.100887 6323 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dsvd6_openshift-ovn-kubernetes(55fbdf03-712c-4abc-9847-225fe63052e3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dsvd6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:21Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.813158 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89575271b3bd2ad2c040d9e6b00f221610af97e8dd8b120a2956b75f8bbdd237\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:21Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.831676 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aa73309acebd7dcba36d2990a8cf1f6ef1fbf78086a1455b158945063e1da3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b0deb0769ed0ae3221e161c0aab68192a5d5d1f511ccfaf8c6ba2f01334de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:21Z is after 
2025-08-24T17:21:41Z" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.847139 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0577f397a9f6d93347cbc4c8693307a1df70636365bb481d910f29aa70a3147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:21Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.862732 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-67k68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cf77c02028027dc6d07f225e1f2352cd58a2a96280315e23f43b50e7c719cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzqtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-67k68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:21Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.879397 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.879626 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:21Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.879864 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.880028 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.880227 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.880389 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:21Z","lastTransitionTime":"2025-12-05T20:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.897645 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gfzvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5e16893370ed8543ecf33c4b2373bda50a424810470178390a553f2755323d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6gph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gfzvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:21Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.912382 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d8xkk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7fjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7fjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d8xkk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:21Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.935265 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:21Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.952900 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8v9t9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78ec080e-d24e-458b-8622-465dd74773a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c927c7ee8c52a2f1bcca785070514c72b8eafa9cd7d4f58d77fd80f1dee6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj68j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8v9t9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:21Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.972024 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cc24b64-e25f-4b55-9123-295388685e7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa5a898deb51018e4a1196537853e1d8fda8121a2b87d6d3ebb4ab04ad6a0507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08368507f1a5c9ee1b93c9a2b111939cbe36a99f87758d0de78b146309871438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffd2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:21Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.982918 4904 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.982982 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.983002 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.983027 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.983047 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:21Z","lastTransitionTime":"2025-12-05T20:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:21 crc kubenswrapper[4904]: I1205 20:12:21.998242 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-52vmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://140e572b1d15ec9aec68ea9d5b436e7ec2a1e886ed70b12c5ed011a12aa7646a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2
c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/
secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:07Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-52vmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:21Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.085686 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.085775 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.085794 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.085816 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.085835 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:22Z","lastTransitionTime":"2025-12-05T20:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.188445 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.188487 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.188496 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.188512 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.188523 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:22Z","lastTransitionTime":"2025-12-05T20:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.292359 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.292423 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.292445 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.292475 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.292497 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:22Z","lastTransitionTime":"2025-12-05T20:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.396145 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.396233 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.396256 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.396281 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.396299 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:22Z","lastTransitionTime":"2025-12-05T20:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.499310 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.499344 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.499355 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.499371 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.499382 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:22Z","lastTransitionTime":"2025-12-05T20:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.601967 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.602208 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.602238 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.602267 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.602286 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:22Z","lastTransitionTime":"2025-12-05T20:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.680333 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.680374 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.680364 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:12:22 crc kubenswrapper[4904]: E1205 20:12:22.680536 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.680579 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:22 crc kubenswrapper[4904]: E1205 20:12:22.680697 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:12:22 crc kubenswrapper[4904]: E1205 20:12:22.680819 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:12:22 crc kubenswrapper[4904]: E1205 20:12:22.680996 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91" Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.705141 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.705200 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.705223 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.705254 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.705276 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:22Z","lastTransitionTime":"2025-12-05T20:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.807906 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.807967 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.807991 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.808023 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.808044 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:22Z","lastTransitionTime":"2025-12-05T20:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.910740 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.910801 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.910834 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.910859 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:22 crc kubenswrapper[4904]: I1205 20:12:22.911007 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:22Z","lastTransitionTime":"2025-12-05T20:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.014215 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.014280 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.014307 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.014338 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.014359 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:23Z","lastTransitionTime":"2025-12-05T20:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.117271 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.117336 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.117354 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.117378 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.117394 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:23Z","lastTransitionTime":"2025-12-05T20:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.220345 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.220420 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.220440 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.220486 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.220518 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:23Z","lastTransitionTime":"2025-12-05T20:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.246440 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91-metrics-certs\") pod \"network-metrics-daemon-d8xkk\" (UID: \"fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91\") " pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:12:23 crc kubenswrapper[4904]: E1205 20:12:23.246773 4904 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:12:23 crc kubenswrapper[4904]: E1205 20:12:23.246903 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91-metrics-certs podName:fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91 nodeName:}" failed. No retries permitted until 2025-12-05 20:12:31.246870704 +0000 UTC m=+50.058086893 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91-metrics-certs") pod "network-metrics-daemon-d8xkk" (UID: "fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.323141 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.323222 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.323240 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.323259 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.323274 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:23Z","lastTransitionTime":"2025-12-05T20:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.425691 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.425767 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.425808 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.425838 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.425860 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:23Z","lastTransitionTime":"2025-12-05T20:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
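The mount failure itself is a secondary symptom: the kubelet's secret store has not (re)registered openshift-multus/metrics-daemon-secret yet, so volume setup cannot proceed and the operation is re-queued with an exponentially growing delay (durationBeforeRetry 8s here, implying several earlier attempts). A generic sketch of that retry policy; the initial delay and cap below are assumed illustration values, not the kubelet's exact constants:

package main

import (
	"fmt"
	"time"
)

// nextRetryDelay doubles the wait after every consecutive failure,
// capped at max, in the spirit of the kubelet's durationBeforeRetry.
func nextRetryDelay(failures int, initial, max time.Duration) time.Duration {
	d := initial
	for i := 1; i < failures; i++ {
		d *= 2
		if d >= max {
			return max
		}
	}
	return d
}

func main() {
	// Assumed values for illustration: 500ms initial delay, 2m cap.
	for failures := 1; failures <= 8; failures++ {
		d := nextRetryDelay(failures, 500*time.Millisecond, 2*time.Minute)
		fmt.Printf("failure %d -> retry in %s\n", failures, d)
	}
}

With those assumed values, the fifth consecutive failure yields exactly the 8s delay recorded above; the backoff resets once the secret is registered and a mount succeeds.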
Has your network provider started?"} Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.528780 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.528861 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.528889 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.528952 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.528979 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:23Z","lastTransitionTime":"2025-12-05T20:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.632411 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.632478 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.632501 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.632530 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.632555 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:23Z","lastTransitionTime":"2025-12-05T20:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.735333 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.735401 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.735429 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.735461 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.735487 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:23Z","lastTransitionTime":"2025-12-05T20:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.838166 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.838225 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.838242 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.838273 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.838290 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:23Z","lastTransitionTime":"2025-12-05T20:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.941906 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.941957 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.941975 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.941998 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:23 crc kubenswrapper[4904]: I1205 20:12:23.942015 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:23Z","lastTransitionTime":"2025-12-05T20:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.045011 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.045147 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.045187 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.045218 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.045240 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:24Z","lastTransitionTime":"2025-12-05T20:12:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.148120 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.148195 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.148219 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.148242 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.148259 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:24Z","lastTransitionTime":"2025-12-05T20:12:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.250135 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.250180 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.250191 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.250228 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.250241 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:24Z","lastTransitionTime":"2025-12-05T20:12:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.353744 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.353809 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.353818 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.353832 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.353841 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:24Z","lastTransitionTime":"2025-12-05T20:12:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.456518 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.456586 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.456604 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.456628 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.456644 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:24Z","lastTransitionTime":"2025-12-05T20:12:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.559522 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.559588 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.559600 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.559616 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.559629 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:24Z","lastTransitionTime":"2025-12-05T20:12:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.663025 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.663125 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.663150 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.663182 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.663204 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:24Z","lastTransitionTime":"2025-12-05T20:12:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.680436 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.680504 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:24 crc kubenswrapper[4904]: E1205 20:12:24.680656 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.680687 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.680742 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:24 crc kubenswrapper[4904]: E1205 20:12:24.680876 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:12:24 crc kubenswrapper[4904]: E1205 20:12:24.680983 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91" Dec 05 20:12:24 crc kubenswrapper[4904]: E1205 20:12:24.681148 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.766358 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.766428 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.766451 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.766481 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.766503 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:24Z","lastTransitionTime":"2025-12-05T20:12:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.839242 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.839339 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.839358 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.839383 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.839404 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:24Z","lastTransitionTime":"2025-12-05T20:12:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:24 crc kubenswrapper[4904]: E1205 20:12:24.859498 4904 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"54b04eb7-ddef-4ce3-9daa-6d051611390c\\\",\\\"systemUUID\\\":\\\"cfb4a8d1-bf0d-4f7a-95e4-63a42b6fb559\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:24Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.864798 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.865019 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.865044 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.865097 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.865115 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:24Z","lastTransitionTime":"2025-12-05T20:12:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:24 crc kubenswrapper[4904]: E1205 20:12:24.885962 4904 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"54b04eb7-ddef-4ce3-9daa-6d051611390c\\\",\\\"systemUUID\\\":\\\"cfb4a8d1-bf0d-4f7a-95e4-63a42b6fb559\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:24Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.892630 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.892701 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.892733 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.892761 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.892779 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:24Z","lastTransitionTime":"2025-12-05T20:12:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:24 crc kubenswrapper[4904]: E1205 20:12:24.914854 4904 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"54b04eb7-ddef-4ce3-9daa-6d051611390c\\\",\\\"systemUUID\\\":\\\"cfb4a8d1-bf0d-4f7a-95e4-63a42b6fb559\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:24Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.920673 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.920923 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.921097 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.921264 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.921522 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:24Z","lastTransitionTime":"2025-12-05T20:12:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:24 crc kubenswrapper[4904]: E1205 20:12:24.940731 4904 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"54b04eb7-ddef-4ce3-9daa-6d051611390c\\\",\\\"systemUUID\\\":\\\"cfb4a8d1-bf0d-4f7a-95e4-63a42b6fb559\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:24Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.946598 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.946669 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.946686 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.946710 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.946728 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:24Z","lastTransitionTime":"2025-12-05T20:12:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:24 crc kubenswrapper[4904]: E1205 20:12:24.965881 4904 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"54b04eb7-ddef-4ce3-9daa-6d051611390c\\\",\\\"systemUUID\\\":\\\"cfb4a8d1-bf0d-4f7a-95e4-63a42b6fb559\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:24Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:24 crc kubenswrapper[4904]: E1205 20:12:24.966147 4904 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.969628 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.969693 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.969715 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.969740 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:24 crc kubenswrapper[4904]: I1205 20:12:24.969757 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:24Z","lastTransitionTime":"2025-12-05T20:12:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:25 crc kubenswrapper[4904]: I1205 20:12:25.072805 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:25 crc kubenswrapper[4904]: I1205 20:12:25.072849 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:25 crc kubenswrapper[4904]: I1205 20:12:25.072858 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:25 crc kubenswrapper[4904]: I1205 20:12:25.072871 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:25 crc kubenswrapper[4904]: I1205 20:12:25.072883 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:25Z","lastTransitionTime":"2025-12-05T20:12:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:25 crc kubenswrapper[4904]: I1205 20:12:25.176425 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:25 crc kubenswrapper[4904]: I1205 20:12:25.176494 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:25 crc kubenswrapper[4904]: I1205 20:12:25.176514 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:25 crc kubenswrapper[4904]: I1205 20:12:25.176540 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:25 crc kubenswrapper[4904]: I1205 20:12:25.176558 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:25Z","lastTransitionTime":"2025-12-05T20:12:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:25 crc kubenswrapper[4904]: I1205 20:12:25.279513 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:25 crc kubenswrapper[4904]: I1205 20:12:25.279583 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:25 crc kubenswrapper[4904]: I1205 20:12:25.279601 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:25 crc kubenswrapper[4904]: I1205 20:12:25.279623 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:25 crc kubenswrapper[4904]: I1205 20:12:25.279640 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:25Z","lastTransitionTime":"2025-12-05T20:12:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:25 crc kubenswrapper[4904]: I1205 20:12:25.382516 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:25 crc kubenswrapper[4904]: I1205 20:12:25.382563 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:25 crc kubenswrapper[4904]: I1205 20:12:25.382574 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:25 crc kubenswrapper[4904]: I1205 20:12:25.382592 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:25 crc kubenswrapper[4904]: I1205 20:12:25.382604 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:25Z","lastTransitionTime":"2025-12-05T20:12:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:25 crc kubenswrapper[4904]: I1205 20:12:25.485423 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:25 crc kubenswrapper[4904]: I1205 20:12:25.485461 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:25 crc kubenswrapper[4904]: I1205 20:12:25.485474 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:25 crc kubenswrapper[4904]: I1205 20:12:25.485493 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:25 crc kubenswrapper[4904]: I1205 20:12:25.485505 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:25Z","lastTransitionTime":"2025-12-05T20:12:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:25 crc kubenswrapper[4904]: I1205 20:12:25.588365 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:25 crc kubenswrapper[4904]: I1205 20:12:25.588424 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:25 crc kubenswrapper[4904]: I1205 20:12:25.588444 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:25 crc kubenswrapper[4904]: I1205 20:12:25.588471 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:25 crc kubenswrapper[4904]: I1205 20:12:25.588488 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:25Z","lastTransitionTime":"2025-12-05T20:12:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:25 crc kubenswrapper[4904]: I1205 20:12:25.691029 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:25 crc kubenswrapper[4904]: I1205 20:12:25.691096 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:25 crc kubenswrapper[4904]: I1205 20:12:25.691109 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:25 crc kubenswrapper[4904]: I1205 20:12:25.691124 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:25 crc kubenswrapper[4904]: I1205 20:12:25.691135 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:25Z","lastTransitionTime":"2025-12-05T20:12:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:25 crc kubenswrapper[4904]: I1205 20:12:25.795127 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:25 crc kubenswrapper[4904]: I1205 20:12:25.795169 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:25 crc kubenswrapper[4904]: I1205 20:12:25.795178 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:25 crc kubenswrapper[4904]: I1205 20:12:25.795191 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:25 crc kubenswrapper[4904]: I1205 20:12:25.795201 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:25Z","lastTransitionTime":"2025-12-05T20:12:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:25 crc kubenswrapper[4904]: I1205 20:12:25.898151 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:25 crc kubenswrapper[4904]: I1205 20:12:25.898191 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:25 crc kubenswrapper[4904]: I1205 20:12:25.898200 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:25 crc kubenswrapper[4904]: I1205 20:12:25.898214 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:25 crc kubenswrapper[4904]: I1205 20:12:25.898224 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:25Z","lastTransitionTime":"2025-12-05T20:12:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.000547 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.000598 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.000611 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.000628 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.000641 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:26Z","lastTransitionTime":"2025-12-05T20:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.102797 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.102835 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.102844 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.102865 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.102878 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:26Z","lastTransitionTime":"2025-12-05T20:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.205271 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.205308 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.205316 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.205333 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.205343 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:26Z","lastTransitionTime":"2025-12-05T20:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.307968 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.308097 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.308110 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.308133 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.308146 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:26Z","lastTransitionTime":"2025-12-05T20:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.411184 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.411244 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.411266 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.411295 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.411316 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:26Z","lastTransitionTime":"2025-12-05T20:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.514055 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.514195 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.514224 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.514260 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.514288 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:26Z","lastTransitionTime":"2025-12-05T20:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.617580 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.617617 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.617637 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.617651 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.617665 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:26Z","lastTransitionTime":"2025-12-05T20:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.680802 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.680807 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.680894 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.680940 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:26 crc kubenswrapper[4904]: E1205 20:12:26.681162 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:12:26 crc kubenswrapper[4904]: E1205 20:12:26.681313 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91" Dec 05 20:12:26 crc kubenswrapper[4904]: E1205 20:12:26.681409 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:12:26 crc kubenswrapper[4904]: E1205 20:12:26.681553 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.721706 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.721752 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.721768 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.721791 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.721807 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:26Z","lastTransitionTime":"2025-12-05T20:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.825703 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.825798 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.825822 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.825890 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.825913 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:26Z","lastTransitionTime":"2025-12-05T20:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.929460 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.929527 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.929547 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.929574 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:26 crc kubenswrapper[4904]: I1205 20:12:26.929595 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:26Z","lastTransitionTime":"2025-12-05T20:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.032818 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.032895 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.032919 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.032949 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.032974 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:27Z","lastTransitionTime":"2025-12-05T20:12:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.137689 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.137770 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.137790 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.137818 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.137840 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:27Z","lastTransitionTime":"2025-12-05T20:12:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.241318 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.241372 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.241386 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.241405 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.241420 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:27Z","lastTransitionTime":"2025-12-05T20:12:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.344383 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.344441 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.344455 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.344479 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.344497 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:27Z","lastTransitionTime":"2025-12-05T20:12:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.447747 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.447824 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.447848 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.447879 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.447902 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:27Z","lastTransitionTime":"2025-12-05T20:12:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.550242 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.550286 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.550298 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.550314 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.550325 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:27Z","lastTransitionTime":"2025-12-05T20:12:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.653814 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.653859 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.653870 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.653887 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.653899 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:27Z","lastTransitionTime":"2025-12-05T20:12:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.756700 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.756780 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.756801 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.756834 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.756856 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:27Z","lastTransitionTime":"2025-12-05T20:12:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.859936 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.860009 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.860031 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.860094 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.860116 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:27Z","lastTransitionTime":"2025-12-05T20:12:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.963133 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.963218 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.963242 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.963272 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:27 crc kubenswrapper[4904]: I1205 20:12:27.963293 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:27Z","lastTransitionTime":"2025-12-05T20:12:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.065962 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.066023 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.066039 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.066611 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.066689 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:28Z","lastTransitionTime":"2025-12-05T20:12:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.170224 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.170284 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.170307 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.170335 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.170358 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:28Z","lastTransitionTime":"2025-12-05T20:12:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.272683 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.272766 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.272792 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.272825 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.272847 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:28Z","lastTransitionTime":"2025-12-05T20:12:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.375282 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.375336 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.375359 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.375387 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.375408 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:28Z","lastTransitionTime":"2025-12-05T20:12:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.478472 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.478533 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.478554 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.478585 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.478608 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:28Z","lastTransitionTime":"2025-12-05T20:12:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.581547 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.581623 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.581640 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.581664 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.581684 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:28Z","lastTransitionTime":"2025-12-05T20:12:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.680896 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.681006 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:28 crc kubenswrapper[4904]: E1205 20:12:28.681195 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.681227 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:28 crc kubenswrapper[4904]: E1205 20:12:28.681450 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:12:28 crc kubenswrapper[4904]: E1205 20:12:28.681581 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.680928 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:28 crc kubenswrapper[4904]: E1205 20:12:28.682081 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.683699 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.683856 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.683979 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.684079 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.684145 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:28Z","lastTransitionTime":"2025-12-05T20:12:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.787506 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.787835 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.788028 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.788245 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.788377 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:28Z","lastTransitionTime":"2025-12-05T20:12:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.892144 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.892216 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.892241 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.892271 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.892296 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:28Z","lastTransitionTime":"2025-12-05T20:12:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.919180 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.933593 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.935202 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d8xkk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7fjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7fjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d8xkk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:28Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.956822 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:28Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.971568 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8v9t9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78ec080e-d24e-458b-8622-465dd74773a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c927c7ee8c52a2f1bcca785070514c72b8eafa9cd7d4f58d77fd80f1dee6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj68j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8v9t9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-05T20:12:28Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.989520 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cc24b64-e25f-4b55-9123-295388685e7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa5a898deb51018e4a1196537853e1d8fda8121a2b87d6d3ebb4ab04ad6a0507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08368507f1a5c9ee1b93c9a2b111939cbe36a99f87758d0de78b146309871438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffd2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:28Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.995230 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.995287 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.995305 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.995330 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:28 crc kubenswrapper[4904]: I1205 20:12:28.995347 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:28Z","lastTransitionTime":"2025-12-05T20:12:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.011261 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-52vmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://140e572b1d15ec9aec68ea9d5b436e7ec2a1e886ed70b12c5ed011a12aa7646a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f774
78481d5afc221f8d831870343f0611bc440\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\"
,\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\
\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-52vmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.027495 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.039227 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t96qs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5f4b5ef-d8d0-4591-a19c-a8347f74e833\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6624b0a035b3190da055f2ec82cb45b59268d035933912d9b87a20fd4c1cf128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9wkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b9a9e4321317cf95ec97bae86da81152ecba3d49e499d83e113d89e8b997247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2
099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9wkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t96qs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.053042 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"519e4f29-5762-4a86-9f31-eae681598fd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a972523cbe1d769fb13c365f030eab38907c5f877d667e034c62c5f3a0e43316\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"(now=2025-12-05 20:12:01.575139975 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575358 1 
named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764965521\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764965521\\\\\\\\\\\\\\\" (2025-12-05 19:12:01 +0000 UTC to 2026-12-05 19:12:01 +0000 UTC (now=2025-12-05 20:12:01.57533306 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575383 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 20:12:01.575413 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 20:12:01.575451 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575485 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575526 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-341598907/tls.crt::/tmp/serving-cert-341598907/tls.key\\\\\\\"\\\\nI1205 20:12:01.575666 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1205 20:12:01.575882 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 20:12:01.575895 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 20:12:01.576170 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 20:12:01.576169 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 20:12:01.576203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 20:12:01.576188 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.069032 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ba4034f-d40d-4c5d-a52b-c9895a8a7f0b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://073f99fe786948da31c584468c3562984559623b590ba77c7925b79a1bb6d973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fb5cbfee6c3da72b94b14ed486036e65308444c961b583e691cc89e604a11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec935cce3c02816ac1f6162ca3a1701cd06dd930d3024ff958a4d92cf1dc0821\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736f67bbeab4ebc7e144d6d4af34487d18c3c8362425e45a9e0969b61d896666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.090367 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55fbdf03-712c-4abc-9847-225fe63052e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://754660b165d5db2e99f7a7f53bb46148e13677fd
0e632c71885bcc93f8c01864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://754660b165d5db2e99f7a7f53bb46148e13677fd0e632c71885bcc93f8c01864\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:12:13Z\\\",\\\"message\\\":\\\"r/kube-controller-manager-crc in node crc\\\\nI1205 20:12:13.100315 6323 services_controller.go:452] Built service openshift-service-ca-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1205 20:12:13.100853 6323 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1205 20:12:13.100859 6323 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1205 20:12:13.100847 6323 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-8v9t9 after 0 failed attempt(s)\\\\nI1205 20:12:13.100867 6323 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-8v9t9\\\\nI1205 20:12:13.100858 6323 services_controller.go:453] Built service openshift-service-ca-operator/metrics template LB for network=default: []services.LB{}\\\\nI1205 20:12:13.100879 6323 services_controller.go:454] Service openshift-service-ca-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1205 20:12:13.100887 6323 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dsvd6_openshift-ovn-kubernetes(55fbdf03-712c-4abc-9847-225fe63052e3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dsvd6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.097694 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.097738 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.097747 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.097760 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.097770 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:29Z","lastTransitionTime":"2025-12-05T20:12:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.108410 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89575271b3bd2ad2c040d9e6b00f221610af97e8dd8b120a2956b75f8bbdd237\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.126501 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aa73309acebd7dcba36d2990a8cf1f6ef1fbf78086a1455b158945063e1da3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b0deb0769ed0ae3221e161c0aab68192a5d5d1f511ccfaf8c6ba2f01334de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.141889 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0577f397a9f6d93347cbc4c8693307a1df70636365bb481d910f29aa70a3147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.156943 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-67k68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cf77c02028027dc6d07f225e1f2352cd58a2a96280315e23f43b50e7c719cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzqtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-67k68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.173841 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.193264 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gfzvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5e16893370ed8543ecf33c4b2373bda50a424810470178390a553f2755323d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\
\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6gph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gfzvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.200461 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.200740 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.200815 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.200989 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.201099 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:29Z","lastTransitionTime":"2025-12-05T20:12:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.303951 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.304054 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.304113 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.304143 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.304168 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:29Z","lastTransitionTime":"2025-12-05T20:12:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.406648 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.406686 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.406697 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.406711 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.406722 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:29Z","lastTransitionTime":"2025-12-05T20:12:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.508943 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.508974 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.508983 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.508997 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.509009 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:29Z","lastTransitionTime":"2025-12-05T20:12:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.611355 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.611399 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.611411 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.611426 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.611438 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:29Z","lastTransitionTime":"2025-12-05T20:12:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.714795 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.714853 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.714865 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.714885 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.714897 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:29Z","lastTransitionTime":"2025-12-05T20:12:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.818036 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.818137 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.818160 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.818187 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.818207 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:29Z","lastTransitionTime":"2025-12-05T20:12:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.921415 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.921464 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.921475 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.921491 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:29 crc kubenswrapper[4904]: I1205 20:12:29.921502 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:29Z","lastTransitionTime":"2025-12-05T20:12:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.024210 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.024263 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.024273 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.024295 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.024310 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:30Z","lastTransitionTime":"2025-12-05T20:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.127453 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.127538 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.127585 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.127609 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.127626 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:30Z","lastTransitionTime":"2025-12-05T20:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.232571 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.232652 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.232667 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.232683 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.232715 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:30Z","lastTransitionTime":"2025-12-05T20:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.335886 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.335948 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.335970 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.336002 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.336025 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:30Z","lastTransitionTime":"2025-12-05T20:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.438977 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.439082 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.439110 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.439143 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.439167 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:30Z","lastTransitionTime":"2025-12-05T20:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.542359 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.542405 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.542421 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.542436 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.542445 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:30Z","lastTransitionTime":"2025-12-05T20:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.644632 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.644662 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.644680 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.644692 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.644701 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:30Z","lastTransitionTime":"2025-12-05T20:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.681157 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.681156 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.681227 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.681229 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:30 crc kubenswrapper[4904]: E1205 20:12:30.681294 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:12:30 crc kubenswrapper[4904]: E1205 20:12:30.681420 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:12:30 crc kubenswrapper[4904]: E1205 20:12:30.681464 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91" Dec 05 20:12:30 crc kubenswrapper[4904]: E1205 20:12:30.681507 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.747293 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.747344 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.747354 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.747368 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.747378 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:30Z","lastTransitionTime":"2025-12-05T20:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.850080 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.850129 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.850139 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.850155 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.850167 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:30Z","lastTransitionTime":"2025-12-05T20:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.953201 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.953369 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.953407 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.953434 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:30 crc kubenswrapper[4904]: I1205 20:12:30.953452 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:30Z","lastTransitionTime":"2025-12-05T20:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.056320 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.056353 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.056377 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.056389 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.056401 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:31Z","lastTransitionTime":"2025-12-05T20:12:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.159397 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.159464 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.159483 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.159508 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.159529 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:31Z","lastTransitionTime":"2025-12-05T20:12:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.262491 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.262545 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.262556 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.262572 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.262582 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:31Z","lastTransitionTime":"2025-12-05T20:12:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.337239 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91-metrics-certs\") pod \"network-metrics-daemon-d8xkk\" (UID: \"fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91\") " pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:12:31 crc kubenswrapper[4904]: E1205 20:12:31.337419 4904 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:12:31 crc kubenswrapper[4904]: E1205 20:12:31.337536 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91-metrics-certs podName:fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91 nodeName:}" failed. No retries permitted until 2025-12-05 20:12:47.337504471 +0000 UTC m=+66.148720630 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91-metrics-certs") pod "network-metrics-daemon-d8xkk" (UID: "fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.365512 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.365562 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.365574 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.365591 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.365602 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:31Z","lastTransitionTime":"2025-12-05T20:12:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.467756 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.467795 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.467808 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.467822 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.467832 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:31Z","lastTransitionTime":"2025-12-05T20:12:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.569827 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.569856 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.569863 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.569878 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.569895 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:31Z","lastTransitionTime":"2025-12-05T20:12:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.671630 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.671684 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.671694 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.671712 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.671724 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:31Z","lastTransitionTime":"2025-12-05T20:12:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.694321 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.706277 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gfzvv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5e16893370ed8543ecf33c4b2373bda50a424810470178390a553f2755323d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6gph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gfzvv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.720610 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.733430 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8v9t9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78ec080e-d24e-458b-8622-465dd74773a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c927c7ee8c52a2f1bcca785070514c72b8eafa9cd7d4f58d77fd80f1dee6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj68j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8v9t9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.749118 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cc24b64-e25f-4b55-9123-295388685e7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa5a898deb51018e4a1196537853e1d8fda8121a2b87d6d3ebb4ab04ad6a0507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08368507f1a5c9ee1b93c9a2b111939cbe36a99f87758d0de78b146309871438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffd2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.762826 4904 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-52vmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://140e572b1d15ec9aec68ea9d5b436e7ec2a1e886ed70b12c5ed011a12aa7646a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-52vmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.774266 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.774314 4904 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.774335 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.774361 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.774380 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:31Z","lastTransitionTime":"2025-12-05T20:12:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.775377 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d8xkk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7fjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7fjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d8xkk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.789896 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf1f4c63-efa9-43dd-9fd4-56f617de9b74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dc00b34dfd0d80ded26bfcc542e84550c50fe8dfabb2eab1f8b39c0402930ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbeab9029a6d82f443afa458697942e9ce20465077f1adefc0123dfab2ea76a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22659d9c71fc60ad0348f1031cb52b72af9f93c69c11c04c140aece5e90efb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fce601ec5b7e54b14668b59d90721a107f392ed41a54f1a5b0446d48fbdb50f\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fce601ec5b7e54b14668b59d90721a107f392ed41a54f1a5b0446d48fbdb50f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.809291 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"519e4f29-5762-4a86-9f31-eae681598fd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a972523cbe1d769fb13c365f030eab38907c5f877d667e034c62c5f3a0e43316\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"(now=2025-12-05 20:12:01.575139975 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575358 1 
named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764965521\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764965521\\\\\\\\\\\\\\\" (2025-12-05 19:12:01 +0000 UTC to 2026-12-05 19:12:01 +0000 UTC (now=2025-12-05 20:12:01.57533306 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575383 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 20:12:01.575413 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 20:12:01.575451 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575485 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575526 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-341598907/tls.crt::/tmp/serving-cert-341598907/tls.key\\\\\\\"\\\\nI1205 20:12:01.575666 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1205 20:12:01.575882 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 20:12:01.575895 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 20:12:01.576170 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 20:12:01.576169 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 20:12:01.576203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 20:12:01.576188 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.827848 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ba4034f-d40d-4c5d-a52b-c9895a8a7f0b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://073f99fe786948da31c584468c3562984559623b590ba77c7925b79a1bb6d973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fb5cbfee6c3da72b94b14ed486036e65308444c961b583e691cc89e604a11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec935cce3c02816ac1f6162ca3a1701cd06dd930d3024ff958a4d92cf1dc0821\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736f67bbeab4ebc7e144d6d4af34487d18c3c8362425e45a9e0969b61d896666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.845486 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.862610 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t96qs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5f4b5ef-d8d0-4591-a19c-a8347f74e833\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6624b0a035b3190da055f2ec82cb45b59268d035933912d9b87a20fd4c1cf128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9wkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b9a9e4321317cf95ec97bae86da81152ecba3d49e499d83e113d89e8b997247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2
099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9wkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t96qs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.875673 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89575271b3bd2ad2c040d9e6b00f221610af97e8dd8b120a2956b75f8bbdd237\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-05T20:12:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.876644 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.876668 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.876677 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.876688 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.876712 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:31Z","lastTransitionTime":"2025-12-05T20:12:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.892497 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aa73309acebd7dcba36d2990a8cf1f6ef1fbf78086a1455b158945063e1da3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b0deb0769ed0ae3221e161c0aab68192a5d5d1f511ccfaf8c6ba2f01334de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.905468 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0577f397a9f6d93347cbc4c8693307a1df70636365bb481d910f29aa70a3147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.914159 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-67k68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cf77c02028027dc6d07f225e1f2352cd58a2a96280315e23f43b50e7c719cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzqtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-67k68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.936239 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55fbdf03-712c-4abc-9847-225fe63052e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://754660b165d5db2e99f7a7f53bb46148e13677fd0e632c71885bcc93f8c01864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://754660b165d5db2e99f7a7f53bb46148e13677fd0e632c71885bcc93f8c01864\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:12:13Z\\\",\\\"message\\\":\\\"r/kube-controller-manager-crc in node crc\\\\nI1205 20:12:13.100315 6323 services_controller.go:452] Built service openshift-service-ca-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1205 20:12:13.100853 6323 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1205 20:12:13.100859 6323 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1205 20:12:13.100847 6323 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-8v9t9 after 0 failed attempt(s)\\\\nI1205 20:12:13.100867 6323 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-8v9t9\\\\nI1205 20:12:13.100858 6323 services_controller.go:453] Built service openshift-service-ca-operator/metrics template LB for network=default: []services.LB{}\\\\nI1205 20:12:13.100879 6323 services_controller.go:454] Service openshift-service-ca-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1205 20:12:13.100887 6323 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dsvd6_openshift-ovn-kubernetes(55fbdf03-712c-4abc-9847-225fe63052e3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dsvd6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.979106 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.979146 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.979156 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.979172 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:31 crc kubenswrapper[4904]: I1205 20:12:31.979182 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:31Z","lastTransitionTime":"2025-12-05T20:12:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.081705 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.081740 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.081753 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.081769 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.081780 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:32Z","lastTransitionTime":"2025-12-05T20:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.184744 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.184795 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.184807 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.184834 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.184851 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:32Z","lastTransitionTime":"2025-12-05T20:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.287739 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.287822 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.287840 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.287864 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.287881 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:32Z","lastTransitionTime":"2025-12-05T20:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.390435 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.390500 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.390517 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.390542 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.390561 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:32Z","lastTransitionTime":"2025-12-05T20:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.493693 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.493757 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.493779 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.493801 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.493815 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:32Z","lastTransitionTime":"2025-12-05T20:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.596675 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.596716 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.596726 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.596741 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.596753 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:32Z","lastTransitionTime":"2025-12-05T20:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.640399 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.664604 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89575271b3bd2ad2c040d9e6b00f221610af97e8dd8b120a2956b75f8bbdd237\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:32Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.680480 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.680567 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.680726 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:12:32 crc kubenswrapper[4904]: E1205 20:12:32.680738 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.680772 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:32 crc kubenswrapper[4904]: E1205 20:12:32.680866 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:12:32 crc kubenswrapper[4904]: E1205 20:12:32.680930 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:12:32 crc kubenswrapper[4904]: E1205 20:12:32.680983 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.681861 4904 scope.go:117] "RemoveContainer" containerID="754660b165d5db2e99f7a7f53bb46148e13677fd0e632c71885bcc93f8c01864" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.687307 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aa73309acebd7dcba36d2990a8cf1f6ef1fbf78086a1455b158945063e1da3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b0deb0769ed0ae3221e161c0aab68192a5d5d1f511ccfaf8c6ba2f01334de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:32Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.701100 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.701151 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.701170 4904 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.701193 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.701211 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:32Z","lastTransitionTime":"2025-12-05T20:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.705870 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0577f397a9f6d93347cbc4c8693307a1df70636365bb481d910f29aa70a3147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:32Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.720450 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-67k68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cf77c02028027dc6d07f225e1f2352cd58a2a96280315e23f43b50e7c719cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzqtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-67k68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:32Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.740210 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55fbdf03-712c-4abc-9847-225fe63052e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://754660b165d5db2e99f7a7f53bb46148e13677fd0e632c71885bcc93f8c01864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://754660b165d5db2e99f7a7f53bb46148e13677fd0e632c71885bcc93f8c01864\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:12:13Z\\\",\\\"message\\\":\\\"r/kube-controller-manager-crc in node crc\\\\nI1205 20:12:13.100315 6323 services_controller.go:452] Built service openshift-service-ca-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1205 20:12:13.100853 6323 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1205 20:12:13.100859 6323 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1205 20:12:13.100847 6323 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-8v9t9 after 0 failed attempt(s)\\\\nI1205 20:12:13.100867 6323 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-8v9t9\\\\nI1205 20:12:13.100858 6323 services_controller.go:453] Built service openshift-service-ca-operator/metrics template LB for network=default: []services.LB{}\\\\nI1205 20:12:13.100879 6323 services_controller.go:454] Service openshift-service-ca-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1205 20:12:13.100887 6323 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dsvd6_openshift-ovn-kubernetes(55fbdf03-712c-4abc-9847-225fe63052e3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dsvd6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:32Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.755493 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:32Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.771754 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gfzvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5e16893370ed8543ecf33c4b2373bda50a424810470178390a553f2755323d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6gph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gfzvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:32Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.782191 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:32Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.791867 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8v9t9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78ec080e-d24e-458b-8622-465dd74773a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c927c7ee8c52a2f1bcca785070514c72b8eafa9cd7d4f58d77fd80f1dee6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj68j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8v9t9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-05T20:12:32Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.803483 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.803672 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.803790 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.803874 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.803949 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:32Z","lastTransitionTime":"2025-12-05T20:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.807531 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cc24b64-e25f-4b55-9123-295388685e7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa5a898deb51018e4a1196537853e1d8fda8121a2b87d6d3ebb4ab04ad6a0507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08368507f1a5c9ee1b93c9a2b111939cbe36a99f87758d0de78b146309871438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffd2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:32Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.823451 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-52vmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://140e572b1d15ec9aec68ea9d5b436e7ec2a1e886ed70b12c5ed011a12aa7646a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0
611bc440\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binar
y-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated
\\\":{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-52vmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:32Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.832609 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d8xkk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7fjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7fjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d8xkk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:32Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.843423 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf1f4c63-efa9-43dd-9fd4-56f617de9b74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dc00b34dfd0d80ded26bfcc542e84550c50fe8dfabb2eab1f8b39c0402930ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbeab9029a6d82f443afa458697942e9ce20465077f1adefc0123dfab2ea76a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22659d9c71fc60ad0348f1031cb52b72af9f93c69c11c04c140aece5e90efb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fce601ec5b7e54b14668b59d90721a107f392ed41a54f1a5b0446d48fbdb50f\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fce601ec5b7e54b14668b59d90721a107f392ed41a54f1a5b0446d48fbdb50f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:32Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.856567 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"519e4f29-5762-4a86-9f31-eae681598fd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operato
r@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a972523cbe1d769fb13c365f030eab38907c5f877d667e034c62c5f3a0e43316\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"(now=2025-12-05 20:12:01.575139975 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575358 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764965521\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764965521\\\\\\\\\\\\\\\" (2025-12-05 19:12:01 +0000 UTC to 2026-12-05 19:12:01 +0000 UTC (now=2025-12-05 20:12:01.57533306 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575383 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 20:12:01.575413 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 20:12:01.575451 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575485 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575526 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-341598907/tls.crt::/tmp/serving-cert-341598907/tls.key\\\\\\\"\\\\nI1205 20:12:01.575666 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1205 20:12:01.575882 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 20:12:01.575895 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 20:12:01.576170 
1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 20:12:01.576169 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 20:12:01.576203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 20:12:01.576188 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:32Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.869701 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ba4034f-d40d-4c5d-a52b-c9895a8a7f0b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://073f99fe786948da31c584468c3562984559623b590ba77c7925b79a1bb6d973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fb5cbfee6c3da72b94b14ed486036e65308444c961b583e691cc89e604a11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec935cce3c02816ac1f6162ca3a1701cd06dd930d3024ff958a4d92cf1dc0821\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736f67bbeab4ebc7e144d6d4af34487d18c3c8362425e45a9e0969b61d896666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:32Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.882723 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:32Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.894377 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t96qs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5f4b5ef-d8d0-4591-a19c-a8347f74e833\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6624b0a035b3190da055f2ec82cb45b59268d035933912d9b87a20fd4c1cf128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9wkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b9a9e4321317cf95ec97bae86da81152ecba3d49e499d83e113d89e8b997247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2
099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9wkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t96qs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:32Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.907178 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.907238 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.907253 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.907270 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.907280 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:32Z","lastTransitionTime":"2025-12-05T20:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.951491 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.951636 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.951665 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:32 crc kubenswrapper[4904]: E1205 20:12:32.951704 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:13:04.951668849 +0000 UTC m=+83.762884968 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.951765 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:32 crc kubenswrapper[4904]: E1205 20:12:32.951846 4904 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:12:32 crc kubenswrapper[4904]: E1205 20:12:32.951866 4904 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:12:32 crc kubenswrapper[4904]: I1205 20:12:32.951895 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:32 crc kubenswrapper[4904]: E1205 
20:12:32.951959 4904 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 05 20:12:32 crc kubenswrapper[4904]: E1205 20:12:32.951966 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:13:04.951935246 +0000 UTC m=+83.763151435 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Dec 05 20:12:32 crc kubenswrapper[4904]: E1205 20:12:32.951978 4904 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 05 20:12:32 crc kubenswrapper[4904]: E1205 20:12:32.951995 4904 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 05 20:12:32 crc kubenswrapper[4904]: E1205 20:12:32.952044 4904 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 05 20:12:32 crc kubenswrapper[4904]: E1205 20:12:32.952083 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 20:13:04.952036219 +0000 UTC m=+83.763252388 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 05 20:12:32 crc kubenswrapper[4904]: E1205 20:12:32.951877 4904 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 05 20:12:32 crc kubenswrapper[4904]: E1205 20:12:32.952112 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:13:04.952101871 +0000 UTC m=+83.763318100 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 05 20:12:32 crc kubenswrapper[4904]: E1205 20:12:32.952118 4904 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 05 20:12:32 crc kubenswrapper[4904]: E1205 20:12:32.952167 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 20:13:04.952155022 +0000 UTC m=+83.763371261 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.010500 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.010546 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.010560 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.010579 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.010590 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:33Z","lastTransitionTime":"2025-12-05T20:12:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.094326 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dsvd6_55fbdf03-712c-4abc-9847-225fe63052e3/ovnkube-controller/1.log"
Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.097525 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" event={"ID":"55fbdf03-712c-4abc-9847-225fe63052e3","Type":"ContainerStarted","Data":"59d91343958cca34483f9737ebe1072f132d381e7a1d980d9c718063ad95f470"}
Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.098023 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6"
Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.113212 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.113261 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.113270 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.113294 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.113310 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:33Z","lastTransitionTime":"2025-12-05T20:12:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.115641 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.128426 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8v9t9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78ec080e-d24e-458b-8622-465dd74773a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c927c7ee8c52a2f1bcca785070514c72b8eafa9cd7d4f58d77fd80f1dee6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj68j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8v9t9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-05T20:12:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.142868 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cc24b64-e25f-4b55-9123-295388685e7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa5a898deb51018e4a1196537853e1d8fda8121a2b87d6d3ebb4ab04ad6a0507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08368507f1a5c9ee1b93c9a2b111939cbe36a99f87758d0de78b146309871438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffd2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.169966 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-52vmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://140e572b1d15ec9aec68ea9d5b436e7ec2a1e886ed70b12c5ed011a12aa7646a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:05Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-52vmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:33Z is after 
2025-08-24T17:21:41Z" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.180653 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d8xkk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7fjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7fjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d8xkk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.190258 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf1f4c63-efa9-43dd-9fd4-56f617de9b74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dc00b34dfd0d80ded26bfcc542e84550c50fe8dfabb2eab1f8b39c0402930ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbeab9029a6d82f443afa458697942e9ce20465077f1adefc0123dfab2ea76a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22659d9c71fc60ad0348f1031cb52b72af9f93c69c11c04c140aece5e90efb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fce601ec5b7e54b14668b59d90721a107f392ed41a54f1a5b0446d48fbdb50f\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fce601ec5b7e54b14668b59d90721a107f392ed41a54f1a5b0446d48fbdb50f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.211868 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"519e4f29-5762-4a86-9f31-eae681598fd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operato
r@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a972523cbe1d769fb13c365f030eab38907c5f877d667e034c62c5f3a0e43316\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"(now=2025-12-05 20:12:01.575139975 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575358 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764965521\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764965521\\\\\\\\\\\\\\\" (2025-12-05 19:12:01 +0000 UTC to 2026-12-05 19:12:01 +0000 UTC (now=2025-12-05 20:12:01.57533306 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575383 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 20:12:01.575413 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 20:12:01.575451 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575485 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575526 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-341598907/tls.crt::/tmp/serving-cert-341598907/tls.key\\\\\\\"\\\\nI1205 20:12:01.575666 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1205 20:12:01.575882 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 20:12:01.575895 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 20:12:01.576170 
1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 20:12:01.576169 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 20:12:01.576203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 20:12:01.576188 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.215530 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 
20:12:33.215566 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.215579 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.215595 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.215607 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:33Z","lastTransitionTime":"2025-12-05T20:12:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.225832 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ba4034f-d40d-4c5d-a52b-c9895a8a7f0b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://073f99fe786948da31c584468c3562984559623b590ba77c7925b79a1bb6d973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fb5cbfee6c3da72b94b14ed486036e65308444c961b583e691cc89e604a11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resou
rces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec935cce3c02816ac1f6162ca3a1701cd06dd930d3024ff958a4d92cf1dc0821\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736f67bbeab4ebc7e144d6d4af34487d18c3c8362425e45a9e0969b61d896666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.245336 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.264251 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t96qs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5f4b5ef-d8d0-4591-a19c-a8347f74e833\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6624b0a035b3190da055f2ec82cb45b59268d035933912d9b87a20fd4c1cf128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9wkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b9a9e4321317cf95ec97bae86da81152ecba3d49e499d83e113d89e8b997247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9wkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t96qs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:33Z is after 2025-08-24T17:21:41Z" Dec 05 
20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.279694 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89575271b3bd2ad2c040d9e6b00f221610af97e8dd8b120a2956b75f8bbdd237\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.296461 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aa73309acebd7dcba36d2990a8cf1f6ef1fbf78086a1455b158945063e1da3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b0deb0769ed0ae3221e161c0aab68192a5d5d1f511ccfaf8c6ba2f01334de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.308023 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0577f397a9f6d93347cbc4c8693307a1df70636365bb481d910f29aa70a3147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.317674 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-67k68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cf77c02028027dc6d07f225e1f2352cd58a2a96280315e23f43b50e7c719cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzqtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-67k68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.318240 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.318276 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.318286 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.318300 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.318309 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:33Z","lastTransitionTime":"2025-12-05T20:12:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.333778 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55fbdf03-712c-4abc-9847-225fe63052e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servic
eaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d91343958cca34483f9737ebe1072f132d381e7a1d980d9c718063ad95f470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://754660b165d5db2e99f7a7f53bb46148e13677fd0e632c71885bcc93f8c01864\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:12:13Z\\\",\\\"message\\\":\\\"r/kube-controller-manager-crc in node crc\\\\nI1205 20:12:13.100315 6323 services_controller.go:452] Built service openshift-service-ca-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1205 20:12:13.100853 6323 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1205 20:12:13.100859 6323 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1205 20:12:13.100847 6323 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-8v9t9 after 0 failed attempt(s)\\\\nI1205 20:12:13.100867 6323 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-8v9t9\\\\nI1205 20:12:13.100858 6323 services_controller.go:453] Built service openshift-service-ca-operator/metrics template LB for network=default: []services.LB{}\\\\nI1205 20:12:13.100879 6323 services_controller.go:454] Service openshift-service-ca-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1205 20:12:13.100887 6323 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dsvd6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.344911 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.362740 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gfzvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5e16893370ed8543ecf33c4b2373bda50a424810470178390a553f2755323d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6gph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gfzvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.421022 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.421078 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.421088 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.421102 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.421112 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:33Z","lastTransitionTime":"2025-12-05T20:12:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.523414 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.523451 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.523459 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.523474 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.523483 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:33Z","lastTransitionTime":"2025-12-05T20:12:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.626216 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.626282 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.626300 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.626326 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.626344 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:33Z","lastTransitionTime":"2025-12-05T20:12:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.728799 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.728831 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.728842 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.728857 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.728867 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:33Z","lastTransitionTime":"2025-12-05T20:12:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.832337 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.832386 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.832403 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.832426 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.832440 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:33Z","lastTransitionTime":"2025-12-05T20:12:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.934828 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.934880 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.934896 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.934916 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:33 crc kubenswrapper[4904]: I1205 20:12:33.934928 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:33Z","lastTransitionTime":"2025-12-05T20:12:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.037909 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.037990 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.038019 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.038048 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.038107 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:34Z","lastTransitionTime":"2025-12-05T20:12:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.140224 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.140264 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.140273 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.140288 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.140302 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:34Z","lastTransitionTime":"2025-12-05T20:12:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.243533 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.243594 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.243613 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.243637 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.243653 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:34Z","lastTransitionTime":"2025-12-05T20:12:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.346976 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.347043 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.347096 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.347124 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.347144 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:34Z","lastTransitionTime":"2025-12-05T20:12:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.450619 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.450674 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.450688 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.450709 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.450726 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:34Z","lastTransitionTime":"2025-12-05T20:12:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.553517 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.553577 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.553597 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.553623 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.553642 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:34Z","lastTransitionTime":"2025-12-05T20:12:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.656700 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.656746 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.656757 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.656779 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.656796 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:34Z","lastTransitionTime":"2025-12-05T20:12:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.680684 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.680713 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.680883 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.680910 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:12:34 crc kubenswrapper[4904]: E1205 20:12:34.681007 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:12:34 crc kubenswrapper[4904]: E1205 20:12:34.681161 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91" Dec 05 20:12:34 crc kubenswrapper[4904]: E1205 20:12:34.681269 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:12:34 crc kubenswrapper[4904]: E1205 20:12:34.681331 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.759614 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.759681 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.759695 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.759715 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.759727 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:34Z","lastTransitionTime":"2025-12-05T20:12:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.862950 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.862989 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.863000 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.863019 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.863033 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:34Z","lastTransitionTime":"2025-12-05T20:12:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.966471 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.966536 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.966557 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.966587 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:34 crc kubenswrapper[4904]: I1205 20:12:34.966609 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:34Z","lastTransitionTime":"2025-12-05T20:12:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.072397 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.072439 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.072450 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.072469 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.072480 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:35Z","lastTransitionTime":"2025-12-05T20:12:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.114288 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dsvd6_55fbdf03-712c-4abc-9847-225fe63052e3/ovnkube-controller/2.log" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.115181 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dsvd6_55fbdf03-712c-4abc-9847-225fe63052e3/ovnkube-controller/1.log" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.118403 4904 generic.go:334] "Generic (PLEG): container finished" podID="55fbdf03-712c-4abc-9847-225fe63052e3" containerID="59d91343958cca34483f9737ebe1072f132d381e7a1d980d9c718063ad95f470" exitCode=1 Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.118471 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" event={"ID":"55fbdf03-712c-4abc-9847-225fe63052e3","Type":"ContainerDied","Data":"59d91343958cca34483f9737ebe1072f132d381e7a1d980d9c718063ad95f470"} Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.118521 4904 scope.go:117] "RemoveContainer" containerID="754660b165d5db2e99f7a7f53bb46148e13677fd0e632c71885bcc93f8c01864" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.119725 4904 scope.go:117] "RemoveContainer" containerID="59d91343958cca34483f9737ebe1072f132d381e7a1d980d9c718063ad95f470" Dec 05 20:12:35 crc kubenswrapper[4904]: E1205 20:12:35.120015 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-dsvd6_openshift-ovn-kubernetes(55fbdf03-712c-4abc-9847-225fe63052e3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.133540 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.152150 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gfzvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5e16893370ed8543ecf33c4b2373bda50a424810470178390a553f2755323d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6gph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gfzvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.169047 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d8xkk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7fjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7fjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d8xkk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.175003 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.175285 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.175328 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.175357 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.175377 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:35Z","lastTransitionTime":"2025-12-05T20:12:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.182446 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.204279 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8v9t9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78ec080e-d24e-458b-8622-465dd74773a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c927c7ee8c52a2f1bcca785070514c72b8eafa9cd7d4f58d77fd80f1dee6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj68j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8v9t9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.222847 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cc24b64-e25f-4b55-9123-295388685e7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa5a898deb51018e4a1196537853e1d8fda8121a2b87d6d3ebb4ab04ad6a0507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08368507f1a5c9ee1b93c9a2b111939cbe36a99f87758d0de78b146309871438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffd2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.239419 4904 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.239905 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.240010 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.240148 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.240297 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:35Z","lastTransitionTime":"2025-12-05T20:12:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.243307 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-52vmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://140e572b1d15ec9aec68ea9d5b436e7ec2a1e886ed70b12c5ed011a12aa7646a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2
c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/
secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:07Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-52vmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:35 crc kubenswrapper[4904]: E1205 20:12:35.260902 4904 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"54b04eb7-ddef-4ce3-9daa-6d051611390c\\\",\\\"systemUUID\\\":\\\"cfb4a8d1-bf0d-4f7a-95e4-63a42b6fb559\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:35Z is after 
2025-08-24T17:21:41Z" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.262459 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.265608 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.265658 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.265669 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.265690 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.265709 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:35Z","lastTransitionTime":"2025-12-05T20:12:35Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.276672 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t96qs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5f4b5ef-d8d0-4591-a19c-a8347f74e833\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6624b0a035b3190da055f2ec82cb45b59268d035933912d9b87a20fd4c1cf128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9wkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b9a9e4321317cf95ec97bae86da81152ecba3d49e499d83e113d89e8b997247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9wkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:14Z\\\"}
}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t96qs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:35 crc kubenswrapper[4904]: E1205 20:12:35.277810 4904 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"54b04eb7-ddef-4ce3-9daa-6d051611390c\\\",\\\"systemUUID\\\":\\\"cfb4a8d1-bf0d-4f7a-95e4-63a42b6fb559\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:35Z is after 
2025-08-24T17:21:41Z" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.283173 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.283216 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.283227 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.283249 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.283261 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:35Z","lastTransitionTime":"2025-12-05T20:12:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.290651 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf1f4c63-efa9-43dd-9fd4-56f617de9b74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dc00b34dfd0d80ded26bfcc542e84550c50fe8dfabb2eab1f8b39c0402930ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbeab9029a6d82f443afa458697942e9ce20465077f1adefc0123dfab2ea76a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-synce
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22659d9c71fc60ad0348f1031cb52b72af9f93c69c11c04c140aece5e90efb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fce601ec5b7e54b14668b59d90721a107f392ed41a54f1a5b0446d48fbdb50f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fce601ec5b7e54b14668b59d90721a107f392ed41a54f1a5b0446d48fbdb50f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:35 crc kubenswrapper[4904]: E1205 20:12:35.294593 4904 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"54b04eb7-ddef-4ce3-9daa-6d051611390c\\\",\\\"systemUUID\\\":\\\"cfb4a8d1-bf0d-4f7a-95e4-63a42b6fb559\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.299863 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.299910 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.299926 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.299949 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.299962 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:35Z","lastTransitionTime":"2025-12-05T20:12:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.305506 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"519e4f29-5762-4a86-9f31-eae681598fd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a972523cbe1d769fb13c365f030eab38907c5f877d667e034c62c5f3a0e43316\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"(now=2025-12-05 20:12:01.575139975 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575358 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764965521\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764965521\\\\\\\\\\\\\\\" (2025-12-05 19:12:01 +0000 UTC to 2026-12-05 19:12:01 +0000 UTC (now=2025-12-05 20:12:01.57533306 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575383 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 20:12:01.575413 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 20:12:01.575451 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575485 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575526 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-341598907/tls.crt::/tmp/serving-cert-341598907/tls.key\\\\\\\"\\\\nI1205 20:12:01.575666 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1205 20:12:01.575882 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 20:12:01.575895 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 20:12:01.576170 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 20:12:01.576169 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 20:12:01.576203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 20:12:01.576188 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:35 crc kubenswrapper[4904]: E1205 20:12:35.311919 4904 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:35Z\\\",\\\"message\\\":\\\"kubelet has 
sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df
807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"s
izeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"54b04eb7-ddef-4ce3-9daa-6d051611390c\\\",\\\"systemUUID\\\":\\\"cfb4a8d1-bf0d-4f7a-95e4-63a42b6fb559\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.317521 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.317563 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.317619 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.317645 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.317660 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:35Z","lastTransitionTime":"2025-12-05T20:12:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.321980 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ba4034f-d40d-4c5d-a52b-c9895a8a7f0b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://073f99fe786948da31c584468c3562984559623b590ba77c7925b79a1bb6d973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fb5cbfee6c3da72b94b14ed486036e65308444c961b583e691cc89e604a11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec935cce3c02816ac1f6162ca3a1701cd06dd930d3024ff958a4d92cf1dc0821\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736f67bbeab4ebc7e144d6d4af34487d18c3c8362425e45a9e0969b61d896666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:35 crc kubenswrapper[4904]: E1205 20:12:35.329618 4904 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"54b04eb7-ddef-4ce3-9daa-6d051611390c\\\",\\\"systemUUID\\\":\\\"cfb4a8d1-bf0d-4f7a-95e4-63a42b6fb559\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:35Z is after 
2025-08-24T17:21:41Z" Dec 05 20:12:35 crc kubenswrapper[4904]: E1205 20:12:35.329771 4904 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.332130 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.332165 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.332176 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.332196 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.332209 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:35Z","lastTransitionTime":"2025-12-05T20:12:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.341181 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55fbdf03-712c-4abc-9847-225fe63052e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d91343958cca34483f9737ebe1072f132d381e
7a1d980d9c718063ad95f470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://754660b165d5db2e99f7a7f53bb46148e13677fd0e632c71885bcc93f8c01864\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:12:13Z\\\",\\\"message\\\":\\\"r/kube-controller-manager-crc in node crc\\\\nI1205 20:12:13.100315 6323 services_controller.go:452] Built service openshift-service-ca-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1205 20:12:13.100853 6323 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1205 20:12:13.100859 6323 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1205 20:12:13.100847 6323 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-8v9t9 after 0 failed attempt(s)\\\\nI1205 20:12:13.100867 6323 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-8v9t9\\\\nI1205 20:12:13.100858 6323 services_controller.go:453] Built service openshift-service-ca-operator/metrics template LB for network=default: []services.LB{}\\\\nI1205 20:12:13.100879 6323 services_controller.go:454] Service openshift-service-ca-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1205 20:12:13.100887 6323 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59d91343958cca34483f9737ebe1072f132d381e7a1d980d9c718063ad95f470\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:12:34Z\\\",\\\"message\\\":\\\":{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:12:33.904027 6587 ovnkube.go:599] Stopped ovnkube\\\\nI1205 20:12:33.904087 6587 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1205 20:12:33.904035 6587 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-operator/metrics]} name:Service_openshift-ingress-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] 
Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1205 20:12:33.904171 6587 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\
\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dsvd6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.356475 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89575271b3bd2ad2c040d9e6b00f221610af97e8dd8b120a2956b75f8bbdd237\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.368733 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aa73309acebd7dcba36d2990a8cf1f6ef1fbf78086a1455b158945063e1da3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b0deb0769ed0ae3221e161c0aab68192a5d5d1f511ccfaf8c6ba2f01334de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.382620 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0577f397a9f6d93347cbc4c8693307a1df70636365bb481d910f29aa70a3147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.396290 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-67k68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cf77c02028027dc6d07f225e1f2352cd58a2a96280315e23f43b50e7c719cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzqtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-67k68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.434561 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.434601 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.434612 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.434627 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.434635 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:35Z","lastTransitionTime":"2025-12-05T20:12:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.537595 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.537673 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.537692 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.537720 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.537743 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:35Z","lastTransitionTime":"2025-12-05T20:12:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.639772 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.639813 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.639823 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.639839 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.639850 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:35Z","lastTransitionTime":"2025-12-05T20:12:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.743115 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.743164 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.743174 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.743194 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.743209 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:35Z","lastTransitionTime":"2025-12-05T20:12:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.846850 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.847299 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.847383 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.847470 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.847553 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:35Z","lastTransitionTime":"2025-12-05T20:12:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.950704 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.950748 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.950761 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.950779 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:35 crc kubenswrapper[4904]: I1205 20:12:35.950792 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:35Z","lastTransitionTime":"2025-12-05T20:12:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.053544 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.053599 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.053611 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.053627 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.053637 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:36Z","lastTransitionTime":"2025-12-05T20:12:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.125530 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dsvd6_55fbdf03-712c-4abc-9847-225fe63052e3/ovnkube-controller/2.log" Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.156435 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.156489 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.156504 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.156523 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.156533 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:36Z","lastTransitionTime":"2025-12-05T20:12:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.259351 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.259394 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.259406 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.259429 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.259444 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:36Z","lastTransitionTime":"2025-12-05T20:12:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.362574 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.362633 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.362648 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.362670 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.362689 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:36Z","lastTransitionTime":"2025-12-05T20:12:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.466159 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.466550 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.466621 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.466705 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.466775 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:36Z","lastTransitionTime":"2025-12-05T20:12:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.569730 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.569786 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.569806 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.569833 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.569853 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:36Z","lastTransitionTime":"2025-12-05T20:12:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.673384 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.673453 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.673477 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.673508 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.673530 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:36Z","lastTransitionTime":"2025-12-05T20:12:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.680623 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.680648 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:36 crc kubenswrapper[4904]: E1205 20:12:36.681090 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.680774 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:12:36 crc kubenswrapper[4904]: E1205 20:12:36.681559 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91" Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.680719 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:36 crc kubenswrapper[4904]: E1205 20:12:36.681803 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:12:36 crc kubenswrapper[4904]: E1205 20:12:36.681154 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.777561 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.777920 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.778256 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.778490 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.778709 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:36Z","lastTransitionTime":"2025-12-05T20:12:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.881680 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.881747 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.881767 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.881788 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.881802 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:36Z","lastTransitionTime":"2025-12-05T20:12:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.984933 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.984997 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.985021 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.985050 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:36 crc kubenswrapper[4904]: I1205 20:12:36.985111 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:36Z","lastTransitionTime":"2025-12-05T20:12:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:37 crc kubenswrapper[4904]: I1205 20:12:37.088013 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:37 crc kubenswrapper[4904]: I1205 20:12:37.088092 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:37 crc kubenswrapper[4904]: I1205 20:12:37.088110 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:37 crc kubenswrapper[4904]: I1205 20:12:37.088134 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:37 crc kubenswrapper[4904]: I1205 20:12:37.088151 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:37Z","lastTransitionTime":"2025-12-05T20:12:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 05 20:12:37 crc kubenswrapper[4904]: I1205 20:12:37.190854 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:12:37 crc kubenswrapper[4904]: I1205 20:12:37.190927 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:12:37 crc kubenswrapper[4904]: I1205 20:12:37.190949 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:12:37 crc kubenswrapper[4904]: I1205 20:12:37.190974 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:12:37 crc kubenswrapper[4904]: I1205 20:12:37.190994 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:37Z","lastTransitionTime":"2025-12-05T20:12:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
[... the same five-entry status cycle (NodeHasSufficientMemory, NodeHasNoDiskPressure, NodeHasSufficientPID, NodeNotReady, "Node became not ready" with an identical Ready=False condition) repeats roughly every 100 ms from 20:12:37.294 through 20:12:38.636; only the timestamps differ ...]
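The condition the kubelet keeps republishing above names its single root cause in the message text: there is no CNI configuration file in /etc/kubernetes/cni/net.d/. Below is a minimal diagnostic sketch of that same directory check, assuming Python 3 on the node; the extension list mirrors the conventional CNI config suffixes (.conf, .conflist, .json) and the helper name is hypothetical, not part of the kubelet.

#!/usr/bin/env python3
# Hypothetical diagnostic: report whether any CNI network config exists in the
# directory the kubelet's NetworkPluginNotReady message complains about.
import os

CNI_CONF_DIR = "/etc/kubernetes/cni/net.d/"        # path taken from the log message
CNI_EXTENSIONS = (".conf", ".conflist", ".json")   # assumption: conventional CNI config suffixes

def cni_configs(conf_dir: str = CNI_CONF_DIR) -> list:
    """Return CNI config files found in conf_dir; empty if none or if the dir is missing."""
    try:
        names = os.listdir(conf_dir)
    except FileNotFoundError:
        return []
    return sorted(n for n in names if n.endswith(CNI_EXTENSIONS))

if __name__ == "__main__":
    found = cni_configs()
    if found:
        print("CNI configs present:", ", ".join(found))
    else:
        # Corresponds to the condition in the log: NetworkReady=false / NetworkPluginNotReady
        print("no CNI configuration file in", CNI_CONF_DIR)

An empty result corresponds to the Ready=False condition above; the cycle should clear once the network provider writes a config into that directory.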
Dec 05 20:12:38 crc kubenswrapper[4904]: I1205 20:12:38.681284 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 20:12:38 crc kubenswrapper[4904]: I1205 20:12:38.681344 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 20:12:38 crc kubenswrapper[4904]: I1205 20:12:38.681384 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 20:12:38 crc kubenswrapper[4904]: E1205 20:12:38.681551 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 20:12:38 crc kubenswrapper[4904]: I1205 20:12:38.681601 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk"
Dec 05 20:12:38 crc kubenswrapper[4904]: E1205 20:12:38.681774 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 20:12:38 crc kubenswrapper[4904]: E1205 20:12:38.681931 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91"
Dec 05 20:12:38 crc kubenswrapper[4904]: E1205 20:12:38.682109 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
[... the node-status cycle continues every ~100 ms from 20:12:38.739 through 20:12:40.603, unchanged apart from timestamps ...]
Dec 05 20:12:40 crc kubenswrapper[4904]: I1205 20:12:40.680372 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 20:12:40 crc kubenswrapper[4904]: I1205 20:12:40.680614 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk"
Dec 05 20:12:40 crc kubenswrapper[4904]: E1205 20:12:40.680756 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 20:12:40 crc kubenswrapper[4904]: I1205 20:12:40.680782 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 20:12:40 crc kubenswrapper[4904]: E1205 20:12:40.680857 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91"
Dec 05 20:12:40 crc kubenswrapper[4904]: E1205 20:12:40.680889 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 20:12:40 crc kubenswrapper[4904]: I1205 20:12:40.680419 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 20:12:40 crc kubenswrapper[4904]: E1205 20:12:40.681149 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
[... the node-status cycle continues every ~100 ms from 20:12:40.706 through 20:12:41.674, unchanged apart from timestamps ...]
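The Ready=False condition the kubelet writes in the setters.go entries is also visible from the API side, which is often the quicker way to confirm node state while the journal is churning. A sketch assuming the kubernetes Python client and a working kubeconfig; the node name "crc" is taken from the log:

#!/usr/bin/env python3
# Hypothetical check: read the node's Ready condition via the Kubernetes API,
# the same condition the kubelet is recording in the setters.go lines above.
from kubernetes import client, config

config.load_kube_config()   # assumes a reachable API server and valid kubeconfig
v1 = client.CoreV1Api()
node = v1.read_node("crc")  # node name taken from the log
ready = next(c for c in node.status.conditions if c.type == "Ready")
print(ready.status, ready.reason, ready.message)

While the CNI config is missing, this should print False KubeletNotReady followed by the same network-not-ready message seen above.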
Dec 05 20:12:41 crc kubenswrapper[4904]: I1205 20:12:41.692713 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8v9t9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78ec080e-d24e-458b-8622-465dd74773a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c927c7ee8c52a2f1bcca785070514c72b8eafa9cd7d4f58d77fd80f1dee6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj68j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8v9t9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:41Z is after 2025-08-24T17:21:41Z"
Dec 05 20:12:41 crc kubenswrapper[4904]: I1205 20:12:41.704579 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cc24b64-e25f-4b55-9123-295388685e7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa5a898deb51018e4a1196537853e1d8fda8121a2b87d6d3ebb4ab04ad6a0507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08368507f1a5c9ee1b93c9a2b111939cbe36a99f87758d0de78b146309871438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffd2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:41Z is after 2025-08-24T17:21:41Z"
Dec 05 20:12:41 crc kubenswrapper[4904]: I1205 20:12:41.719988 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-52vmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://140e572b1d15ec9aec68ea9d5b436e7ec2a1e886ed70b12c5ed011a12aa7646a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-52vmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:41Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:41 crc kubenswrapper[4904]: I1205 20:12:41.731677 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d8xkk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7fjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7fjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d8xkk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:41Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:41 crc kubenswrapper[4904]: I1205 20:12:41.744140 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:41Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:41 crc kubenswrapper[4904]: I1205 20:12:41.759571 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"519e4f29-5762-4a86-9f31-eae681598fd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a972523cbe1d769fb13c365f030eab38907c5f877d667e034c62c5f3a0e43316\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"(now=2025-12-05 20:12:01.575139975 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575358 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764965521\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764965521\\\\\\\\\\\\\\\" (2025-12-05 19:12:01 +0000 UTC to 2026-12-05 19:12:01 +0000 UTC (now=2025-12-05 20:12:01.57533306 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575383 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 20:12:01.575413 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 20:12:01.575451 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575485 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575526 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-341598907/tls.crt::/tmp/serving-cert-341598907/tls.key\\\\\\\"\\\\nI1205 20:12:01.575666 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1205 20:12:01.575882 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 20:12:01.575895 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 20:12:01.576170 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 20:12:01.576169 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 20:12:01.576203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 20:12:01.576188 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:41Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:41 crc kubenswrapper[4904]: I1205 20:12:41.775536 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ba4034f-d40d-4c5d-a52b-c9895a8a7f0b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://073f99fe786948da31c584468c3562984559623b590ba77c7925b79a1bb6d973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fb5cbfee6c3da72b94b14ed486036e65308444c961b583e691cc89e604a11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec935cce3c02816ac1f6162ca3a1701cd06dd930d3024ff958a4d92cf1dc0821\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736f67bbeab4ebc7e144d6d4af34487d18c3c8362425e45a9e0969b61d896666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:41Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:41 crc kubenswrapper[4904]: I1205 20:12:41.778241 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:41 crc kubenswrapper[4904]: I1205 20:12:41.778290 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:41 crc kubenswrapper[4904]: I1205 20:12:41.778303 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:41 crc kubenswrapper[4904]: I1205 20:12:41.780121 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:41 crc kubenswrapper[4904]: I1205 20:12:41.780165 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:41Z","lastTransitionTime":"2025-12-05T20:12:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:41 crc kubenswrapper[4904]: I1205 20:12:41.788241 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:41Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:41 crc kubenswrapper[4904]: I1205 20:12:41.803579 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t96qs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5f4b5ef-d8d0-4591-a19c-a8347f74e833\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6624b0a035b3190da055f2ec82cb45b59268d035933912d9b87a20fd4c1cf128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9wkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b9a9e4321317cf95ec97bae86da81152ecba3d49e499d83e113d89e8b997247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9wkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t96qs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:41Z is after 2025-08-24T17:21:41Z" Dec 05 
20:12:41 crc kubenswrapper[4904]: I1205 20:12:41.815348 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf1f4c63-efa9-43dd-9fd4-56f617de9b74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dc00b34dfd0d80ded26bfcc542e84550c50fe8dfabb2eab1f8b39c0402930ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbeab9029a6d82f443afa458697942e9ce20465077f1adefc0123dfab2ea76a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22659d9c71fc60ad0348f1031cb52b72af9f93c69c11c04c140aece5e90efb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fce601ec5b7e54b14668b59d90721a107f392ed41a54f1a5b0446d48fbdb50f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fce601ec5b7e54b14668b59d90721a107f392ed41a54f1a5b0446d48fbdb50f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:41Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:41 crc kubenswrapper[4904]: I1205 20:12:41.831437 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aa73309acebd7dcba36d2990a8cf1f6ef1fbf78086a1455b158945063e1da3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b0deb0769ed0ae3221e161c0aab68192a5d5d1f511ccfaf8c6ba2f01334de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\
\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:41Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:41 crc kubenswrapper[4904]: I1205 20:12:41.843578 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0577f397a9f6d93347cbc4c8693307a1df70636365bb481d910f29aa70a3147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:41Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:41 crc kubenswrapper[4904]: I1205 20:12:41.855189 4904 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-67k68" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cf77c02028027dc6d07f225e1f2352cd58a2a96280315e23f43b50e7c719cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzqtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-67k68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:41Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:41 crc kubenswrapper[4904]: I1205 20:12:41.878663 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55fbdf03-712c-4abc-9847-225fe63052e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d91343958cca34483f9737ebe1072f132d381e7a1d980d9c718063ad95f470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://754660b165d5db2e99f7a7f53bb46148e13677fd0e632c71885bcc93f8c01864\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:12:13Z\\\",\\\"message\\\":\\\"r/kube-controller-manager-crc in node crc\\\\nI1205 20:12:13.100315 6323 services_controller.go:452] Built service openshift-service-ca-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1205 20:12:13.100853 6323 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1205 20:12:13.100859 6323 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1205 20:12:13.100847 6323 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-8v9t9 after 0 failed attempt(s)\\\\nI1205 20:12:13.100867 6323 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-8v9t9\\\\nI1205 20:12:13.100858 6323 services_controller.go:453] Built service openshift-service-ca-operator/metrics template LB for network=default: []services.LB{}\\\\nI1205 20:12:13.100879 6323 services_controller.go:454] Service openshift-service-ca-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1205 20:12:13.100887 6323 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59d91343958cca34483f9737ebe1072f132d381e7a1d980d9c718063ad95f470\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:12:34Z\\\",\\\"message\\\":\\\":{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} 
vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:12:33.904027 6587 ovnkube.go:599] Stopped ovnkube\\\\nI1205 20:12:33.904087 6587 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1205 20:12:33.904035 6587 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-operator/metrics]} name:Service_openshift-ingress-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1205 20:12:33.904171 6587 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6\\\",\\\"image\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dsvd6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:41Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:41 crc kubenswrapper[4904]: I1205 20:12:41.882329 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:41 crc kubenswrapper[4904]: I1205 20:12:41.882363 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:41 crc kubenswrapper[4904]: I1205 20:12:41.882376 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:41 crc kubenswrapper[4904]: I1205 20:12:41.882393 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:41 crc kubenswrapper[4904]: I1205 20:12:41.882404 4904 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:41Z","lastTransitionTime":"2025-12-05T20:12:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:41 crc kubenswrapper[4904]: I1205 20:12:41.894981 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89575271b3bd2ad2c040d9e6b00f221610af97e8dd8b120a2956b75f8bbdd237\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:41Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:41 crc kubenswrapper[4904]: I1205 20:12:41.907119 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:41Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:41 crc kubenswrapper[4904]: I1205 20:12:41.918836 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gfzvv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5e16893370ed8543ecf33c4b2373bda50a424810470178390a553f2755323d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6gph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gfzvv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:41Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:41 crc kubenswrapper[4904]: I1205 20:12:41.985473 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:41 crc kubenswrapper[4904]: I1205 20:12:41.985518 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:41 crc kubenswrapper[4904]: I1205 20:12:41.985533 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:41 crc kubenswrapper[4904]: I1205 20:12:41.985548 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:41 crc kubenswrapper[4904]: I1205 20:12:41.985559 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:41Z","lastTransitionTime":"2025-12-05T20:12:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.088783 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.088835 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.088850 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.088872 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.088890 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:42Z","lastTransitionTime":"2025-12-05T20:12:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.191789 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.192153 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.192165 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.192181 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.192192 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:42Z","lastTransitionTime":"2025-12-05T20:12:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.295453 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.295525 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.295534 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.295545 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.295556 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:42Z","lastTransitionTime":"2025-12-05T20:12:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.398848 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.398888 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.398899 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.398914 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.398926 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:42Z","lastTransitionTime":"2025-12-05T20:12:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.501007 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.501075 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.501121 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.501140 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.501151 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:42Z","lastTransitionTime":"2025-12-05T20:12:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.604467 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.604744 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.604815 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.604880 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.604935 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:42Z","lastTransitionTime":"2025-12-05T20:12:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.680870 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.680897 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.680958 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:42 crc kubenswrapper[4904]: E1205 20:12:42.681115 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:12:42 crc kubenswrapper[4904]: E1205 20:12:42.681367 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:12:42 crc kubenswrapper[4904]: E1205 20:12:42.681515 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.681233 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:12:42 crc kubenswrapper[4904]: E1205 20:12:42.681756 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91" Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.708104 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.708423 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.708598 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.708746 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.708880 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:42Z","lastTransitionTime":"2025-12-05T20:12:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.812248 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.812294 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.812306 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.812322 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.812334 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:42Z","lastTransitionTime":"2025-12-05T20:12:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.915918 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.916000 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.916022 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.916052 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:42 crc kubenswrapper[4904]: I1205 20:12:42.916112 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:42Z","lastTransitionTime":"2025-12-05T20:12:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.018685 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.018723 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.018734 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.018749 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.018762 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:43Z","lastTransitionTime":"2025-12-05T20:12:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.121163 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.121220 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.121235 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.121256 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.121271 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:43Z","lastTransitionTime":"2025-12-05T20:12:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.223542 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.223591 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.223602 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.223616 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.223627 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:43Z","lastTransitionTime":"2025-12-05T20:12:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.326349 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.326404 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.326414 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.326426 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.326435 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:43Z","lastTransitionTime":"2025-12-05T20:12:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.428814 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.428855 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.428866 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.428883 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.428896 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:43Z","lastTransitionTime":"2025-12-05T20:12:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.532085 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.532135 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.532149 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.532168 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.532185 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:43Z","lastTransitionTime":"2025-12-05T20:12:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.634644 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.634693 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.634705 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.634725 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.634740 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:43Z","lastTransitionTime":"2025-12-05T20:12:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.737672 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.737720 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.737729 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.737745 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.737756 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:43Z","lastTransitionTime":"2025-12-05T20:12:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.840684 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.840749 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.840769 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.840794 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.840811 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:43Z","lastTransitionTime":"2025-12-05T20:12:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.943948 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.944008 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.944020 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.944041 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:43 crc kubenswrapper[4904]: I1205 20:12:43.944077 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:43Z","lastTransitionTime":"2025-12-05T20:12:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.046637 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.046692 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.046706 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.046724 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.046736 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:44Z","lastTransitionTime":"2025-12-05T20:12:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.149450 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.149492 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.149503 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.149519 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.149531 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:44Z","lastTransitionTime":"2025-12-05T20:12:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.252547 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.252605 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.252614 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.252642 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.252653 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:44Z","lastTransitionTime":"2025-12-05T20:12:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.355835 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.355881 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.355893 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.355911 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.355923 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:44Z","lastTransitionTime":"2025-12-05T20:12:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.459544 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.459794 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.459803 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.459821 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.459831 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:44Z","lastTransitionTime":"2025-12-05T20:12:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.562711 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.562784 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.562798 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.562820 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.562831 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:44Z","lastTransitionTime":"2025-12-05T20:12:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.666192 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.666283 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.666304 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.666327 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.666377 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:44Z","lastTransitionTime":"2025-12-05T20:12:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.680552 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.680591 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.680631 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.680558 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:12:44 crc kubenswrapper[4904]: E1205 20:12:44.680726 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:12:44 crc kubenswrapper[4904]: E1205 20:12:44.680840 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:12:44 crc kubenswrapper[4904]: E1205 20:12:44.680945 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91" Dec 05 20:12:44 crc kubenswrapper[4904]: E1205 20:12:44.681033 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.768802 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.768849 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.768860 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.768875 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.768886 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:44Z","lastTransitionTime":"2025-12-05T20:12:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.871703 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.871756 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.871768 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.871785 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.871797 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:44Z","lastTransitionTime":"2025-12-05T20:12:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.974748 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.974790 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.974802 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.974816 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:44 crc kubenswrapper[4904]: I1205 20:12:44.974828 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:44Z","lastTransitionTime":"2025-12-05T20:12:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.078689 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.078740 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.078752 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.078770 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.078784 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:45Z","lastTransitionTime":"2025-12-05T20:12:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.181718 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.181791 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.181819 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.181851 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.181873 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:45Z","lastTransitionTime":"2025-12-05T20:12:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.284640 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.284699 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.284718 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.284742 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.284759 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:45Z","lastTransitionTime":"2025-12-05T20:12:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.387507 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.387552 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.387563 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.387580 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.387592 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:45Z","lastTransitionTime":"2025-12-05T20:12:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.490015 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.490120 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.490145 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.490175 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.490199 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:45Z","lastTransitionTime":"2025-12-05T20:12:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.552797 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.552825 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.552834 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.552846 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.552855 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:45Z","lastTransitionTime":"2025-12-05T20:12:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:45 crc kubenswrapper[4904]: E1205 20:12:45.566166 4904 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"54b04eb7-ddef-4ce3-9daa-6d051611390c\\\",\\\"systemUUID\\\":\\\"cfb4a8d1-bf0d-4f7a-95e4-63a42b6fb559\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.570543 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.570569 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.570576 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.570588 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.570597 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:45Z","lastTransitionTime":"2025-12-05T20:12:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:45 crc kubenswrapper[4904]: E1205 20:12:45.583037 4904 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"54b04eb7-ddef-4ce3-9daa-6d051611390c\\\",\\\"systemUUID\\\":\\\"cfb4a8d1-bf0d-4f7a-95e4-63a42b6fb559\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.586591 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.586617 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.586630 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.586646 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.586658 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:45Z","lastTransitionTime":"2025-12-05T20:12:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:45 crc kubenswrapper[4904]: E1205 20:12:45.599655 4904 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"54b04eb7-ddef-4ce3-9daa-6d051611390c\\\",\\\"systemUUID\\\":\\\"cfb4a8d1-bf0d-4f7a-95e4-63a42b6fb559\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.604585 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.604616 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.604626 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.604641 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.604652 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:45Z","lastTransitionTime":"2025-12-05T20:12:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:45 crc kubenswrapper[4904]: E1205 20:12:45.615697 4904 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"54b04eb7-ddef-4ce3-9daa-6d051611390c\\\",\\\"systemUUID\\\":\\\"cfb4a8d1-bf0d-4f7a-95e4-63a42b6fb559\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.619691 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.619725 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.619735 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.619760 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.619772 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:45Z","lastTransitionTime":"2025-12-05T20:12:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:45 crc kubenswrapper[4904]: E1205 20:12:45.651193 4904 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"54b04eb7-ddef-4ce3-9daa-6d051611390c\\\",\\\"systemUUID\\\":\\\"cfb4a8d1-bf0d-4f7a-95e4-63a42b6fb559\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:45 crc kubenswrapper[4904]: E1205 20:12:45.651344 4904 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.654357 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.654395 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.654409 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.655223 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.655248 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:45Z","lastTransitionTime":"2025-12-05T20:12:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.758446 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.758485 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.758498 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.758542 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.758558 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:45Z","lastTransitionTime":"2025-12-05T20:12:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.861089 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.861114 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.861125 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.861139 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.861149 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:45Z","lastTransitionTime":"2025-12-05T20:12:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.964624 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.964660 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.964677 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.964692 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:45 crc kubenswrapper[4904]: I1205 20:12:45.964704 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:45Z","lastTransitionTime":"2025-12-05T20:12:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.067432 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.067476 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.067485 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.067499 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.067509 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:46Z","lastTransitionTime":"2025-12-05T20:12:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.169307 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.169360 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.169369 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.169382 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.169391 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:46Z","lastTransitionTime":"2025-12-05T20:12:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.272508 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.272568 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.272584 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.272606 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.272622 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:46Z","lastTransitionTime":"2025-12-05T20:12:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.404988 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.405045 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.405069 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.405086 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.405095 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:46Z","lastTransitionTime":"2025-12-05T20:12:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.506980 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.507008 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.507016 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.507026 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.507034 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:46Z","lastTransitionTime":"2025-12-05T20:12:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.609252 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.609291 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.609312 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.609325 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.609333 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:46Z","lastTransitionTime":"2025-12-05T20:12:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.680977 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.681022 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.681008 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.681002 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:46 crc kubenswrapper[4904]: E1205 20:12:46.681128 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:12:46 crc kubenswrapper[4904]: E1205 20:12:46.681191 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:12:46 crc kubenswrapper[4904]: E1205 20:12:46.681247 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91" Dec 05 20:12:46 crc kubenswrapper[4904]: E1205 20:12:46.681320 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.711811 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.711846 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.711857 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.711873 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.711884 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:46Z","lastTransitionTime":"2025-12-05T20:12:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.814573 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.814612 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.814622 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.814637 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.814647 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:46Z","lastTransitionTime":"2025-12-05T20:12:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.917523 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.917564 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.917572 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.917588 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:46 crc kubenswrapper[4904]: I1205 20:12:46.917598 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:46Z","lastTransitionTime":"2025-12-05T20:12:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.019802 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.019844 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.019856 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.019873 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.019885 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:47Z","lastTransitionTime":"2025-12-05T20:12:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.122117 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.122149 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.122161 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.122175 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.122185 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:47Z","lastTransitionTime":"2025-12-05T20:12:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.224081 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.224118 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.224130 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.224147 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.224158 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:47Z","lastTransitionTime":"2025-12-05T20:12:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.326264 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.326303 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.326311 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.326324 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.326334 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:47Z","lastTransitionTime":"2025-12-05T20:12:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.426378 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91-metrics-certs\") pod \"network-metrics-daemon-d8xkk\" (UID: \"fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91\") " pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:12:47 crc kubenswrapper[4904]: E1205 20:12:47.426692 4904 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:12:47 crc kubenswrapper[4904]: E1205 20:12:47.427182 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91-metrics-certs podName:fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91 nodeName:}" failed. No retries permitted until 2025-12-05 20:13:19.427163242 +0000 UTC m=+98.238379351 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91-metrics-certs") pod "network-metrics-daemon-d8xkk" (UID: "fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.429452 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.429487 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.429497 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.429512 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.429522 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:47Z","lastTransitionTime":"2025-12-05T20:12:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.532932 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.532996 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.533013 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.533036 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.533087 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:47Z","lastTransitionTime":"2025-12-05T20:12:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.635939 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.635982 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.635992 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.636007 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.636016 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:47Z","lastTransitionTime":"2025-12-05T20:12:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.739268 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.739317 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.739329 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.739345 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.739357 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:47Z","lastTransitionTime":"2025-12-05T20:12:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.842183 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.842225 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.842236 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.842253 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.842268 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:47Z","lastTransitionTime":"2025-12-05T20:12:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.944556 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.944618 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.944639 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.944665 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:47 crc kubenswrapper[4904]: I1205 20:12:47.944685 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:47Z","lastTransitionTime":"2025-12-05T20:12:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.047787 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.047838 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.047854 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.047875 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.047892 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:48Z","lastTransitionTime":"2025-12-05T20:12:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.150632 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.150715 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.150737 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.150771 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.150797 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:48Z","lastTransitionTime":"2025-12-05T20:12:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.252761 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.252803 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.252813 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.252828 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.252837 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:48Z","lastTransitionTime":"2025-12-05T20:12:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.355569 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.355695 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.355714 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.355737 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.355755 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:48Z","lastTransitionTime":"2025-12-05T20:12:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.458130 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.458189 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.458207 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.458233 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.458251 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:48Z","lastTransitionTime":"2025-12-05T20:12:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.561321 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.561370 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.561384 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.561404 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.561419 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:48Z","lastTransitionTime":"2025-12-05T20:12:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.663384 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.663422 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.663432 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.663444 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.663453 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:48Z","lastTransitionTime":"2025-12-05T20:12:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.680588 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.680650 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.680655 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:12:48 crc kubenswrapper[4904]: E1205 20:12:48.680704 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.680751 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:48 crc kubenswrapper[4904]: E1205 20:12:48.680808 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91" Dec 05 20:12:48 crc kubenswrapper[4904]: E1205 20:12:48.680921 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:12:48 crc kubenswrapper[4904]: E1205 20:12:48.681025 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.766140 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.766199 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.766217 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.766240 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.766258 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:48Z","lastTransitionTime":"2025-12-05T20:12:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.868983 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.869042 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.869074 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.869091 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.869101 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:48Z","lastTransitionTime":"2025-12-05T20:12:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.972438 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.972490 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.972502 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.972523 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:48 crc kubenswrapper[4904]: I1205 20:12:48.972536 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:48Z","lastTransitionTime":"2025-12-05T20:12:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.075547 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.075580 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.075591 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.075607 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.075617 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:49Z","lastTransitionTime":"2025-12-05T20:12:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.171313 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gfzvv_5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea/kube-multus/0.log" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.171383 4904 generic.go:334] "Generic (PLEG): container finished" podID="5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea" containerID="6e5e16893370ed8543ecf33c4b2373bda50a424810470178390a553f2755323d" exitCode=1 Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.171432 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gfzvv" event={"ID":"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea","Type":"ContainerDied","Data":"6e5e16893370ed8543ecf33c4b2373bda50a424810470178390a553f2755323d"} Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.171984 4904 scope.go:117] "RemoveContainer" containerID="6e5e16893370ed8543ecf33c4b2373bda50a424810470178390a553f2755323d" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.184470 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.184543 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.184554 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.184576 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.184594 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:49Z","lastTransitionTime":"2025-12-05T20:12:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.189125 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:49Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.208212 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gfzvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5e16893370ed8543ecf33c4b2373bda50a424810470178390a553f2755323d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e5e16893370ed8543ecf33c4b2373bda50a424810470178390a553f2755323d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:12:48Z\\\",\\\"message\\\":\\\"2025-12-05T20:12:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8ac1f239-7e8e-4064-b3f6-179ec832c336\\\\n2025-12-05T20:12:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8ac1f239-7e8e-4064-b3f6-179ec832c336 to /host/opt/cni/bin/\\\\n2025-12-05T20:12:03Z [verbose] multus-daemon started\\\\n2025-12-05T20:12:03Z [verbose] Readiness Indicator file check\\\\n2025-12-05T20:12:48Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the 
condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6gph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gfzvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:49Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.230439 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-52vmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://140e572b1d15ec9aec68ea9d5b436e7ec2a1e886ed70b12c5ed011a12aa7646a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-52vmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:49Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.243684 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d8xkk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7fjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7fjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d8xkk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:49Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.258580 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:49Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.273337 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8v9t9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78ec080e-d24e-458b-8622-465dd74773a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c927c7ee8c52a2f1bcca785070514c72b8eafa9cd7d4f58d77fd80f1dee6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj68j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8v9t9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:49Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.288124 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.288151 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.288159 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.288171 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.288182 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:49Z","lastTransitionTime":"2025-12-05T20:12:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.289826 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cc24b64-e25f-4b55-9123-295388685e7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa5a898deb51018e4a1196537853e1d8fda8121a2b87d6d3ebb4ab04ad6a0507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08368507f1a5c9ee1b93c9a2b111939cbe36a99f87758d0de78b146309871438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffd2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:49Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.305918 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ba4034f-d40d-4c5d-a52b-c9895a8a7f0b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://073f99fe786948da31c584468c3562984559623b590ba77c7925b79a1bb6d973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fb5cbfee6c3da72b94b14ed486036e65308444c961b583e691cc89e604a11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec935cce3c02816ac1f6162ca3a1701cd06dd930d3024ff958a4d92cf1dc0821\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736f67bbeab4ebc7e144d6d4af34487d18c3c8362425e45a9e0969b61d896666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:49Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.325317 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:49Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.337528 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t96qs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5f4b5ef-d8d0-4591-a19c-a8347f74e833\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6624b0a035b3190da055f2ec82cb45b59268d035933912d9b87a20fd4c1cf128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9wkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b9a9e4321317cf95ec97bae86da81152ecba3d49e499d83e113d89e8b997247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2
099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9wkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t96qs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:49Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.352384 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf1f4c63-efa9-43dd-9fd4-56f617de9b74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dc00b34dfd0d80ded26bfcc542e84550c50fe8dfabb2eab1f8b39c0402930ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbeab9029a6d82f443afa458697942e9ce20465077f1adefc0123dfab2ea76a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5
c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22659d9c71fc60ad0348f1031cb52b72af9f93c69c11c04c140aece5e90efb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fce601ec5b7e54b14668b59d90721a107f392ed41a54f1a5b0446d48fbdb50f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fce601ec5b7e54b14668b59d90721a107f392ed41a54f1a5b0446d48fbdb50f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:49Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.368039 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"519e4f29-5762-4a86-9f31-eae681598fd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a972523cbe1d769fb13c365f030eab38907c5f877d667e034c62c5f3a0e43316\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"(now=2025-12-05 20:12:01.575139975 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575358 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764965521\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764965521\\\\\\\\\\\\\\\" (2025-12-05 19:12:01 +0000 UTC to 2026-12-05 19:12:01 +0000 UTC (now=2025-12-05 20:12:01.57533306 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575383 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 20:12:01.575413 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 20:12:01.575451 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575485 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575526 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-341598907/tls.crt::/tmp/serving-cert-341598907/tls.key\\\\\\\"\\\\nI1205 20:12:01.575666 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1205 20:12:01.575882 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 20:12:01.575895 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 20:12:01.576170 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 20:12:01.576169 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 20:12:01.576203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 20:12:01.576188 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:49Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.380424 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-67k68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cf77c02028027dc6d07f225e1f2352cd58a2a96280315e23f43b50e7c719cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzqtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-67k68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:49Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.390210 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.390235 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.390242 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.390254 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.390265 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:49Z","lastTransitionTime":"2025-12-05T20:12:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.400155 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55fbdf03-712c-4abc-9847-225fe63052e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servic
eaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d91343958cca34483f9737ebe1072f132d381e7a1d980d9c718063ad95f470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://754660b165d5db2e99f7a7f53bb46148e13677fd0e632c71885bcc93f8c01864\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:12:13Z\\\",\\\"message\\\":\\\"r/kube-controller-manager-crc in node crc\\\\nI1205 20:12:13.100315 6323 services_controller.go:452] Built service openshift-service-ca-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1205 20:12:13.100853 6323 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1205 20:12:13.100859 6323 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1205 20:12:13.100847 6323 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-8v9t9 after 0 failed attempt(s)\\\\nI1205 20:12:13.100867 6323 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-8v9t9\\\\nI1205 20:12:13.100858 6323 services_controller.go:453] Built service openshift-service-ca-operator/metrics template LB for network=default: []services.LB{}\\\\nI1205 20:12:13.100879 6323 services_controller.go:454] Service openshift-service-ca-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1205 20:12:13.100887 6323 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59d91343958cca34483f9737ebe1072f132d381e7a1d980d9c718063ad95f470\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:12:34Z\\\",\\\"message\\\":\\\":{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:12:33.904027 6587 ovnkube.go:599] Stopped ovnkube\\\\nI1205 20:12:33.904087 6587 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1205 20:12:33.904035 6587 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-operator/metrics]} name:Service_openshift-ingress-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1205 20:12:33.904171 6587 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dsvd6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:49Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.416487 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89575271b3bd2ad2c040d9e6b00f221610af97e8dd8b120a2956b75f8bbdd237\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:49Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.427523 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aa73309acebd7dcba36d2990a8cf1f6ef1fbf78086a1455b158945063e1da3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b0deb0769ed0ae3221e161c0aab68192a5d5d1f511ccfaf8c6ba2f01334de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:49Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 
20:12:49.443135 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0577f397a9f6d93347cbc4c8693307a1df70636365bb481d910f29aa70a3147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:49Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.492686 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.492716 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.492728 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.492741 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.492752 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:49Z","lastTransitionTime":"2025-12-05T20:12:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.595862 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.595897 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.595908 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.595924 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.595934 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:49Z","lastTransitionTime":"2025-12-05T20:12:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.681575 4904 scope.go:117] "RemoveContainer" containerID="59d91343958cca34483f9737ebe1072f132d381e7a1d980d9c718063ad95f470" Dec 05 20:12:49 crc kubenswrapper[4904]: E1205 20:12:49.681797 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-dsvd6_openshift-ovn-kubernetes(55fbdf03-712c-4abc-9847-225fe63052e3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.698633 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.698681 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.698691 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.698708 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.698719 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:49Z","lastTransitionTime":"2025-12-05T20:12:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.701163 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:49Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.715973 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8v9t9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78ec080e-d24e-458b-8622-465dd74773a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c927c7ee8c52a2f1bcca785070514c72b8eafa9cd7d4f58d77fd80f1dee6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj68j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8v9t9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:49Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.727550 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cc24b64-e25f-4b55-9123-295388685e7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa5a898deb51018e4a1196537853e1d8fda8121a2b87d6d3ebb4ab04ad6a0507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08368507f1a5c9ee1b93c9a2b111939cbe36a99f87758d0de78b146309871438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffd2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:49Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.748789 4904 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-52vmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://140e572b1d15ec9aec68ea9d5b436e7ec2a1e886ed70b12c5ed011a12aa7646a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-52vmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:49Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.759196 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d8xkk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7fjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7fjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d8xkk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:49Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.772379 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t96qs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5f4b5ef-d8d0-4591-a19c-a8347f74e833\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6624b0a035b3190da055f2ec82cb45b59268d035933912d9b87a20fd4c1cf128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9wkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b9a9e4321317cf95ec97bae86da81152ecba3d49e499d83e113d89e8b997247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9wkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t96qs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:49Z is after 2025-08-24T17:21:41Z" Dec 05 
20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.786392 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf1f4c63-efa9-43dd-9fd4-56f617de9b74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dc00b34dfd0d80ded26bfcc542e84550c50fe8dfabb2eab1f8b39c0402930ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbeab9029a6d82f443afa458697942e9ce20465077f1adefc0123dfab2ea76a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22659d9c71fc60ad0348f1031cb52b72af9f93c69c11c04c140aece5e90efb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fce601ec5b7e54b14668b59d90721a107f392ed41a54f1a5b0446d48fbdb50f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fce601ec5b7e54b14668b59d90721a107f392ed41a54f1a5b0446d48fbdb50f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:49Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.801224 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.801524 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.801532 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.801547 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.801557 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:49Z","lastTransitionTime":"2025-12-05T20:12:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.804860 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"519e4f29-5762-4a86-9f31-eae681598fd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a972523cbe1d769fb13c365f030eab38907c5f877d667e034c62c5f3a0e43316\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"(now=2025-12-05 20:12:01.575139975 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575358 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764965521\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764965521\\\\\\\\\\\\\\\" (2025-12-05 19:12:01 +0000 UTC to 2026-12-05 19:12:01 +0000 UTC (now=2025-12-05 20:12:01.57533306 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575383 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 20:12:01.575413 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 20:12:01.575451 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575485 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575526 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-341598907/tls.crt::/tmp/serving-cert-341598907/tls.key\\\\\\\"\\\\nI1205 20:12:01.575666 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1205 20:12:01.575882 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 20:12:01.575895 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 20:12:01.576170 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 20:12:01.576169 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 20:12:01.576203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 20:12:01.576188 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:49Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.818812 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ba4034f-d40d-4c5d-a52b-c9895a8a7f0b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://073f99fe786948da31c584468c3562984559623b590ba77c7925b79a1bb6d973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fb5cbfee6c3da72b94b14ed486036e65308444c961b583e691cc89e604a11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec935cce3c02816ac1f6162ca3a1701cd06dd930d3024ff958a4d92cf1dc0821\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736f67bbeab4ebc7e144d6d4af34487d18c3c8362425e45a9e0969b61d896666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:49Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.836342 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:49Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.854905 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89575271b3bd2ad2c040d9e6b00f221610af97e8dd8b120a2956b75f8bbdd237\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:49Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.868536 4904 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aa73309acebd7dcba36d2990a8cf1f6ef1fbf78086a1455b158945063e1da3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b0deb0769ed0ae3221e161c0aab68192a5d5d1f511ccfaf8c6ba2f01334de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:49Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.880342 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0577f397a9f6d93347cbc4c8693307a1df70636365bb481d910f29aa70a3147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:49Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.895402 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-67k68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cf77c02028027dc6d07f225e1f2352cd58a2a96280315e23f43b50e7c719cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzqtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-67k68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:49Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.904148 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.904187 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.904199 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.904215 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.904226 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:49Z","lastTransitionTime":"2025-12-05T20:12:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.923689 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55fbdf03-712c-4abc-9847-225fe63052e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servic
eaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d91343958cca34483f9737ebe1072f132d381e7a1d980d9c718063ad95f470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59d91343958cca34483f9737ebe1072f132d381e7a1d980d9c718063ad95f470\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:12:34Z\\\",\\\"message\\\":\\\":{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:12:33.904027 6587 ovnkube.go:599] Stopped ovnkube\\\\nI1205 20:12:33.904087 6587 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1205 20:12:33.904035 6587 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-operator/metrics]} name:Service_openshift-ingress-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1205 20:12:33.904171 6587 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-dsvd6_openshift-ovn-kubernetes(55fbdf03-712c-4abc-9847-225fe63052e3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recurs
iveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dsvd6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:49Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.939784 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:49Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:49 crc kubenswrapper[4904]: I1205 20:12:49.954265 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gfzvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5e16893370ed8543ecf33c4b2373bda50a424810470178390a553f2755323d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e5e16893370ed8543ecf33c4b2373bda50a424810470178390a553f2755323d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:12:48Z\\\",\\\"message\\\":\\\"2025-12-05T20:12:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8ac1f239-7e8e-4064-b3f6-179ec832c336\\\\n2025-12-05T20:12:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8ac1f239-7e8e-4064-b3f6-179ec832c336 to /host/opt/cni/bin/\\\\n2025-12-05T20:12:03Z [verbose] multus-daemon started\\\\n2025-12-05T20:12:03Z [verbose] Readiness Indicator file check\\\\n2025-12-05T20:12:48Z [error] have you checked that your 
default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6gph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gfzvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:49Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.006074 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.006110 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.006120 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.006134 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.006142 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:50Z","lastTransitionTime":"2025-12-05T20:12:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.107552 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.107584 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.107591 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.107604 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.107613 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:50Z","lastTransitionTime":"2025-12-05T20:12:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.177199 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gfzvv_5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea/kube-multus/0.log" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.177281 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gfzvv" event={"ID":"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea","Type":"ContainerStarted","Data":"7010629bd83e0fb435b59dd47808f31c7694e158d77048ed22a54668a8b2712d"} Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.193862 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.209606 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.209673 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.209686 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.209702 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.209714 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:50Z","lastTransitionTime":"2025-12-05T20:12:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.209755 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gfzvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7010629bd83e0fb435b59dd47808f31c7694e158d77048ed22a54668a8b2712d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e5e16893370ed8543ecf33c4b2373bda50a424810470178390a553f2755323d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:12:48Z\\\",\\\"message\\\":\\\"2025-12-05T20:12:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8ac1f239-7e8e-4064-b3f6-179ec832c336\\\\n2025-12-05T20:12:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8ac1f239-7e8e-4064-b3f6-179ec832c336 to /host/opt/cni/bin/\\\\n2025-12-05T20:12:03Z [verbose] multus-daemon started\\\\n2025-12-05T20:12:03Z [verbose] Readiness Indicator file check\\\\n2025-12-05T20:12:48Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6gph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gfzvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.223882 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.236895 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8v9t9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78ec080e-d24e-458b-8622-465dd74773a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c927c7ee8c52a2f1bcca785070514c72b8eafa9cd7d4f58d77fd80f1dee6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj68j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\
\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8v9t9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.249189 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cc24b64-e25f-4b55-9123-295388685e7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa5a898deb51018e4a1196537853e1d8fda8121a2b87d6d3ebb4ab04ad6a0507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08368507f1a5c9ee1b93c9a2b111939cbe36a99f87758d0de78b146309871438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\"
:\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffd2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.266927 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-52vmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://140e572b1d15ec9aec68ea9d5b436e7ec2a1e886ed70b12c5ed011a12aa7646a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\
"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-52vmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.277528 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d8xkk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7fjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7fjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d8xkk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.292305 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t96qs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5f4b5ef-d8d0-4591-a19c-a8347f74e833\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6624b0a035b3190da055f2ec82cb45b59268d035933912d9b87a20fd4c1cf128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9wkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b9a9e4321317cf95ec97bae86da81152ecba3d49e499d83e113d89e8b997247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9wkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:
14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t96qs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.307104 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf1f4c63-efa9-43dd-9fd4-56f617de9b74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dc00b34dfd0d80ded26bfcc542e84550c50fe8dfabb2eab1f8b39c0402930ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbeab9029a6d82f443afa458697942e9ce20465077f1adefc0123dfab2ea76a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22659d9c71fc60ad0348f1031cb52b72af9f93c69c11c04c140aece5e90efb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-sche
duler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fce601ec5b7e54b14668b59d90721a107f392ed41a54f1a5b0446d48fbdb50f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fce601ec5b7e54b14668b59d90721a107f392ed41a54f1a5b0446d48fbdb50f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.311326 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.311394 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.311409 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.311430 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.311446 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:50Z","lastTransitionTime":"2025-12-05T20:12:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.322436 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"519e4f29-5762-4a86-9f31-eae681598fd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a972523cbe1d769fb13c365f030eab38907c5f877d667e034c62c5f3a0e43316\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"(now=2025-12-05 20:12:01.575139975 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575358 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764965521\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764965521\\\\\\\\\\\\\\\" (2025-12-05 19:12:01 +0000 UTC to 2026-12-05 19:12:01 +0000 UTC (now=2025-12-05 20:12:01.57533306 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575383 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 20:12:01.575413 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 20:12:01.575451 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575485 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575526 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-341598907/tls.crt::/tmp/serving-cert-341598907/tls.key\\\\\\\"\\\\nI1205 20:12:01.575666 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1205 20:12:01.575882 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 20:12:01.575895 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 20:12:01.576170 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 20:12:01.576169 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 20:12:01.576203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 20:12:01.576188 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.336678 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ba4034f-d40d-4c5d-a52b-c9895a8a7f0b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://073f99fe786948da31c584468c3562984559623b590ba77c7925b79a1bb6d973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fb5cbfee6c3da72b94b14ed486036e65308444c961b583e691cc89e604a11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec935cce3c02816ac1f6162ca3a1701cd06dd930d3024ff958a4d92cf1dc0821\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736f67bbeab4ebc7e144d6d4af34487d18c3c8362425e45a9e0969b61d896666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.348548 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.362021 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89575271b3bd2ad2c040d9e6b00f221610af97e8dd8b120a2956b75f8bbdd237\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.373260 4904 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aa73309acebd7dcba36d2990a8cf1f6ef1fbf78086a1455b158945063e1da3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b0deb0769ed0ae3221e161c0aab68192a5d5d1f511ccfaf8c6ba2f01334de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.385214 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0577f397a9f6d93347cbc4c8693307a1df70636365bb481d910f29aa70a3147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.395720 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-67k68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cf77c02028027dc6d07f225e1f2352cd58a2a96280315e23f43b50e7c719cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzqtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-67k68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.412489 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55fbdf03-712c-4abc-9847-225fe63052e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d91343958cca34483f9737ebe1072f132d381e7a1d980d9c718063ad95f470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59d91343958cca34483f9737ebe1072f132d381e7a1d980d9c718063ad95f470\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:12:34Z\\\",\\\"message\\\":\\\":{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:12:33.904027 6587 ovnkube.go:599] Stopped ovnkube\\\\nI1205 20:12:33.904087 6587 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1205 20:12:33.904035 6587 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-operator/metrics]} name:Service_openshift-ingress-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1205 20:12:33.904171 6587 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dsvd6_openshift-ovn-kubernetes(55fbdf03-712c-4abc-9847-225fe63052e3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dsvd6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.413476 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.413497 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.413505 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.413517 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.413525 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:50Z","lastTransitionTime":"2025-12-05T20:12:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.516004 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.516078 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.516097 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.516115 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.516141 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:50Z","lastTransitionTime":"2025-12-05T20:12:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.618935 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.618985 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.618997 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.619018 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.619031 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:50Z","lastTransitionTime":"2025-12-05T20:12:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.681194 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.681231 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.681259 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.681340 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:50 crc kubenswrapper[4904]: E1205 20:12:50.681426 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91" Dec 05 20:12:50 crc kubenswrapper[4904]: E1205 20:12:50.681554 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:12:50 crc kubenswrapper[4904]: E1205 20:12:50.681661 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:12:50 crc kubenswrapper[4904]: E1205 20:12:50.681710 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.721690 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.721729 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.721738 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.721751 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.721760 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:50Z","lastTransitionTime":"2025-12-05T20:12:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.824159 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.824212 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.824239 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.824263 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.824279 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:50Z","lastTransitionTime":"2025-12-05T20:12:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.926776 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.926848 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.926867 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.926892 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:50 crc kubenswrapper[4904]: I1205 20:12:50.926909 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:50Z","lastTransitionTime":"2025-12-05T20:12:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.029664 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.029707 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.029717 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.029732 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.029743 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:51Z","lastTransitionTime":"2025-12-05T20:12:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.132775 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.132846 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.132884 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.132921 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.132944 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:51Z","lastTransitionTime":"2025-12-05T20:12:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.235653 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.235708 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.235725 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.235748 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.235765 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:51Z","lastTransitionTime":"2025-12-05T20:12:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.338623 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.338673 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.338686 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.338703 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.338715 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:51Z","lastTransitionTime":"2025-12-05T20:12:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.441424 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.441485 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.441497 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.441516 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.441530 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:51Z","lastTransitionTime":"2025-12-05T20:12:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.543939 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.543993 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.544005 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.544041 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.544053 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:51Z","lastTransitionTime":"2025-12-05T20:12:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.646765 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.646791 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.646799 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.646811 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.646819 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:51Z","lastTransitionTime":"2025-12-05T20:12:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.699645 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ba4034f-d40d-4c5d-a52b-c9895a8a7f0b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://073f99fe786948da31c584468c3562984559623b590ba77c7925b79a1bb6d973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fb5cbfee6c3da72b94b14ed486036e65308444c961b583e691cc89e604a11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec935cce3c02816ac1f6162ca3a1701cd06dd930d3024ff958a4d92cf1dc0821\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736f67bbeab4ebc7e144d6d4af34487d18c3c8362425e45a9e0969b61d896666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.717443 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.733853 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t96qs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5f4b5ef-d8d0-4591-a19c-a8347f74e833\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6624b0a035b3190da055f2ec82cb45b59268d035933912d9b87a20fd4c1cf128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9wkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b9a9e4321317cf95ec97bae86da81152ecba3d49e499d83e113d89e8b997247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2
099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9wkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t96qs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.750973 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf1f4c63-efa9-43dd-9fd4-56f617de9b74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dc00b34dfd0d80ded26bfcc542e84550c50fe8dfabb2eab1f8b39c0402930ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbeab9029a6d82f443afa458697942e9ce20465077f1adefc0123dfab2ea76a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5
c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22659d9c71fc60ad0348f1031cb52b72af9f93c69c11c04c140aece5e90efb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fce601ec5b7e54b14668b59d90721a107f392ed41a54f1a5b0446d48fbdb50f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fce601ec5b7e54b14668b59d90721a107f392ed41a54f1a5b0446d48fbdb50f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.751080 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.751204 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.751217 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.751237 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.751250 4904 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:51Z","lastTransitionTime":"2025-12-05T20:12:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.768241 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"519e4f29-5762-4a86-9f31-eae681598fd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a972523cbe1d769fb13c365f030eab38907c5f877d667e034c62c5f3a0e43316\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"(now=2025-12-05 20:12:01.575139975 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575358 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764965521\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764965521\\\\\\\\\\\\\\\" (2025-12-05 19:12:01 +0000 UTC to 2026-12-05 19:12:01 +0000 UTC (now=2025-12-05 20:12:01.57533306 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575383 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 20:12:01.575413 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 20:12:01.575451 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575485 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575526 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-341598907/tls.crt::/tmp/serving-cert-341598907/tls.key\\\\\\\"\\\\nI1205 20:12:01.575666 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1205 20:12:01.575882 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 20:12:01.575895 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 20:12:01.576170 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 20:12:01.576169 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 20:12:01.576203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 20:12:01.576188 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.782491 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-67k68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cf77c02028027dc6d07f225e1f2352cd58a2a96280315e23f43b50e7c719cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzqtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-67k68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.811955 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55fbdf03-712c-4abc-9847-225fe63052e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d91343958cca34483f9737ebe1072f132d381e7a1d980d9c718063ad95f470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59d91343958cca34483f9737ebe1072f132d381e7a1d980d9c718063ad95f470\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:12:34Z\\\",\\\"message\\\":\\\":{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:12:33.904027 6587 ovnkube.go:599] Stopped ovnkube\\\\nI1205 20:12:33.904087 6587 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1205 20:12:33.904035 6587 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-operator/metrics]} name:Service_openshift-ingress-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1205 20:12:33.904171 6587 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dsvd6_openshift-ovn-kubernetes(55fbdf03-712c-4abc-9847-225fe63052e3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dsvd6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.832454 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89575271b3bd2ad2c040d9e6b00f221610af97e8dd8b120a2956b75f8bbdd237\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.845280 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aa73309acebd7dcba36d2990a8cf1f6ef1fbf78086a1455b158945063e1da3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b0deb0769ed0ae3221e161c0aab68192a5d5d1f511ccfaf8c6ba2f01334de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:51Z is after 
2025-08-24T17:21:41Z" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.853769 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.853805 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.853816 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.853833 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.853842 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:51Z","lastTransitionTime":"2025-12-05T20:12:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.860259 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0577f397a9f6d93347cbc4c8693307a1df70636365bb481d910f29aa70a3147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 
20:12:51.874745 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.889266 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gfzvv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7010629bd83e0fb435b59dd47808f31c7694e158d77048ed22a54668a8b2712d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e5e16893370ed8543ecf33c4b2373bda50a424810470178390a553f2755323d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:12:48Z\\\",\\\"message\\\":\\\"2025-12-05T20:12:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8ac1f239-7e8e-4064-b3f6-179ec832c336\\\\n2025-12-05T20:12:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8ac1f239-7e8e-4064-b3f6-179ec832c336 to /host/opt/cni/bin/\\\\n2025-12-05T20:12:03Z [verbose] multus-daemon started\\\\n2025-12-05T20:12:03Z [verbose] Readiness Indicator file check\\\\n2025-12-05T20:12:48Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6gph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gfzvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.908083 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-52vmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://140e572b1d15ec9aec68ea9d5b436e7ec2a1e886ed70b12c5ed011a12aa7646a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-52vmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.923487 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d8xkk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7fjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7fjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d8xkk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.938269 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.949018 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8v9t9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78ec080e-d24e-458b-8622-465dd74773a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c927c7ee8c52a2f1bcca785070514c72b8eafa9cd7d4f58d77fd80f1dee6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj68j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8v9t9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.955751 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.955779 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.955787 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.955800 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.955810 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:51Z","lastTransitionTime":"2025-12-05T20:12:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:51 crc kubenswrapper[4904]: I1205 20:12:51.970764 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cc24b64-e25f-4b55-9123-295388685e7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa5a898deb51018e4a1196537853e1d8fda8121a2b87d6d3ebb4ab04ad6a0507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08368507f1a5c9ee1b93c9a2b111939cbe36a99f87758d0de78b146309871438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffd2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.058432 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.058491 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.058505 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.058551 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.058561 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:52Z","lastTransitionTime":"2025-12-05T20:12:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.162006 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.162048 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.162080 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.162098 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.162110 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:52Z","lastTransitionTime":"2025-12-05T20:12:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.264493 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.264555 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.264567 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.264587 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.264599 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:52Z","lastTransitionTime":"2025-12-05T20:12:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.366739 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.366800 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.366817 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.366878 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.366897 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:52Z","lastTransitionTime":"2025-12-05T20:12:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.469718 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.469799 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.469816 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.469843 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.469862 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:52Z","lastTransitionTime":"2025-12-05T20:12:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.571696 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.572496 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.572647 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.572794 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.572910 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:52Z","lastTransitionTime":"2025-12-05T20:12:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.675292 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.675702 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.675994 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.676223 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.676370 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:52Z","lastTransitionTime":"2025-12-05T20:12:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.680489 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.680640 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.680590 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.680574 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:12:52 crc kubenswrapper[4904]: E1205 20:12:52.681020 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:12:52 crc kubenswrapper[4904]: E1205 20:12:52.680916 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:12:52 crc kubenswrapper[4904]: E1205 20:12:52.680766 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:12:52 crc kubenswrapper[4904]: E1205 20:12:52.681459 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91" Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.780221 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.780330 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.780355 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.780385 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.780409 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:52Z","lastTransitionTime":"2025-12-05T20:12:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.882917 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.882958 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.882969 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.882983 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.882993 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:52Z","lastTransitionTime":"2025-12-05T20:12:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.985881 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.985930 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.985944 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.985961 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:52 crc kubenswrapper[4904]: I1205 20:12:52.985973 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:52Z","lastTransitionTime":"2025-12-05T20:12:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:53 crc kubenswrapper[4904]: I1205 20:12:53.088712 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:53 crc kubenswrapper[4904]: I1205 20:12:53.088760 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:53 crc kubenswrapper[4904]: I1205 20:12:53.088771 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:53 crc kubenswrapper[4904]: I1205 20:12:53.088789 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:53 crc kubenswrapper[4904]: I1205 20:12:53.088802 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:53Z","lastTransitionTime":"2025-12-05T20:12:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:53 crc kubenswrapper[4904]: I1205 20:12:53.190499 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:53 crc kubenswrapper[4904]: I1205 20:12:53.190577 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:53 crc kubenswrapper[4904]: I1205 20:12:53.190592 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:53 crc kubenswrapper[4904]: I1205 20:12:53.190611 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:53 crc kubenswrapper[4904]: I1205 20:12:53.190624 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:53Z","lastTransitionTime":"2025-12-05T20:12:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:53 crc kubenswrapper[4904]: I1205 20:12:53.293096 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:53 crc kubenswrapper[4904]: I1205 20:12:53.293151 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:53 crc kubenswrapper[4904]: I1205 20:12:53.293165 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:53 crc kubenswrapper[4904]: I1205 20:12:53.293186 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:53 crc kubenswrapper[4904]: I1205 20:12:53.293204 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:53Z","lastTransitionTime":"2025-12-05T20:12:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:53 crc kubenswrapper[4904]: I1205 20:12:53.395558 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:53 crc kubenswrapper[4904]: I1205 20:12:53.395612 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:53 crc kubenswrapper[4904]: I1205 20:12:53.395624 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:53 crc kubenswrapper[4904]: I1205 20:12:53.395642 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:53 crc kubenswrapper[4904]: I1205 20:12:53.395655 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:53Z","lastTransitionTime":"2025-12-05T20:12:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:53 crc kubenswrapper[4904]: I1205 20:12:53.498562 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:53 crc kubenswrapper[4904]: I1205 20:12:53.498640 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:53 crc kubenswrapper[4904]: I1205 20:12:53.498654 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:53 crc kubenswrapper[4904]: I1205 20:12:53.498676 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:53 crc kubenswrapper[4904]: I1205 20:12:53.498698 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:53Z","lastTransitionTime":"2025-12-05T20:12:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:53 crc kubenswrapper[4904]: I1205 20:12:53.601743 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:53 crc kubenswrapper[4904]: I1205 20:12:53.601797 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:53 crc kubenswrapper[4904]: I1205 20:12:53.601810 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:53 crc kubenswrapper[4904]: I1205 20:12:53.601825 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:53 crc kubenswrapper[4904]: I1205 20:12:53.601837 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:53Z","lastTransitionTime":"2025-12-05T20:12:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:53 crc kubenswrapper[4904]: I1205 20:12:53.704190 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:53 crc kubenswrapper[4904]: I1205 20:12:53.704243 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:53 crc kubenswrapper[4904]: I1205 20:12:53.704252 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:53 crc kubenswrapper[4904]: I1205 20:12:53.704270 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:53 crc kubenswrapper[4904]: I1205 20:12:53.704284 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:53Z","lastTransitionTime":"2025-12-05T20:12:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:53 crc kubenswrapper[4904]: I1205 20:12:53.807209 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:53 crc kubenswrapper[4904]: I1205 20:12:53.807283 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:53 crc kubenswrapper[4904]: I1205 20:12:53.807296 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:53 crc kubenswrapper[4904]: I1205 20:12:53.807320 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:53 crc kubenswrapper[4904]: I1205 20:12:53.807336 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:53Z","lastTransitionTime":"2025-12-05T20:12:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:53 crc kubenswrapper[4904]: I1205 20:12:53.910184 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:53 crc kubenswrapper[4904]: I1205 20:12:53.910238 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:53 crc kubenswrapper[4904]: I1205 20:12:53.910256 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:53 crc kubenswrapper[4904]: I1205 20:12:53.910279 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:53 crc kubenswrapper[4904]: I1205 20:12:53.910299 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:53Z","lastTransitionTime":"2025-12-05T20:12:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.013121 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.013163 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.013179 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.013200 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.013215 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:54Z","lastTransitionTime":"2025-12-05T20:12:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.115902 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.115947 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.115957 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.115973 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.115985 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:54Z","lastTransitionTime":"2025-12-05T20:12:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.218531 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.218589 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.218600 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.218624 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.218637 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:54Z","lastTransitionTime":"2025-12-05T20:12:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.320353 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.320387 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.320396 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.320408 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.320417 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:54Z","lastTransitionTime":"2025-12-05T20:12:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.423121 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.423207 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.423230 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.423261 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.423285 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:54Z","lastTransitionTime":"2025-12-05T20:12:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.525466 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.525515 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.525537 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.525559 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.525573 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:54Z","lastTransitionTime":"2025-12-05T20:12:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.628439 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.628514 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.628527 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.628544 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.628558 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:54Z","lastTransitionTime":"2025-12-05T20:12:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.680895 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.680947 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.681102 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.681091 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:54 crc kubenswrapper[4904]: E1205 20:12:54.681122 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:12:54 crc kubenswrapper[4904]: E1205 20:12:54.681236 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91" Dec 05 20:12:54 crc kubenswrapper[4904]: E1205 20:12:54.681371 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:12:54 crc kubenswrapper[4904]: E1205 20:12:54.681461 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.731224 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.731281 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.731293 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.731312 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.731325 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:54Z","lastTransitionTime":"2025-12-05T20:12:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.833457 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.833547 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.833566 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.833591 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.833611 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:54Z","lastTransitionTime":"2025-12-05T20:12:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.935723 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.935816 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.935827 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.935846 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:12:54 crc kubenswrapper[4904]: I1205 20:12:54.935859 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:54Z","lastTransitionTime":"2025-12-05T20:12:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.039717 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.039776 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.039790 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.039814 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.039829 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:55Z","lastTransitionTime":"2025-12-05T20:12:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.143124 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.143184 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.143201 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.143226 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.143243 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:55Z","lastTransitionTime":"2025-12-05T20:12:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.246463 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.246565 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.246596 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.246634 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.246660 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:55Z","lastTransitionTime":"2025-12-05T20:12:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.348637 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.348700 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.348721 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.348744 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.348760 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:55Z","lastTransitionTime":"2025-12-05T20:12:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.451553 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.451611 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.451624 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.451644 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.451662 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:55Z","lastTransitionTime":"2025-12-05T20:12:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.554004 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.554046 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.554085 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.554110 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.554122 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:55Z","lastTransitionTime":"2025-12-05T20:12:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.656331 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.656383 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.656393 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.656413 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.656425 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:55Z","lastTransitionTime":"2025-12-05T20:12:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.760008 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.760119 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.760134 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.760158 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.760172 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:55Z","lastTransitionTime":"2025-12-05T20:12:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.772939 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.773001 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.773015 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.773035 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.773049 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:55Z","lastTransitionTime":"2025-12-05T20:12:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:12:55 crc kubenswrapper[4904]: E1205 20:12:55.797924 4904 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"54b04eb7-ddef-4ce3-9daa-6d051611390c\\\",\\\"systemUUID\\\":\\\"cfb4a8d1-bf0d-4f7a-95e4-63a42b6fb559\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.803149 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.803187 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.803199 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.803214 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.803225 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:55Z","lastTransitionTime":"2025-12-05T20:12:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:55 crc kubenswrapper[4904]: E1205 20:12:55.820096 4904 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"54b04eb7-ddef-4ce3-9daa-6d051611390c\\\",\\\"systemUUID\\\":\\\"cfb4a8d1-bf0d-4f7a-95e4-63a42b6fb559\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.824093 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.824186 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.824208 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.824234 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.824253 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:55Z","lastTransitionTime":"2025-12-05T20:12:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:55 crc kubenswrapper[4904]: E1205 20:12:55.842710 4904 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"54b04eb7-ddef-4ce3-9daa-6d051611390c\\\",\\\"systemUUID\\\":\\\"cfb4a8d1-bf0d-4f7a-95e4-63a42b6fb559\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.847467 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.847607 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.847637 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.847662 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.847680 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:55Z","lastTransitionTime":"2025-12-05T20:12:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:55 crc kubenswrapper[4904]: E1205 20:12:55.864951 4904 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"54b04eb7-ddef-4ce3-9daa-6d051611390c\\\",\\\"systemUUID\\\":\\\"cfb4a8d1-bf0d-4f7a-95e4-63a42b6fb559\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.870417 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.870470 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.870489 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.870513 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.870529 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:55Z","lastTransitionTime":"2025-12-05T20:12:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:55 crc kubenswrapper[4904]: E1205 20:12:55.890818 4904 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"54b04eb7-ddef-4ce3-9daa-6d051611390c\\\",\\\"systemUUID\\\":\\\"cfb4a8d1-bf0d-4f7a-95e4-63a42b6fb559\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:55 crc kubenswrapper[4904]: E1205 20:12:55.891039 4904 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.892932 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.892995 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.893020 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.893049 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.893102 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:55Z","lastTransitionTime":"2025-12-05T20:12:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.996222 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.996308 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.996335 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.996364 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:55 crc kubenswrapper[4904]: I1205 20:12:55.996430 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:55Z","lastTransitionTime":"2025-12-05T20:12:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.099097 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.099172 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.099187 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.099212 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.099228 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:56Z","lastTransitionTime":"2025-12-05T20:12:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.201960 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.202020 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.202037 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.202098 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.202117 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:56Z","lastTransitionTime":"2025-12-05T20:12:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.304910 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.304981 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.304998 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.305027 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.305045 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:56Z","lastTransitionTime":"2025-12-05T20:12:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.408189 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.408245 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.408260 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.408278 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.408290 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:56Z","lastTransitionTime":"2025-12-05T20:12:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.511039 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.511165 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.511190 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.511276 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.511330 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:56Z","lastTransitionTime":"2025-12-05T20:12:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.614264 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.614371 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.614389 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.614412 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.614428 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:56Z","lastTransitionTime":"2025-12-05T20:12:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.680253 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.680319 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.680277 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:56 crc kubenswrapper[4904]: E1205 20:12:56.680411 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91" Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.680251 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:56 crc kubenswrapper[4904]: E1205 20:12:56.680584 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:12:56 crc kubenswrapper[4904]: E1205 20:12:56.680605 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:12:56 crc kubenswrapper[4904]: E1205 20:12:56.680833 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.716931 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.716987 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.716999 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.717020 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.717034 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:56Z","lastTransitionTime":"2025-12-05T20:12:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.819781 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.819823 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.819835 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.819853 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.819866 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:56Z","lastTransitionTime":"2025-12-05T20:12:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.922644 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.922695 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.922713 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.922735 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:56 crc kubenswrapper[4904]: I1205 20:12:56.922751 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:56Z","lastTransitionTime":"2025-12-05T20:12:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.025976 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.026024 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.026036 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.026053 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.026081 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:57Z","lastTransitionTime":"2025-12-05T20:12:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.128310 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.128392 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.128422 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.128450 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.128471 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:57Z","lastTransitionTime":"2025-12-05T20:12:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.230851 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.230896 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.230906 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.230924 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.230934 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:57Z","lastTransitionTime":"2025-12-05T20:12:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.333422 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.333485 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.333496 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.333519 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.333532 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:57Z","lastTransitionTime":"2025-12-05T20:12:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.435645 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.435696 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.435708 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.435725 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.435737 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:57Z","lastTransitionTime":"2025-12-05T20:12:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.538441 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.538491 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.538507 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.538529 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.538546 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:57Z","lastTransitionTime":"2025-12-05T20:12:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.640743 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.640790 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.640810 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.640832 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.640848 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:57Z","lastTransitionTime":"2025-12-05T20:12:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.744211 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.745154 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.745170 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.745232 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.745245 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:57Z","lastTransitionTime":"2025-12-05T20:12:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.849118 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.849175 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.849190 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.849208 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.849222 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:57Z","lastTransitionTime":"2025-12-05T20:12:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.952364 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.952429 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.952447 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.952471 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:57 crc kubenswrapper[4904]: I1205 20:12:57.952489 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:57Z","lastTransitionTime":"2025-12-05T20:12:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.056193 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.056273 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.056298 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.056329 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.056353 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:58Z","lastTransitionTime":"2025-12-05T20:12:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.159207 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.159261 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.159278 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.159301 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.159319 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:58Z","lastTransitionTime":"2025-12-05T20:12:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.261800 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.261838 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.261847 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.261865 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.261876 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:58Z","lastTransitionTime":"2025-12-05T20:12:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.364411 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.364451 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.364460 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.364475 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.364485 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:58Z","lastTransitionTime":"2025-12-05T20:12:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.467627 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.467690 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.467710 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.467736 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.467754 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:58Z","lastTransitionTime":"2025-12-05T20:12:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.570590 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.570672 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.570696 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.570723 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.570749 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:58Z","lastTransitionTime":"2025-12-05T20:12:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.674281 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.674364 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.674389 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.674419 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.674441 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:58Z","lastTransitionTime":"2025-12-05T20:12:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.680778 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.680813 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.680886 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.680920 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:58 crc kubenswrapper[4904]: E1205 20:12:58.681129 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:12:58 crc kubenswrapper[4904]: E1205 20:12:58.681231 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91" Dec 05 20:12:58 crc kubenswrapper[4904]: E1205 20:12:58.681375 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:12:58 crc kubenswrapper[4904]: E1205 20:12:58.681495 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.777239 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.777301 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.777312 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.777328 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.777339 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:58Z","lastTransitionTime":"2025-12-05T20:12:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.879822 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.879920 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.879938 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.879961 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.880017 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:58Z","lastTransitionTime":"2025-12-05T20:12:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.983087 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.983147 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.983166 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.983187 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:58 crc kubenswrapper[4904]: I1205 20:12:58.983202 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:58Z","lastTransitionTime":"2025-12-05T20:12:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:59 crc kubenswrapper[4904]: I1205 20:12:59.086481 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:59 crc kubenswrapper[4904]: I1205 20:12:59.086560 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:59 crc kubenswrapper[4904]: I1205 20:12:59.086579 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:59 crc kubenswrapper[4904]: I1205 20:12:59.086608 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:59 crc kubenswrapper[4904]: I1205 20:12:59.086625 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:59Z","lastTransitionTime":"2025-12-05T20:12:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:59 crc kubenswrapper[4904]: I1205 20:12:59.189714 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:59 crc kubenswrapper[4904]: I1205 20:12:59.189815 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:59 crc kubenswrapper[4904]: I1205 20:12:59.189837 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:59 crc kubenswrapper[4904]: I1205 20:12:59.189856 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:59 crc kubenswrapper[4904]: I1205 20:12:59.189866 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:59Z","lastTransitionTime":"2025-12-05T20:12:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:59 crc kubenswrapper[4904]: I1205 20:12:59.291857 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:59 crc kubenswrapper[4904]: I1205 20:12:59.291899 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:59 crc kubenswrapper[4904]: I1205 20:12:59.291921 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:59 crc kubenswrapper[4904]: I1205 20:12:59.291941 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:59 crc kubenswrapper[4904]: I1205 20:12:59.291954 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:59Z","lastTransitionTime":"2025-12-05T20:12:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:59 crc kubenswrapper[4904]: I1205 20:12:59.395244 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:59 crc kubenswrapper[4904]: I1205 20:12:59.395315 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:59 crc kubenswrapper[4904]: I1205 20:12:59.395337 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:59 crc kubenswrapper[4904]: I1205 20:12:59.395366 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:59 crc kubenswrapper[4904]: I1205 20:12:59.395387 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:59Z","lastTransitionTime":"2025-12-05T20:12:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:59 crc kubenswrapper[4904]: I1205 20:12:59.497962 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:59 crc kubenswrapper[4904]: I1205 20:12:59.498031 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:59 crc kubenswrapper[4904]: I1205 20:12:59.498048 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:59 crc kubenswrapper[4904]: I1205 20:12:59.498084 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:59 crc kubenswrapper[4904]: I1205 20:12:59.498101 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:59Z","lastTransitionTime":"2025-12-05T20:12:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:59 crc kubenswrapper[4904]: I1205 20:12:59.600938 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:59 crc kubenswrapper[4904]: I1205 20:12:59.600995 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:59 crc kubenswrapper[4904]: I1205 20:12:59.601046 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:59 crc kubenswrapper[4904]: I1205 20:12:59.601102 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:59 crc kubenswrapper[4904]: I1205 20:12:59.601121 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:59Z","lastTransitionTime":"2025-12-05T20:12:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:59 crc kubenswrapper[4904]: I1205 20:12:59.703488 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:59 crc kubenswrapper[4904]: I1205 20:12:59.703536 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:59 crc kubenswrapper[4904]: I1205 20:12:59.703554 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:59 crc kubenswrapper[4904]: I1205 20:12:59.703577 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:59 crc kubenswrapper[4904]: I1205 20:12:59.703595 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:59Z","lastTransitionTime":"2025-12-05T20:12:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:59 crc kubenswrapper[4904]: I1205 20:12:59.806054 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:59 crc kubenswrapper[4904]: I1205 20:12:59.806128 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:59 crc kubenswrapper[4904]: I1205 20:12:59.806142 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:59 crc kubenswrapper[4904]: I1205 20:12:59.806162 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:59 crc kubenswrapper[4904]: I1205 20:12:59.806176 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:59Z","lastTransitionTime":"2025-12-05T20:12:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:59 crc kubenswrapper[4904]: I1205 20:12:59.908927 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:59 crc kubenswrapper[4904]: I1205 20:12:59.908987 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:59 crc kubenswrapper[4904]: I1205 20:12:59.909002 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:59 crc kubenswrapper[4904]: I1205 20:12:59.909022 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:59 crc kubenswrapper[4904]: I1205 20:12:59.909036 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:59Z","lastTransitionTime":"2025-12-05T20:12:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.012471 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.012641 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.012674 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.012704 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.012728 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:00Z","lastTransitionTime":"2025-12-05T20:13:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.115989 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.116044 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.116053 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.116082 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.116091 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:00Z","lastTransitionTime":"2025-12-05T20:13:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.219536 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.219604 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.219621 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.219645 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.219661 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:00Z","lastTransitionTime":"2025-12-05T20:13:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.323308 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.323412 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.323430 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.323462 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.323480 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:00Z","lastTransitionTime":"2025-12-05T20:13:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.426151 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.426180 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.426189 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.426204 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.426232 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:00Z","lastTransitionTime":"2025-12-05T20:13:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.529215 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.529257 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.529268 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.529283 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.529294 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:00Z","lastTransitionTime":"2025-12-05T20:13:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.631871 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.631922 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.631934 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.631955 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.631970 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:00Z","lastTransitionTime":"2025-12-05T20:13:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.680469 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.680525 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.680518 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.680449 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:13:00 crc kubenswrapper[4904]: E1205 20:13:00.680688 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:13:00 crc kubenswrapper[4904]: E1205 20:13:00.680882 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91" Dec 05 20:13:00 crc kubenswrapper[4904]: E1205 20:13:00.681142 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:13:00 crc kubenswrapper[4904]: E1205 20:13:00.681894 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.734441 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.734519 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.734529 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.734551 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.734563 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:00Z","lastTransitionTime":"2025-12-05T20:13:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.838368 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.838402 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.838451 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.838469 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.838479 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:00Z","lastTransitionTime":"2025-12-05T20:13:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.940867 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.940913 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.940928 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.940951 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:00 crc kubenswrapper[4904]: I1205 20:13:00.940965 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:00Z","lastTransitionTime":"2025-12-05T20:13:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.043673 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.043759 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.043781 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.043814 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.043838 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:01Z","lastTransitionTime":"2025-12-05T20:13:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.146738 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.146823 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.146843 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.146869 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.146887 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:01Z","lastTransitionTime":"2025-12-05T20:13:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.249501 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.249569 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.249586 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.249611 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.249630 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:01Z","lastTransitionTime":"2025-12-05T20:13:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.351780 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.351852 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.351871 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.351895 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.351912 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:01Z","lastTransitionTime":"2025-12-05T20:13:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.454724 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.454807 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.454830 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.454860 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.454882 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:01Z","lastTransitionTime":"2025-12-05T20:13:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.558257 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.558299 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.558309 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.558323 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.558333 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:01Z","lastTransitionTime":"2025-12-05T20:13:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.661995 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.662072 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.662088 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.662104 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.662113 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:01Z","lastTransitionTime":"2025-12-05T20:13:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.694713 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.716171 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gfzvv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7010629bd83e0fb435b59dd47808f31c7694e158d77048ed22a54668a8b2712d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e5e16893370ed8543ecf33c4b2373bda50a424810470178390a553f2755323d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:12:48Z\\\",\\\"message\\\":\\\"2025-12-05T20:12:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8ac1f239-7e8e-4064-b3f6-179ec832c336\\\\n2025-12-05T20:12:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8ac1f239-7e8e-4064-b3f6-179ec832c336 to /host/opt/cni/bin/\\\\n2025-12-05T20:12:03Z [verbose] multus-daemon started\\\\n2025-12-05T20:12:03Z [verbose] Readiness Indicator file check\\\\n2025-12-05T20:12:48Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6gph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gfzvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.744562 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-52vmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://140e572b1d15ec9aec68ea9d5b436e7ec2a1e886ed70b12c5ed011a12aa7646a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-52vmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.759983 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d8xkk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7fjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7fjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d8xkk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.764678 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.764708 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.764716 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.764731 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.764741 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:01Z","lastTransitionTime":"2025-12-05T20:13:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.780609 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.791295 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8v9t9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78ec080e-d24e-458b-8622-465dd74773a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c927c7ee8c52a2f1bcca785070514c72b8eafa9cd7d4f58d77fd80f1dee6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj68j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8v9t9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-05T20:13:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.807811 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cc24b64-e25f-4b55-9123-295388685e7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa5a898deb51018e4a1196537853e1d8fda8121a2b87d6d3ebb4ab04ad6a0507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08368507f1a5c9ee1b93c9a2b111939cbe36a99f87758d0de78b146309871438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffd2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.821489 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ba4034f-d40d-4c5d-a52b-c9895a8a7f0b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://073f99fe786948da31c584468c3562984559623b590ba77c7925b79a1bb6d973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fb5cbfee6c3da72b94b14ed486036e65308444c961b583e691cc89e604a11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec935cce3c02816ac1f6162ca3a1701cd06dd930d3024ff958a4d92cf1dc0821\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025
-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736f67bbeab4ebc7e144d6d4af34487d18c3c8362425e45a9e0969b61d896666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.836204 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.849977 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t96qs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5f4b5ef-d8d0-4591-a19c-a8347f74e833\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6624b0a035b3190da055f2ec82cb45b59268d035933912d9b87a20fd4c1cf128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9wkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b9a9e4321317cf95ec97bae86da81152ecba3d49e499d83e113d89e8b997247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2
099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9wkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t96qs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.863823 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf1f4c63-efa9-43dd-9fd4-56f617de9b74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dc00b34dfd0d80ded26bfcc542e84550c50fe8dfabb2eab1f8b39c0402930ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbeab9029a6d82f443afa458697942e9ce20465077f1adefc0123dfab2ea76a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5
c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22659d9c71fc60ad0348f1031cb52b72af9f93c69c11c04c140aece5e90efb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fce601ec5b7e54b14668b59d90721a107f392ed41a54f1a5b0446d48fbdb50f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fce601ec5b7e54b14668b59d90721a107f392ed41a54f1a5b0446d48fbdb50f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.868202 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.868269 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.868288 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.868313 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.868333 4904 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:01Z","lastTransitionTime":"2025-12-05T20:13:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.883689 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"519e4f29-5762-4a86-9f31-eae681598fd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a972523cbe1d769fb13c365f030eab38907c5f877d667e034c62c5f3a0e43316\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"(now=2025-12-05 20:12:01.575139975 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575358 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764965521\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764965521\\\\\\\\\\\\\\\" (2025-12-05 19:12:01 +0000 UTC to 2026-12-05 19:12:01 +0000 UTC (now=2025-12-05 20:12:01.57533306 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575383 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 20:12:01.575413 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 20:12:01.575451 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575485 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575526 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-341598907/tls.crt::/tmp/serving-cert-341598907/tls.key\\\\\\\"\\\\nI1205 20:12:01.575666 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1205 20:12:01.575882 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 20:12:01.575895 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 20:12:01.576170 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 20:12:01.576169 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 20:12:01.576203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 20:12:01.576188 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.895739 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-67k68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cf77c02028027dc6d07f225e1f2352cd58a2a96280315e23f43b50e7c719cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzqtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-67k68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.926428 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55fbdf03-712c-4abc-9847-225fe63052e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d91343958cca34483f9737ebe1072f132d381e7a1d980d9c718063ad95f470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59d91343958cca34483f9737ebe1072f132d381e7a1d980d9c718063ad95f470\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:12:34Z\\\",\\\"message\\\":\\\":{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:12:33.904027 6587 ovnkube.go:599] Stopped ovnkube\\\\nI1205 20:12:33.904087 6587 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1205 20:12:33.904035 6587 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-operator/metrics]} name:Service_openshift-ingress-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1205 20:12:33.904171 6587 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dsvd6_openshift-ovn-kubernetes(55fbdf03-712c-4abc-9847-225fe63052e3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dsvd6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.948255 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89575271b3bd2ad2c040d9e6b00f221610af97e8dd8b120a2956b75f8bbdd237\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.963715 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aa73309acebd7dcba36d2990a8cf1f6ef1fbf78086a1455b158945063e1da3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b0deb0769ed0ae3221e161c0aab68192a5d5d1f511ccfaf8c6ba2f01334de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:01Z is after 
2025-08-24T17:21:41Z" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.971141 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.971206 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.971225 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.971252 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.971272 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:01Z","lastTransitionTime":"2025-12-05T20:13:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:01 crc kubenswrapper[4904]: I1205 20:13:01.984238 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0577f397a9f6d93347cbc4c8693307a1df70636365bb481d910f29aa70a3147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 
20:13:02.073573 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.073655 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.073680 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.073711 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.073736 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:02Z","lastTransitionTime":"2025-12-05T20:13:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.177419 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.177536 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.177560 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.177593 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.177615 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:02Z","lastTransitionTime":"2025-12-05T20:13:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.280935 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.281349 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.281607 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.281841 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.282122 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:02Z","lastTransitionTime":"2025-12-05T20:13:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.384920 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.385000 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.385034 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.385100 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.385123 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:02Z","lastTransitionTime":"2025-12-05T20:13:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.488010 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.488120 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.488142 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.488168 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.488186 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:02Z","lastTransitionTime":"2025-12-05T20:13:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.592173 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.592535 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.592678 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.592811 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.592946 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:02Z","lastTransitionTime":"2025-12-05T20:13:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.680903 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.680900 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:13:02 crc kubenswrapper[4904]: E1205 20:13:02.681348 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91" Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.680900 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:13:02 crc kubenswrapper[4904]: E1205 20:13:02.681456 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.680947 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:13:02 crc kubenswrapper[4904]: E1205 20:13:02.681561 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:13:02 crc kubenswrapper[4904]: E1205 20:13:02.681268 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.696283 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.696316 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.696326 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.696341 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.696351 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:02Z","lastTransitionTime":"2025-12-05T20:13:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.799226 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.799272 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.799283 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.799302 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.799314 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:02Z","lastTransitionTime":"2025-12-05T20:13:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.902341 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.902393 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.902410 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.902434 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:02 crc kubenswrapper[4904]: I1205 20:13:02.902452 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:02Z","lastTransitionTime":"2025-12-05T20:13:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.004429 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.004501 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.004513 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.004531 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.004546 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:03Z","lastTransitionTime":"2025-12-05T20:13:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.107236 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.107280 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.107291 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.107306 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.107316 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:03Z","lastTransitionTime":"2025-12-05T20:13:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.210522 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.210581 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.210594 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.210614 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.210630 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:03Z","lastTransitionTime":"2025-12-05T20:13:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.314047 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.314128 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.314144 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.314167 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.314184 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:03Z","lastTransitionTime":"2025-12-05T20:13:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.417310 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.417428 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.417447 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.417476 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.417494 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:03Z","lastTransitionTime":"2025-12-05T20:13:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.519986 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.520096 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.520121 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.520155 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.520178 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:03Z","lastTransitionTime":"2025-12-05T20:13:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.623260 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.623344 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.623367 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.623458 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.623484 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:03Z","lastTransitionTime":"2025-12-05T20:13:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.683301 4904 scope.go:117] "RemoveContainer" containerID="59d91343958cca34483f9737ebe1072f132d381e7a1d980d9c718063ad95f470" Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.725879 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.726198 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.726217 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.726335 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.726381 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:03Z","lastTransitionTime":"2025-12-05T20:13:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.829326 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.829378 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.829396 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.829443 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.829483 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:03Z","lastTransitionTime":"2025-12-05T20:13:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.968376 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.968442 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.968459 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.968482 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:03 crc kubenswrapper[4904]: I1205 20:13:03.968502 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:03Z","lastTransitionTime":"2025-12-05T20:13:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.072114 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.072173 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.072185 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.072214 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.072229 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:04Z","lastTransitionTime":"2025-12-05T20:13:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.174719 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.174775 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.174801 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.174824 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.174840 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:04Z","lastTransitionTime":"2025-12-05T20:13:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.277749 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.277851 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.277869 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.277893 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.277911 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:04Z","lastTransitionTime":"2025-12-05T20:13:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.380848 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.380899 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.380910 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.380928 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.380942 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:04Z","lastTransitionTime":"2025-12-05T20:13:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.483626 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.483675 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.483686 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.483703 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.483716 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:04Z","lastTransitionTime":"2025-12-05T20:13:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.586758 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.586800 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.586811 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.586827 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.586837 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:04Z","lastTransitionTime":"2025-12-05T20:13:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.680314 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.680418 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.680585 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.680597 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:13:04 crc kubenswrapper[4904]: E1205 20:13:04.680533 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:13:04 crc kubenswrapper[4904]: E1205 20:13:04.680663 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:13:04 crc kubenswrapper[4904]: E1205 20:13:04.680735 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91" Dec 05 20:13:04 crc kubenswrapper[4904]: E1205 20:13:04.680833 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.689001 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.689080 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.689097 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.689114 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.689126 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:04Z","lastTransitionTime":"2025-12-05T20:13:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.791315 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.791351 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.791359 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.791371 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.791380 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:04Z","lastTransitionTime":"2025-12-05T20:13:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.893594 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.893634 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.893643 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.893656 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.893665 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:04Z","lastTransitionTime":"2025-12-05T20:13:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.996030 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.996099 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.996111 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.996130 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:04 crc kubenswrapper[4904]: I1205 20:13:04.996140 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:04Z","lastTransitionTime":"2025-12-05T20:13:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.018777 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.018879 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.018905 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.018923 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:13:05 crc kubenswrapper[4904]: E1205 20:13:05.018949 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:14:09.018926489 +0000 UTC m=+147.830142598 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.018990 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:13:05 crc kubenswrapper[4904]: E1205 20:13:05.019026 4904 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:13:05 crc kubenswrapper[4904]: E1205 20:13:05.019042 4904 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:13:05 crc kubenswrapper[4904]: E1205 20:13:05.019053 4904 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:13:05 crc kubenswrapper[4904]: E1205 20:13:05.019101 4904 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:13:05 crc kubenswrapper[4904]: E1205 20:13:05.019121 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 20:14:09.019106454 +0000 UTC m=+147.830322563 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:13:05 crc kubenswrapper[4904]: E1205 20:13:05.019101 4904 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:13:05 crc kubenswrapper[4904]: E1205 20:13:05.019110 4904 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:13:05 crc kubenswrapper[4904]: E1205 20:13:05.019336 4904 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:13:05 crc kubenswrapper[4904]: E1205 20:13:05.019357 4904 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:13:05 crc kubenswrapper[4904]: E1205 20:13:05.019158 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:14:09.019144425 +0000 UTC m=+147.830360554 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:13:05 crc kubenswrapper[4904]: E1205 20:13:05.019422 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:14:09.019399732 +0000 UTC m=+147.830615851 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:13:05 crc kubenswrapper[4904]: E1205 20:13:05.019443 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 20:14:09.019432343 +0000 UTC m=+147.830648472 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.098294 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.098340 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.098352 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.098373 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.098385 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:05Z","lastTransitionTime":"2025-12-05T20:13:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.200839 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.200890 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.200902 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.200921 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.200930 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:05Z","lastTransitionTime":"2025-12-05T20:13:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.228173 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dsvd6_55fbdf03-712c-4abc-9847-225fe63052e3/ovnkube-controller/2.log" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.231242 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" event={"ID":"55fbdf03-712c-4abc-9847-225fe63052e3","Type":"ContainerStarted","Data":"0056a74367d0e3cfa5e2d8715316dcb9d0f1fe4f0a9be8f8c29671005bba59d4"} Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.231652 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.249354 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gfzvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7010629bd83e0fb435b59dd47808f31c7694e158d77048ed22a54668a8b2712d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e5e16893370ed8543ecf33c4b2373bda50a424810470178390a553f2755323d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:12:48Z\\\",\\\"message\\\":\\\"2025-12-05T20:12:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8ac1f239-7e8e-4064-b3f6-179ec832c336\\\\n2025-12-05T20:12:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8ac1f239-7e8e-4064-b3f6-179ec832c336 to /host/opt/cni/bin/\\\\n2025-12-05T20:12:03Z [verbose] multus-daemon started\\\\n2025-12-05T20:12:03Z [verbose] Readiness Indicator file check\\\\n2025-12-05T20:12:48Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6gph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gfzvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.262128 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.275591 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cc24b64-e25f-4b55-9123-295388685e7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa5a898deb51018e4a1196537853e1d8fda8121a2b87d6d3ebb4ab04ad6a0507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08368507f1a5c9ee1b93c9a2b111939cbe36a99f87758d0de78b146309871438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffd2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.293577 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-52vmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://140e572b1d15ec9aec68ea9d5b436e7ec2a1e886ed70b12c5ed011a12aa7646a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"exitCode\\\":0,\\\"fin
ishedAt\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067461
6e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-52vmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.303372 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d8xkk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7fjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7fjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d8xkk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.303467 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.303504 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.303512 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.303527 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.303537 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:05Z","lastTransitionTime":"2025-12-05T20:13:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.316602 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.328129 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8v9t9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78ec080e-d24e-458b-8622-465dd74773a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c927c7ee8c52a2f1bcca785070514c72b8eafa9cd7d4f58d77fd80f1dee6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj68j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8v9t9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.342345 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"519e4f29-5762-4a86-9f31-eae681598fd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a972523cbe1d769fb13c365f030eab38907c5f877d667e034c62c5f3a0e43316\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"(now=2025-12-05 20:12:01.575139975 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575358 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764965521\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764965521\\\\\\\\\\\\\\\" (2025-12-05 19:12:01 +0000 UTC to 2026-12-05 19:12:01 +0000 UTC (now=2025-12-05 20:12:01.57533306 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575383 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 20:12:01.575413 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 20:12:01.575451 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575485 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575526 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-341598907/tls.crt::/tmp/serving-cert-341598907/tls.key\\\\\\\"\\\\nI1205 20:12:01.575666 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1205 20:12:01.575882 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 20:12:01.575895 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 20:12:01.576170 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 20:12:01.576169 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 20:12:01.576203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 20:12:01.576188 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.355395 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ba4034f-d40d-4c5d-a52b-c9895a8a7f0b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://073f99fe786948da31c584468c3562984559623b590ba77c7925b79a1bb6d973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fb5cbfee6c3da72b94b14ed486036e65308444c961b583e691cc89e604a11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec935cce3c02816ac1f6162ca3a1701cd06dd930d3024ff958a4d92cf1dc0821\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736f67bbeab4ebc7e144d6d4af34487d18c3c8362425e45a9e0969b61d896666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.366530 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.380101 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t96qs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5f4b5ef-d8d0-4591-a19c-a8347f74e833\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6624b0a035b3190da055f2ec82cb45b59268d035933912d9b87a20fd4c1cf128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9wkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b9a9e4321317cf95ec97bae86da81152ecba3d49e499d83e113d89e8b997247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2
099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9wkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t96qs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.397557 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf1f4c63-efa9-43dd-9fd4-56f617de9b74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dc00b34dfd0d80ded26bfcc542e84550c50fe8dfabb2eab1f8b39c0402930ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbeab9029a6d82f443afa458697942e9ce20465077f1adefc0123dfab2ea76a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5
c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22659d9c71fc60ad0348f1031cb52b72af9f93c69c11c04c140aece5e90efb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fce601ec5b7e54b14668b59d90721a107f392ed41a54f1a5b0446d48fbdb50f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fce601ec5b7e54b14668b59d90721a107f392ed41a54f1a5b0446d48fbdb50f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.407913 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.407959 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.407967 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.407984 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.407994 4904 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:05Z","lastTransitionTime":"2025-12-05T20:13:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.409298 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0577f397a9f6d93347cbc4c8693307a1df70636365bb481d910f29aa70a3147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.418031 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-67k68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cf77c02028027dc6d07f225e1f2352cd58a2a96280315e23f43b50e7c719cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzqtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-67k68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.434532 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55fbdf03-712c-4abc-9847-225fe63052e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0056a74367d0e3cfa5e2d8715316dcb9d0f1fe4f0a9be8f8c29671005bba59d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59d91343958cca34483f9737ebe1072f132d381e7a1d980d9c718063ad95f470\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:12:34Z\\\",\\\"message\\\":\\\":{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:12:33.904027 6587 ovnkube.go:599] Stopped ovnkube\\\\nI1205 20:12:33.904087 6587 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1205 20:12:33.904035 6587 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-operator/metrics]} name:Service_openshift-ingress-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1205 20:12:33.904171 6587 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:13:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dsvd6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.445340 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89575271b3bd2ad2c040d9e6b00f221610af97e8dd8b120a2956b75f8bbdd237\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" 
for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.454869 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aa73309acebd7dcba36d2990a8cf1f6ef1fbf78086a1455b158945063e1da3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b0deb0769ed0ae3221e161c0aab68192a5d5d1f511ccfaf8c6ba2f01334de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.510420 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.510464 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.510472 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.510487 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.510496 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:05Z","lastTransitionTime":"2025-12-05T20:13:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.612434 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.612506 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.612530 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.612563 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.612587 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:05Z","lastTransitionTime":"2025-12-05T20:13:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.715593 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.715641 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.715652 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.715668 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.715679 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:05Z","lastTransitionTime":"2025-12-05T20:13:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.819196 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.819276 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.819301 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.819331 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.819355 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:05Z","lastTransitionTime":"2025-12-05T20:13:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.921714 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.921776 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.921793 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.921816 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:05 crc kubenswrapper[4904]: I1205 20:13:05.921832 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:05Z","lastTransitionTime":"2025-12-05T20:13:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.025027 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.025161 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.025201 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.025232 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.025253 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:06Z","lastTransitionTime":"2025-12-05T20:13:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.044898 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.044971 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.044996 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.045023 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.045039 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:06Z","lastTransitionTime":"2025-12-05T20:13:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:06 crc kubenswrapper[4904]: E1205 20:13:06.060226 4904 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"54b04eb7-ddef-4ce3-9daa-6d051611390c\\\",\\\"systemUUID\\\":\\\"cfb4a8d1-bf0d-4f7a-95e4-63a42b6fb559\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.065169 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.065274 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.065306 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.065335 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.065357 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:06Z","lastTransitionTime":"2025-12-05T20:13:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:06 crc kubenswrapper[4904]: E1205 20:13:06.082667 4904 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"54b04eb7-ddef-4ce3-9daa-6d051611390c\\\",\\\"systemUUID\\\":\\\"cfb4a8d1-bf0d-4f7a-95e4-63a42b6fb559\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.087984 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.088022 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.088031 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.088049 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.088081 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:06Z","lastTransitionTime":"2025-12-05T20:13:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:06 crc kubenswrapper[4904]: E1205 20:13:06.105681 4904 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"54b04eb7-ddef-4ce3-9daa-6d051611390c\\\",\\\"systemUUID\\\":\\\"cfb4a8d1-bf0d-4f7a-95e4-63a42b6fb559\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.109186 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.109218 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.109230 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.109246 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.109258 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:06Z","lastTransitionTime":"2025-12-05T20:13:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:06 crc kubenswrapper[4904]: E1205 20:13:06.127223 4904 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"54b04eb7-ddef-4ce3-9daa-6d051611390c\\\",\\\"systemUUID\\\":\\\"cfb4a8d1-bf0d-4f7a-95e4-63a42b6fb559\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.131843 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.131884 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.131895 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.131913 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.131922 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:06Z","lastTransitionTime":"2025-12-05T20:13:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:06 crc kubenswrapper[4904]: E1205 20:13:06.148986 4904 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"54b04eb7-ddef-4ce3-9daa-6d051611390c\\\",\\\"systemUUID\\\":\\\"cfb4a8d1-bf0d-4f7a-95e4-63a42b6fb559\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:06 crc kubenswrapper[4904]: E1205 20:13:06.149249 4904 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.151235 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.151299 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.151392 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.151466 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.151493 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:06Z","lastTransitionTime":"2025-12-05T20:13:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.237906 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dsvd6_55fbdf03-712c-4abc-9847-225fe63052e3/ovnkube-controller/3.log" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.238894 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dsvd6_55fbdf03-712c-4abc-9847-225fe63052e3/ovnkube-controller/2.log" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.242912 4904 generic.go:334] "Generic (PLEG): container finished" podID="55fbdf03-712c-4abc-9847-225fe63052e3" containerID="0056a74367d0e3cfa5e2d8715316dcb9d0f1fe4f0a9be8f8c29671005bba59d4" exitCode=1 Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.243119 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" event={"ID":"55fbdf03-712c-4abc-9847-225fe63052e3","Type":"ContainerDied","Data":"0056a74367d0e3cfa5e2d8715316dcb9d0f1fe4f0a9be8f8c29671005bba59d4"} Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.243175 4904 scope.go:117] "RemoveContainer" containerID="59d91343958cca34483f9737ebe1072f132d381e7a1d980d9c718063ad95f470" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.244233 4904 scope.go:117] "RemoveContainer" containerID="0056a74367d0e3cfa5e2d8715316dcb9d0f1fe4f0a9be8f8c29671005bba59d4" Dec 05 20:13:06 crc kubenswrapper[4904]: E1205 20:13:06.244655 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-dsvd6_openshift-ovn-kubernetes(55fbdf03-712c-4abc-9847-225fe63052e3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.254838 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.254902 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.254916 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.254939 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.254954 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:06Z","lastTransitionTime":"2025-12-05T20:13:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.264040 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aa73309acebd7dcba36d2990a8cf1f6ef1fbf78086a1455b158945063e1da3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b0deb0769ed0ae3221e161c0aab68192a5d5d1f511ccfaf8c6ba2f01334de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.282874 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0577f397a9f6d93347cbc4c8693307a1df70636365bb481d910f29aa70a3147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.294370 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-67k68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cf77c02028027dc6d07f225e1f2352cd58a2a96280315e23f43b50e7c719cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzqtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-67k68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.316341 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55fbdf03-712c-4abc-9847-225fe63052e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0056a74367d0e3cfa5e2d8715316dcb9d0f1fe4f0a9be8f8c29671005bba59d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59d91343958cca34483f9737ebe1072f132d381e7a1d980d9c718063ad95f470\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:12:34Z\\\",\\\"message\\\":\\\":{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:12:33.904027 6587 ovnkube.go:599] Stopped ovnkube\\\\nI1205 20:12:33.904087 6587 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1205 20:12:33.904035 6587 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-operator/metrics]} name:Service_openshift-ingress-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1205 20:12:33.904171 6587 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0056a74367d0e3cfa5e2d8715316dcb9d0f1fe4f0a9be8f8c29671005bba59d4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:13:05Z\\\",\\\"message\\\":\\\"ces.LB{Name:\\\\\\\"Service_openshift-dns-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.174\\\\\\\", Port:9393, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 20:13:05.358749 7009 model_client.go:382] Update operations generated as: 
[{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns-operator/metrics]} name:Service_openshift-dns-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.174:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d937b3b3-82c3-4791-9a66-41b9fed53e9d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:13:05.358788 7009 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:13:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\
\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dsvd6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.331900 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89575271b3bd2ad2c040d9e6b00f221610af97e8dd8b120a2956b75f8bbdd237\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.344269 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.360101 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.360176 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.360199 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.360228 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.360249 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:06Z","lastTransitionTime":"2025-12-05T20:13:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.361816 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gfzvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7010629bd83e0fb435b59dd47808f31c7694e158d77048ed22a54668a8b2712d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e5e16893370ed8543ecf33c4b2373bda50a424810470178390a553f2755323d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:12:48Z\\\",\\\"message\\\":\\\"2025-12-05T20:12:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8ac1f239-7e8e-4064-b3f6-179ec832c336\\\\n2025-12-05T20:12:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8ac1f239-7e8e-4064-b3f6-179ec832c336 to /host/opt/cni/bin/\\\\n2025-12-05T20:12:03Z [verbose] multus-daemon started\\\\n2025-12-05T20:12:03Z [verbose] Readiness Indicator file check\\\\n2025-12-05T20:12:48Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6gph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gfzvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.373863 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8v9t9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78ec080e-d24e-458b-8622-465dd74773a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c927c7ee8c52a2f1bcca785070514c72b8eafa9cd7d4f58d77fd80f1dee6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj68j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8v9t9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.385878 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cc24b64-e25f-4b55-9123-295388685e7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa5a898deb51018e4a1196537853e1d8fda8121a2b87d6d3ebb4ab04ad6a0507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08368507f1a5c9ee1b93c9a2b111939cbe36a99f87758d0de78b146309871438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffd2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.399875 4904 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-52vmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://140e572b1d15ec9aec68ea9d5b436e7ec2a1e886ed70b12c5ed011a12aa7646a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-52vmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.410676 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d8xkk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7fjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7fjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d8xkk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.424624 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.441359 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"519e4f29-5762-4a86-9f31-eae681598fd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a972523cbe1d769fb13c365f030eab38907c5f877d667e034c62c5f3a0e43316\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"(now=2025-12-05 20:12:01.575139975 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575358 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764965521\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764965521\\\\\\\\\\\\\\\" (2025-12-05 19:12:01 +0000 UTC to 2026-12-05 19:12:01 +0000 UTC (now=2025-12-05 20:12:01.57533306 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575383 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 20:12:01.575413 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 20:12:01.575451 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575485 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575526 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-341598907/tls.crt::/tmp/serving-cert-341598907/tls.key\\\\\\\"\\\\nI1205 20:12:01.575666 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1205 20:12:01.575882 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 20:12:01.575895 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 20:12:01.576170 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 20:12:01.576169 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 20:12:01.576203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 20:12:01.576188 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.457353 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ba4034f-d40d-4c5d-a52b-c9895a8a7f0b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://073f99fe786948da31c584468c3562984559623b590ba77c7925b79a1bb6d973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fb5cbfee6c3da72b94b14ed486036e65308444c961b583e691cc89e604a11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec935cce3c02816ac1f6162ca3a1701cd06dd930d3024ff958a4d92cf1dc0821\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736f67bbeab4ebc7e144d6d4af34487d18c3c8362425e45a9e0969b61d896666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.462652 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.462690 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.462702 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.462720 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.462734 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:06Z","lastTransitionTime":"2025-12-05T20:13:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.472834 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.483047 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t96qs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5f4b5ef-d8d0-4591-a19c-a8347f74e833\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6624b0a035b3190da055f2ec82cb45b59268d035933912d9b87a20fd4c1cf128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9wkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b9a9e4321317cf95ec97bae86da81152ecba3d49e499d83e113d89e8b997247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9wkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t96qs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:06Z is after 2025-08-24T17:21:41Z" Dec 05 
20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.491890 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf1f4c63-efa9-43dd-9fd4-56f617de9b74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dc00b34dfd0d80ded26bfcc542e84550c50fe8dfabb2eab1f8b39c0402930ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbeab9029a6d82f443afa458697942e9ce20465077f1adefc0123dfab2ea76a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22659d9c71fc60ad0348f1031cb52b72af9f93c69c11c04c140aece5e90efb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fce601ec5b7e54b14668b59d90721a107f392ed41a54f1a5b0446d48fbdb50f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fce601ec5b7e54b14668b59d90721a107f392ed41a54f1a5b0446d48fbdb50f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.566126 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.566173 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.566184 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.566219 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.566231 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:06Z","lastTransitionTime":"2025-12-05T20:13:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.669196 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.669260 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.669281 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.669308 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.669331 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:06Z","lastTransitionTime":"2025-12-05T20:13:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.681226 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.681328 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.681350 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:13:06 crc kubenswrapper[4904]: E1205 20:13:06.681419 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.681364 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:13:06 crc kubenswrapper[4904]: E1205 20:13:06.681616 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:13:06 crc kubenswrapper[4904]: E1205 20:13:06.681676 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91" Dec 05 20:13:06 crc kubenswrapper[4904]: E1205 20:13:06.681736 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.772628 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.772689 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.772707 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.772732 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.772749 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:06Z","lastTransitionTime":"2025-12-05T20:13:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.875757 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.875845 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.875874 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.875906 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.875932 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:06Z","lastTransitionTime":"2025-12-05T20:13:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.979270 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.979315 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.979325 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.979344 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:06 crc kubenswrapper[4904]: I1205 20:13:06.979356 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:06Z","lastTransitionTime":"2025-12-05T20:13:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.082618 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.082693 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.082719 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.082746 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.082764 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:07Z","lastTransitionTime":"2025-12-05T20:13:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.186564 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.186932 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.186955 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.186985 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.187006 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:07Z","lastTransitionTime":"2025-12-05T20:13:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.249256 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dsvd6_55fbdf03-712c-4abc-9847-225fe63052e3/ovnkube-controller/3.log" Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.253949 4904 scope.go:117] "RemoveContainer" containerID="0056a74367d0e3cfa5e2d8715316dcb9d0f1fe4f0a9be8f8c29671005bba59d4" Dec 05 20:13:07 crc kubenswrapper[4904]: E1205 20:13:07.254129 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-dsvd6_openshift-ovn-kubernetes(55fbdf03-712c-4abc-9847-225fe63052e3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.269814 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:07Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.284366 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8v9t9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78ec080e-d24e-458b-8622-465dd74773a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c927c7ee8c52a2f1bcca785070514c72b8eafa9cd7d4f58d77fd80f1dee6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj68j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8v9t9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-05T20:13:07Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.289952 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.290017 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.290039 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.290092 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.290109 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:07Z","lastTransitionTime":"2025-12-05T20:13:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.299665 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cc24b64-e25f-4b55-9123-295388685e7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa5a898deb51018e4a1196537853e1d8fda8121a2b87d6d3ebb4ab04ad6a0507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08368507f1a5c9ee1b93c9a2b111939cbe36a99f87758d0de78b146309871438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffd2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:07Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.317636 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-52vmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://140e572b1d15ec9aec68ea9d5b436e7ec2a1e886ed70b12c5ed011a12aa7646a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0
611bc440\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binar
y-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated
\\\":{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-52vmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:07Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.332048 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d8xkk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7fjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7fjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d8xkk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:07Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.345655 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t96qs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5f4b5ef-d8d0-4591-a19c-a8347f74e833\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6624b0a035b3190da055f2ec82cb45b59268d035933912d9b87a20fd4c1cf128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9wkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b9a9e4321317cf95ec97bae86da81152ecba3d49e499d83e113d89e8b997247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9wkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t96qs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:07Z is after 2025-08-24T17:21:41Z" Dec 05 
20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.359113 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf1f4c63-efa9-43dd-9fd4-56f617de9b74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dc00b34dfd0d80ded26bfcc542e84550c50fe8dfabb2eab1f8b39c0402930ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbeab9029a6d82f443afa458697942e9ce20465077f1adefc0123dfab2ea76a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22659d9c71fc60ad0348f1031cb52b72af9f93c69c11c04c140aece5e90efb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fce601ec5b7e54b14668b59d90721a107f392ed41a54f1a5b0446d48fbdb50f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fce601ec5b7e54b14668b59d90721a107f392ed41a54f1a5b0446d48fbdb50f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:07Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.380888 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"519e4f29-5762-4a86-9f31-eae681598fd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9\\\",\\\"image\\\":\\\"quay.io/cr
cont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a972523cbe1d769fb13c365f030eab38907c5f877d667e034c62c5f3a0e43316\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"(now=2025-12-05 20:12:01.575139975 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575358 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764965521\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764965521\\\\\\\\\\\\\\\" (2025-12-05 19:12:01 +0000 UTC to 2026-12-05 19:12:01 +0000 UTC (now=2025-12-05 20:12:01.57533306 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575383 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 20:12:01.575413 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 20:12:01.575451 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575485 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575526 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-341598907/tls.crt::/tmp/serving-cert-341598907/tls.key\\\\\\\"\\\\nI1205 20:12:01.575666 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1205 20:12:01.575882 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 20:12:01.575895 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 20:12:01.576170 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 20:12:01.576169 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 20:12:01.576203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 20:12:01.576188 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-05T20:13:07Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.396223 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.396272 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.396283 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.396301 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.396313 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:07Z","lastTransitionTime":"2025-12-05T20:13:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.400594 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ba4034f-d40d-4c5d-a52b-c9895a8a7f0b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://073f99fe786948da31c584468c3562984559623b590ba77c7925b79a1bb6d973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fb5cbfee6c3da72b94b14ed486036e65308444c961b583e691cc89e604a11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec935cce3c02816ac1f6162ca3a1701cd06dd930d3024ff958a4d92cf1dc0821\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736f67bbeab4ebc7e144d6d4af34487d18c3c8362425e45a9e0969b61d896666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:07Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.419341 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:07Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.438042 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89575271b3bd2ad2c040d9e6b00f221610af97e8dd8b120a2956b75f8bbdd237\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:07Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.455113 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aa73309acebd7dcba36d2990a8cf1f6ef1fbf78086a1455b158945063e1da3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b0deb0769ed0ae3221e161c0aab68192a5d5d1f511ccfaf8c6ba2f01334de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:07Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.469246 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0577f397a9f6d93347cbc4c8693307a1df70636365bb481d910f29aa70a3147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:07Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.482488 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-67k68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cf77c02028027dc6d07f225e1f2352cd58a2a96280315e23f43b50e7c719cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzqtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-67k68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:07Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.498105 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.498138 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.498146 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.498158 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.498167 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:07Z","lastTransitionTime":"2025-12-05T20:13:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.511541 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55fbdf03-712c-4abc-9847-225fe63052e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servic
eaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0056a74367d0e3cfa5e2d8715316dcb9d0f1fe4f0a9be8f8c29671005bba59d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0056a74367d0e3cfa5e2d8715316dcb9d0f1fe4f0a9be8f8c29671005bba59d4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:13:05Z\\\",\\\"message\\\":\\\"ces.LB{Name:\\\\\\\"Service_openshift-dns-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.174\\\\\\\", Port:9393, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 20:13:05.358749 7009 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns-operator/metrics]} name:Service_openshift-dns-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.174:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d937b3b3-82c3-4791-9a66-41b9fed53e9d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:13:05.358788 7009 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:13:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-dsvd6_openshift-ovn-kubernetes(55fbdf03-712c-4abc-9847-225fe63052e3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dsvd6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:07Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.527367 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:07Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.545045 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gfzvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7010629bd83e0fb435b59dd47808f31c7694e158d77048ed22a54668a8b2712d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e5e16893370ed8543ecf33c4b2373bda50a424810470178390a553f2755323d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:12:48Z\\\",\\\"message\\\":\\\"2025-12-05T20:12:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8ac1f239-7e8e-4064-b3f6-179ec832c336\\\\n2025-12-05T20:12:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8ac1f239-7e8e-4064-b3f6-179ec832c336 to /host/opt/cni/bin/\\\\n2025-12-05T20:12:03Z [verbose] multus-daemon started\\\\n2025-12-05T20:12:03Z [verbose] Readiness Indicator file check\\\\n2025-12-05T20:12:48Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6gph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gfzvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:07Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.601332 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.601386 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.601395 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.601410 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.601419 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:07Z","lastTransitionTime":"2025-12-05T20:13:07Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.704349 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.704409 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.704430 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.704454 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.704474 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:07Z","lastTransitionTime":"2025-12-05T20:13:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.807547 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.807621 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.807658 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.807693 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.807740 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:07Z","lastTransitionTime":"2025-12-05T20:13:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.911691 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.911751 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.911761 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.911781 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:13:07 crc kubenswrapper[4904]: I1205 20:13:07.911793 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:07Z","lastTransitionTime":"2025-12-05T20:13:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:13:08 crc kubenswrapper[4904]: I1205 20:13:08.014977 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:13:08 crc kubenswrapper[4904]: I1205 20:13:08.015033 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:13:08 crc kubenswrapper[4904]: I1205 20:13:08.015043 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:13:08 crc kubenswrapper[4904]: I1205 20:13:08.015092 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:13:08 crc kubenswrapper[4904]: I1205 20:13:08.015105 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:08Z","lastTransitionTime":"2025-12-05T20:13:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:13:08 crc kubenswrapper[4904]: I1205 20:13:08.118446 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:13:08 crc kubenswrapper[4904]: I1205 20:13:08.118521 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:13:08 crc kubenswrapper[4904]: I1205 20:13:08.118530 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:13:08 crc kubenswrapper[4904]: I1205 20:13:08.118546 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:13:08 crc kubenswrapper[4904]: I1205 20:13:08.118557 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:08Z","lastTransitionTime":"2025-12-05T20:13:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:13:08 crc kubenswrapper[4904]: I1205 20:13:08.221980 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:13:08 crc kubenswrapper[4904]: I1205 20:13:08.222039 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:13:08 crc kubenswrapper[4904]: I1205 20:13:08.222051 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:13:08 crc kubenswrapper[4904]: I1205 20:13:08.222089 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:13:08 crc kubenswrapper[4904]: I1205 20:13:08.222104 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:08Z","lastTransitionTime":"2025-12-05T20:13:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:13:08 crc kubenswrapper[4904]: I1205 20:13:08.325667 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:13:08 crc kubenswrapper[4904]: I1205 20:13:08.325724 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:13:08 crc kubenswrapper[4904]: I1205 20:13:08.325741 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:13:08 crc kubenswrapper[4904]: I1205 20:13:08.325763 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:13:08 crc kubenswrapper[4904]: I1205 20:13:08.325778 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:08Z","lastTransitionTime":"2025-12-05T20:13:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:13:08 crc kubenswrapper[4904]: I1205 20:13:08.428992 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:13:08 crc kubenswrapper[4904]: I1205 20:13:08.429092 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:13:08 crc kubenswrapper[4904]: I1205 20:13:08.429112 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:13:08 crc kubenswrapper[4904]: I1205 20:13:08.429137 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:13:08 crc kubenswrapper[4904]: I1205 20:13:08.429155 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:08Z","lastTransitionTime":"2025-12-05T20:13:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:13:08 crc kubenswrapper[4904]: I1205 20:13:08.532040 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:13:08 crc kubenswrapper[4904]: I1205 20:13:08.532149 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:13:08 crc kubenswrapper[4904]: I1205 20:13:08.532192 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:13:08 crc kubenswrapper[4904]: I1205 20:13:08.532217 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:13:08 crc kubenswrapper[4904]: I1205 20:13:08.532235 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:08Z","lastTransitionTime":"2025-12-05T20:13:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:13:08 crc kubenswrapper[4904]: I1205 20:13:08.635495 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:13:08 crc kubenswrapper[4904]: I1205 20:13:08.635533 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:13:08 crc kubenswrapper[4904]: I1205 20:13:08.635546 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:13:08 crc kubenswrapper[4904]: I1205 20:13:08.635568 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:13:08 crc kubenswrapper[4904]: I1205 20:13:08.635581 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:08Z","lastTransitionTime":"2025-12-05T20:13:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:13:08 crc kubenswrapper[4904]: I1205 20:13:08.680463 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 20:13:08 crc kubenswrapper[4904]: I1205 20:13:08.680546 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk"
Dec 05 20:13:08 crc kubenswrapper[4904]: I1205 20:13:08.680564 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 20:13:08 crc kubenswrapper[4904]: E1205 20:13:08.680705 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91" Dec 05 20:13:08 crc kubenswrapper[4904]: E1205 20:13:08.680607 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:13:08 crc kubenswrapper[4904]: I1205 20:13:08.680475 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:13:08 crc kubenswrapper[4904]: E1205 20:13:08.681050 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:13:08 crc kubenswrapper[4904]: E1205 20:13:08.681147 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:13:08 crc kubenswrapper[4904]: I1205 20:13:08.738866 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:08 crc kubenswrapper[4904]: I1205 20:13:08.738924 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:08 crc kubenswrapper[4904]: I1205 20:13:08.738941 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:08 crc kubenswrapper[4904]: I1205 20:13:08.738964 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:08 crc kubenswrapper[4904]: I1205 20:13:08.738982 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:08Z","lastTransitionTime":"2025-12-05T20:13:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
[... node-status cycle repeats at ~100 ms intervals, 18 times, 20:13:08.841 through 20:13:10.597, each pass recording NodeHasSufficientMemory, NodeHasNoDiskPressure, NodeHasSufficientPID, NodeNotReady and "Node became not ready" with the same KubeletNotReady / no-CNI-configuration message ...]
Dec 05 20:13:10 crc kubenswrapper[4904]: I1205 20:13:10.680238 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 20:13:10 crc kubenswrapper[4904]: I1205 20:13:10.680271 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 20:13:10 crc kubenswrapper[4904]: E1205 20:13:10.680427 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 20:13:10 crc kubenswrapper[4904]: I1205 20:13:10.680489 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk"
Dec 05 20:13:10 crc kubenswrapper[4904]: I1205 20:13:10.680489 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 20:13:10 crc kubenswrapper[4904]: E1205 20:13:10.680655 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 20:13:10 crc kubenswrapper[4904]: E1205 20:13:10.680728 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91"
Dec 05 20:13:10 crc kubenswrapper[4904]: E1205 20:13:10.680814 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 20:13:10 crc kubenswrapper[4904]: I1205 20:13:10.700430 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:13:10 crc kubenswrapper[4904]: I1205 20:13:10.700469 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:13:10 crc kubenswrapper[4904]: I1205 20:13:10.700479 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:13:10 crc kubenswrapper[4904]: I1205 20:13:10.700494 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:13:10 crc kubenswrapper[4904]: I1205 20:13:10.700507 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:10Z","lastTransitionTime":"2025-12-05T20:13:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:13:10 crc kubenswrapper[4904]: I1205 20:13:10.803877 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:13:10 crc kubenswrapper[4904]: I1205 20:13:10.803943 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:13:10 crc kubenswrapper[4904]: I1205 20:13:10.803960 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:13:10 crc kubenswrapper[4904]: I1205 20:13:10.803984 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:13:10 crc kubenswrapper[4904]: I1205 20:13:10.804001 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:10Z","lastTransitionTime":"2025-12-05T20:13:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
[... node-status cycle repeats at ~100 ms intervals, 8 times, 20:13:10.906 through 20:13:11.630, with the same KubeletNotReady / no-CNI-configuration message ...]
Dec 05 20:13:11 crc kubenswrapper[4904]: I1205 20:13:11.735322 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-52vmw" err="failed to patch status [... escaped status-patch JSON (conditions, containerStatuses, initContainerStatuses for the egress-router-binary-copy, cni-plugins, bond-cni-plugin, routeoverride-cni, whereabouts-cni-bincopy, whereabouts-cni and kube-multus-additional-cni-plugins containers) elided ...] for pod \"openshift-multus\"/\"multus-additional-cni-plugins-52vmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:11Z is after 2025-08-24T17:21:41Z"
Dec 05 20:13:11 crc kubenswrapper[4904]: I1205 20:13:11.743515 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:11 crc kubenswrapper[4904]: I1205 20:13:11.743556 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:11 crc kubenswrapper[4904]: I1205 20:13:11.743566 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:11 crc kubenswrapper[4904]: I1205 20:13:11.743587 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:11 crc kubenswrapper[4904]: I1205 20:13:11.743606 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:11Z","lastTransitionTime":"2025-12-05T20:13:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:11 crc kubenswrapper[4904]: I1205 20:13:11.762592 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d8xkk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7fjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7fjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d8xkk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:11 crc kubenswrapper[4904]: I1205 20:13:11.777873 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:11 crc kubenswrapper[4904]: I1205 20:13:11.790086 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8v9t9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78ec080e-d24e-458b-8622-465dd74773a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c927c7ee8c52a2f1bcca785070514c72b8eafa9cd7d4f58d77fd80f1dee6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj68j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8v9t9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-05T20:13:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:11 crc kubenswrapper[4904]: I1205 20:13:11.801781 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cc24b64-e25f-4b55-9123-295388685e7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa5a898deb51018e4a1196537853e1d8fda8121a2b87d6d3ebb4ab04ad6a0507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08368507f1a5c9ee1b93c9a2b111939cbe36a99f87758d0de78b146309871438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffd2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:11 crc kubenswrapper[4904]: I1205 20:13:11.812081 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ba4034f-d40d-4c5d-a52b-c9895a8a7f0b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://073f99fe786948da31c584468c3562984559623b590ba77c7925b79a1bb6d973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fb5cbfee6c3da72b94b14ed486036e65308444c961b583e691cc89e604a11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec935cce3c02816ac1f6162ca3a1701cd06dd930d3024ff958a4d92cf1dc0821\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025
-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736f67bbeab4ebc7e144d6d4af34487d18c3c8362425e45a9e0969b61d896666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:11 crc kubenswrapper[4904]: I1205 20:13:11.822367 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:11 crc kubenswrapper[4904]: I1205 20:13:11.833657 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t96qs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5f4b5ef-d8d0-4591-a19c-a8347f74e833\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6624b0a035b3190da055f2ec82cb45b59268d035933912d9b87a20fd4c1cf128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9wkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b9a9e4321317cf95ec97bae86da81152ecba3d49e499d83e113d89e8b997247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2
099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9wkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t96qs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:11 crc kubenswrapper[4904]: I1205 20:13:11.846260 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:11 crc kubenswrapper[4904]: I1205 20:13:11.846296 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:11 crc kubenswrapper[4904]: I1205 20:13:11.846307 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:11 crc kubenswrapper[4904]: I1205 20:13:11.846321 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:11 crc kubenswrapper[4904]: I1205 20:13:11.846331 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:11Z","lastTransitionTime":"2025-12-05T20:13:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:11 crc kubenswrapper[4904]: I1205 20:13:11.847755 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf1f4c63-efa9-43dd-9fd4-56f617de9b74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dc00b34dfd0d80ded26bfcc542e84550c50fe8dfabb2eab1f8b39c0402930ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbeab9029a6d82f443afa458697942e9ce20465077f1adefc0123dfab2ea76a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22659d9c71fc60ad0348f1031cb52b72af9f93c69c11c04c140aece5e90efb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fce601ec5b7e54b14668b59d90721a107f392ed41a54f1a5b0446d48fbdb50f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fce601ec5b7e54b14668b59d90721a107f392ed41a54f1a5b0446d48fbdb50f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:11 crc kubenswrapper[4904]: I1205 20:13:11.864217 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"519e4f29-5762-4a86-9f31-eae681598fd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5
c63fa2ab2e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a972523cbe1d769fb13c365f030eab38907c5f877d667e034c62c5f3a0e43316\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"(now=2025-12-05 20:12:01.575139975 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575358 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764965521\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764965521\\\\\\\\\\\\\\\" (2025-12-05 19:12:01 +0000 UTC to 2026-12-05 19:12:01 +0000 UTC (now=2025-12-05 20:12:01.57533306 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575383 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 20:12:01.575413 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 20:12:01.575451 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575485 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575526 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-341598907/tls.crt::/tmp/serving-cert-341598907/tls.key\\\\\\\"\\\\nI1205 20:12:01.575666 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1205 20:12:01.575882 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 20:12:01.575895 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 20:12:01.576170 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 20:12:01.576169 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 20:12:01.576203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 20:12:01.576188 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-05T20:13:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:11 crc kubenswrapper[4904]: I1205 20:13:11.876032 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-67k68" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cf77c02028027dc6d07f225e1f2352cd58a2a96280315e23f43b50e7c719cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzqtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-67k68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:11 crc kubenswrapper[4904]: I1205 20:13:11.896774 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55fbdf03-712c-4abc-9847-225fe63052e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0056a74367d0e3cfa5e2d8715316dcb9d0f1fe4f0a9be8f8c29671005bba59d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0056a74367d0e3cfa5e2d8715316dcb9d0f1fe4f0a9be8f8c29671005bba59d4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:13:05Z\\\",\\\"message\\\":\\\"ces.LB{Name:\\\\\\\"Service_openshift-dns-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.174\\\\\\\", Port:9393, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 20:13:05.358749 7009 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns-operator/metrics]} name:Service_openshift-dns-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.174:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d937b3b3-82c3-4791-9a66-41b9fed53e9d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:13:05.358788 7009 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:13:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dsvd6_openshift-ovn-kubernetes(55fbdf03-712c-4abc-9847-225fe63052e3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dsvd6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:11 crc kubenswrapper[4904]: I1205 20:13:11.912971 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89575271b3bd2ad2c040d9e6b00f221610af97e8dd8b120a2956b75f8bbdd237\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:11 crc kubenswrapper[4904]: I1205 20:13:11.930912 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aa73309acebd7dcba36d2990a8cf1f6ef1fbf78086a1455b158945063e1da3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b0deb0769ed0ae3221e161c0aab68192a5d5d1f511ccfaf8c6ba2f01334de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:11Z is after 
2025-08-24T17:21:41Z" Dec 05 20:13:11 crc kubenswrapper[4904]: I1205 20:13:11.944887 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0577f397a9f6d93347cbc4c8693307a1df70636365bb481d910f29aa70a3147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:11 crc kubenswrapper[4904]: I1205 20:13:11.949010 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:11 crc kubenswrapper[4904]: I1205 20:13:11.949075 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:11 crc kubenswrapper[4904]: I1205 20:13:11.949092 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:11 crc kubenswrapper[4904]: I1205 20:13:11.949119 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:11 crc kubenswrapper[4904]: I1205 20:13:11.949134 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:11Z","lastTransitionTime":"2025-12-05T20:13:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:11 crc kubenswrapper[4904]: I1205 20:13:11.961280 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:11 crc kubenswrapper[4904]: I1205 20:13:11.975980 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gfzvv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7010629bd83e0fb435b59dd47808f31c7694e158d77048ed22a54668a8b2712d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e5e16893370ed8543ecf33c4b2373bda50a424810470178390a553f2755323d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:12:48Z\\\",\\\"message\\\":\\\"2025-12-05T20:12:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8ac1f239-7e8e-4064-b3f6-179ec832c336\\\\n2025-12-05T20:12:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8ac1f239-7e8e-4064-b3f6-179ec832c336 to /host/opt/cni/bin/\\\\n2025-12-05T20:12:03Z [verbose] multus-daemon started\\\\n2025-12-05T20:12:03Z [verbose] Readiness Indicator file check\\\\n2025-12-05T20:12:48Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6gph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gfzvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:11Z is after 2025-08-24T17:21:41Z"
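Every one of the status-patch failures above has the same proximate cause, stated in the error text itself: the pod.network-node-identity.openshift.io webhook presents a serving certificate that expired on 2025-08-24T17:21:41Z, while the node clock reads 2025-12-05T20:13:11Z, so each kubelet status patch is rejected before it reaches the API object. A quick way to confirm this from the node is to read the validity window of whatever certificate the endpoint presents. The following is a minimal Go sketch, not part of the logged system; it assumes only the address from the failing Post URL (127.0.0.1:9743) and deliberately skips verification so the expired certificate can still be inspected.

```go
// checkwebhookcert.go: a minimal diagnostic sketch (not part of the logged
// system). Dial the webhook endpoint seen in the log and print the serving
// certificate's validity window to confirm the x509 expiry error.
package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	// Address taken from the failing Post URL in the log records.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{
		InsecureSkipVerify: true, // skip verification only to read the cert
	})
	if err != nil {
		log.Fatalf("dial: %v", err)
	}
	defer conn.Close()

	cert := conn.ConnectionState().PeerCertificates[0]
	fmt.Printf("subject:   %s\n", cert.Subject)
	fmt.Printf("notBefore: %s\n", cert.NotBefore.Format(time.RFC3339))
	fmt.Printf("notAfter:  %s\n", cert.NotAfter.Format(time.RFC3339))
	if time.Now().After(cert.NotAfter) {
		fmt.Println("certificate has expired (matches the kubelet error)")
	}
}
```

On a CRC single-node cluster this pattern typically appears when the VM image's rotated certificates lapsed while the machine was powered off; the cluster normally re-rotates them shortly after startup, at which point these webhook errors stop on their own.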
[log condensed: from Dec 05 20:13:12.052 to 20:13:16.081 the kubelet repeats the same five-record cycle roughly every 100 ms: four kubelet_node_status.go:724 "Recording event message for node" events for node="crc" (NodeHasSufficientMemory, NodeHasNoDiskPressure, NodeHasSufficientPID, NodeNotReady) followed by a setters.go:603 "Node became not ready" condition with reason=KubeletNotReady and the same "no CNI configuration file in /etc/kubernetes/cni/net.d/" message. The repeats are elided; only the distinct records in this window and the final cycle are kept.]
Dec 05 20:13:12 crc kubenswrapper[4904]: I1205 20:13:12.681273 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 20:13:12 crc kubenswrapper[4904]: I1205 20:13:12.681408 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 20:13:12 crc kubenswrapper[4904]: E1205 20:13:12.681599 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 20:13:12 crc kubenswrapper[4904]: I1205 20:13:12.681630 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk"
Dec 05 20:13:12 crc kubenswrapper[4904]: I1205 20:13:12.681671 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 20:13:12 crc kubenswrapper[4904]: E1205 20:13:12.681780 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 20:13:12 crc kubenswrapper[4904]: E1205 20:13:12.681954 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91"
Dec 05 20:13:12 crc kubenswrapper[4904]: E1205 20:13:12.682113 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 20:13:13 crc kubenswrapper[4904]: I1205 20:13:13.692236 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"]
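Apart from the static-pod ADD above, everything else in this window is the same Ready=False condition being re-posted. The condition payload in each "Node became not ready" record is plain JSON, so it is easy to pick apart when scanning a capture like this one. A minimal sketch that decodes one literal copied from the records follows; the struct is illustrative and mirrors only the fields visible in the log.

```go
// nodecondition.go: decode the Ready condition JSON emitted in the repeated
// "Node became not ready" records. Struct fields mirror the log payload.
package main

import (
	"encoding/json"
	"fmt"
	"log"
)

type NodeCondition struct {
	Type               string `json:"type"`
	Status             string `json:"status"`
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
	Reason             string `json:"reason"`
	Message            string `json:"message"`
}

func main() {
	// Condition literal copied verbatim from one of the repeated records.
	raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:13Z","lastTransitionTime":"2025-12-05T20:13:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`

	var c NodeCondition
	if err := json.Unmarshal([]byte(raw), &c); err != nil {
		log.Fatal(err)
	}
	fmt.Printf("%s=%s reason=%s\n", c.Type, c.Status, c.Reason)
}
```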
Dec 05 20:13:14 crc kubenswrapper[4904]: I1205 20:13:14.681187 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk"
Dec 05 20:13:14 crc kubenswrapper[4904]: I1205 20:13:14.681234 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 20:13:14 crc kubenswrapper[4904]: I1205 20:13:14.681276 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 20:13:14 crc kubenswrapper[4904]: E1205 20:13:14.681317 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91"
Dec 05 20:13:14 crc kubenswrapper[4904]: I1205 20:13:14.681375 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 20:13:14 crc kubenswrapper[4904]: E1205 20:13:14.681474 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 20:13:14 crc kubenswrapper[4904]: E1205 20:13:14.681577 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 20:13:14 crc kubenswrapper[4904]: E1205 20:13:14.681707 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.081171 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.081281 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.081305 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.081332 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.081352 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:16Z","lastTransitionTime":"2025-12-05T20:13:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.184298 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.184329 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.184340 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.184368 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.184381 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:16Z","lastTransitionTime":"2025-12-05T20:13:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.286848 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.286897 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.286915 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.286942 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.286963 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:16Z","lastTransitionTime":"2025-12-05T20:13:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.389848 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.389900 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.389923 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.389953 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.389978 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:16Z","lastTransitionTime":"2025-12-05T20:13:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
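[Annotation: the repeating NodeNotReady cycle above is the kubelet's network-readiness gate: the node stays NotReady until the container runtime reports NetworkReady=true, which in turn requires at least one CNI network configuration in /etc/kubernetes/cni/net.d/. The sketch below is a minimal, hypothetical diagnostic, not part of the log; the accepted extension set is an assumption based on common CNI convention.]

#!/usr/bin/env python3
"""Minimal diagnostic sketch (hypothetical, not from the log): check whether
/etc/kubernetes/cni/net.d/ holds any CNI network configuration, the condition
the kubelet's NetworkReady gate is waiting on above. The extension set below
is an assumption based on common CNI convention, not taken from the log."""
from pathlib import Path

CNI_CONF_DIR = Path("/etc/kubernetes/cni/net.d")   # directory named in the log
CNI_EXTENSIONS = {".conf", ".conflist", ".json"}   # assumed conventional set

def cni_configs(conf_dir: Path) -> list:
    # One valid config file is enough for the runtime to report NetworkReady=true.
    if not conf_dir.is_dir():
        return []
    return sorted(p for p in conf_dir.iterdir()
                  if p.is_file() and p.suffix in CNI_EXTENSIONS)

if __name__ == "__main__":
    found = cni_configs(CNI_CONF_DIR)
    for p in found:
        print(f"found CNI config: {p}")
    if not found:
        # Mirrors the condition driving the NodeNotReady entries above.
        print(f"no CNI configuration file in {CNI_CONF_DIR}/")

[Run on the node itself; an empty result matches the "no CNI configuration file" message in the entries above.]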
Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.454175 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.454242 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.454266 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.454298 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.454328 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:16Z","lastTransitionTime":"2025-12-05T20:13:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:13:16 crc kubenswrapper[4904]: E1205 20:13:16.473263 4904 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"54b04eb7-ddef-4ce3-9daa-6d051611390c\\\",\\\"systemUUID\\\":\\\"cfb4a8d1-bf0d-4f7a-95e4-63a42b6fb559\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.478677 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.478739 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.478756 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.478781 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.478798 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:16Z","lastTransitionTime":"2025-12-05T20:13:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:16 crc kubenswrapper[4904]: E1205 20:13:16.499305 4904 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"54b04eb7-ddef-4ce3-9daa-6d051611390c\\\",\\\"systemUUID\\\":\\\"cfb4a8d1-bf0d-4f7a-95e4-63a42b6fb559\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.503692 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.503734 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
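[Annotation: these patch failures are not caused by the CNI gate itself but by the node.network-node-identity.openshift.io admission webhook at https://127.0.0.1:9743 presenting a serving certificate that expired on 2025-08-24T17:21:41Z, months before the log's current time of 2025-12-05. The sketch below is a minimal, hypothetical way to confirm the expiry from the node; it assumes the third-party cryptography package is installed.]

#!/usr/bin/env python3
"""Minimal sketch (hypothetical, not from the log): fetch the serving
certificate of the webhook endpoint named in the error and print its notAfter,
to confirm the expiry the kubelet reports. Chain verification is deliberately
skipped, since the whole point is to inspect an expired certificate; this may
still fail if the endpoint additionally demands a client certificate."""
import datetime
import ssl

from cryptography import x509

HOST, PORT = "127.0.0.1", 9743   # endpoint from the webhook error above

pem = ssl.get_server_certificate((HOST, PORT))       # no verification performed
cert = x509.load_pem_x509_certificate(pem.encode("ascii"))

# 'not_valid_after' is a naive UTC datetime; newer 'cryptography' releases
# prefer the timezone-aware 'not_valid_after_utc'.
not_after = cert.not_valid_after
print(f"notAfter = {not_after:%Y-%m-%dT%H:%M:%SZ}")
print("EXPIRED" if datetime.datetime.utcnow() > not_after else "still valid")

[Against this node the expected output would be "notAfter = 2025-08-24T17:21:41Z" followed by "EXPIRED", matching the x509 error in the retries above and below.]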
event="NodeHasNoDiskPressure" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.503750 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.503771 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.503787 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:16Z","lastTransitionTime":"2025-12-05T20:13:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:16 crc kubenswrapper[4904]: E1205 20:13:16.523744 4904 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"54b04eb7-ddef-4ce3-9daa-6d051611390c\\\",\\\"systemUUID\\\":\\\"cfb4a8d1-bf0d-4f7a-95e4-63a42b6fb559\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.528707 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.528773 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.528797 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.528827 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.528849 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:16Z","lastTransitionTime":"2025-12-05T20:13:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:16 crc kubenswrapper[4904]: E1205 20:13:16.546761 4904 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"54b04eb7-ddef-4ce3-9daa-6d051611390c\\\",\\\"systemUUID\\\":\\\"cfb4a8d1-bf0d-4f7a-95e4-63a42b6fb559\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.551398 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.551458 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
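[Annotation] The patch failure above is not a status problem but a TLS one: the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 is serving a certificate whose notAfter (2025-08-24T17:21:41Z) is months behind the node clock (2025-12-05T20:13:16Z), so the API server rejects every node-status update the kubelet sends. A minimal Python sketch for reading the validity dates off that endpoint follows; the address is taken from the Post URL in the error, chain verification is skipped on purpose, and the third-party cryptography package is assumed for parsing.

    import ssl, socket, datetime
    from cryptography import x509   # third-party: pip install cryptography

    # Endpoint taken from the failed Post URL in the log entry above.
    HOST, PORT = "127.0.0.1", 9743

    # Verification is disabled deliberately: the goal is to read the dates
    # off a certificate we already suspect is expired, not to validate it.
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE

    with socket.create_connection((HOST, PORT), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
            der = tls.getpeercert(binary_form=True)

    cert = x509.load_der_x509_certificate(der)
    now = datetime.datetime.now(datetime.timezone.utc)
    # not_valid_*_utc needs cryptography >= 42; older releases expose the
    # naive not_valid_before / not_valid_after attributes instead.
    print("notBefore:", cert.not_valid_before_utc)
    print("notAfter: ", cert.not_valid_after_utc)
    print("expired:  ", now > cert.not_valid_after_utc)

On a CRC instance this pattern typically appears after resuming a VM whose internal certificates lapsed while it was powered off; the same check can be pointed at any local listener named in a webhook error. [End annotation]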
event="NodeHasNoDiskPressure" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.551482 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.551511 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.551533 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:16Z","lastTransitionTime":"2025-12-05T20:13:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:16 crc kubenswrapper[4904]: E1205 20:13:16.597844 4904 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"54b04eb7-ddef-4ce3-9daa-6d051611390c\\\",\\\"systemUUID\\\":\\\"cfb4a8d1-bf0d-4f7a-95e4-63a42b6fb559\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:16 crc kubenswrapper[4904]: E1205 20:13:16.598106 4904 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.600168 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.600230 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.600256 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.600286 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.600309 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:16Z","lastTransitionTime":"2025-12-05T20:13:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.681213 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.681285 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:13:16 crc kubenswrapper[4904]: E1205 20:13:16.681349 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.681411 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.681502 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:13:16 crc kubenswrapper[4904]: E1205 20:13:16.681448 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91" Dec 05 20:13:16 crc kubenswrapper[4904]: E1205 20:13:16.681652 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:13:16 crc kubenswrapper[4904]: E1205 20:13:16.681750 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.702894 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.702951 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.702969 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.702993 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.703012 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:16Z","lastTransitionTime":"2025-12-05T20:13:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.805687 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.805757 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.805775 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.805807 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.805826 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:16Z","lastTransitionTime":"2025-12-05T20:13:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.908986 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.909096 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.909123 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.909151 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:16 crc kubenswrapper[4904]: I1205 20:13:16.909168 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:16Z","lastTransitionTime":"2025-12-05T20:13:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.012196 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.012259 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.012277 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.012305 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.012343 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:17Z","lastTransitionTime":"2025-12-05T20:13:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.116002 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.116110 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.116137 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.116164 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.116182 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:17Z","lastTransitionTime":"2025-12-05T20:13:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.218906 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.219010 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.219124 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.219152 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.219171 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:17Z","lastTransitionTime":"2025-12-05T20:13:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.322153 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.322228 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.322251 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.322281 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.322303 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:17Z","lastTransitionTime":"2025-12-05T20:13:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.425261 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.425329 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.425352 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.425379 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.425402 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:17Z","lastTransitionTime":"2025-12-05T20:13:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.527919 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.527948 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.527957 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.527969 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.527979 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:17Z","lastTransitionTime":"2025-12-05T20:13:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.630864 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.630944 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.630963 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.630987 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.631005 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:17Z","lastTransitionTime":"2025-12-05T20:13:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.733218 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.733259 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.733267 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.733281 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.733290 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:17Z","lastTransitionTime":"2025-12-05T20:13:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.835368 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.835443 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.835467 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.835498 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.835521 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:17Z","lastTransitionTime":"2025-12-05T20:13:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.938917 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.938987 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.939010 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.939041 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:17 crc kubenswrapper[4904]: I1205 20:13:17.939119 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:17Z","lastTransitionTime":"2025-12-05T20:13:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.042225 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.042273 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.042290 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.042315 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.042333 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:18Z","lastTransitionTime":"2025-12-05T20:13:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
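[Annotation] The five-event block (NodeHasSufficientMemory, NodeHasNoDiskPressure, NodeHasSufficientPID, NodeNotReady, then the setters.go:603 condition) repeats roughly every 100 ms for as long as the node stays NotReady, which makes captures like this hard to skim. A throwaway filter that collapses the noise to actual condition transitions, assuming journal text on stdin and the setters.go line shape seen above:

    # usage: journalctl -u kubelet --no-pager | python3 transitions.py
    import json, re, sys

    # Matches the setters.go entries above, e.g.
    #   ... setters.go:603] "Node became not ready" node="crc" condition={...}
    COND = re.compile(
        r'"Node became not ready" node="(?P<node>[^"]+)" condition=(?P<json>\{.*?\})'
    )

    last = {}
    for line in sys.stdin:
        m = COND.search(line)
        if not m:
            continue
        cond = json.loads(m.group("json"))
        key = (m.group("node"), cond["type"])
        state = (cond["status"], cond["reason"])
        if last.get(key) != state:      # report only actual transitions
            last[key] = state
            print(key, "->", state, "at", cond["lastTransitionTime"])

[End annotation]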
Has your network provider started?"} Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.145198 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.145241 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.145252 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.145270 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.145280 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:18Z","lastTransitionTime":"2025-12-05T20:13:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.247631 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.247676 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.247687 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.247705 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.247715 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:18Z","lastTransitionTime":"2025-12-05T20:13:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.350784 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.350834 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.350845 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.350861 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.350872 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:18Z","lastTransitionTime":"2025-12-05T20:13:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.453755 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.453840 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.453860 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.453888 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.453910 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:18Z","lastTransitionTime":"2025-12-05T20:13:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.557539 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.557568 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.557577 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.557592 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.557600 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:18Z","lastTransitionTime":"2025-12-05T20:13:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.660961 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.661016 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.661029 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.661050 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.661153 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:18Z","lastTransitionTime":"2025-12-05T20:13:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.680744 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.680798 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.680889 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.681017 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:13:18 crc kubenswrapper[4904]: E1205 20:13:18.681220 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:13:18 crc kubenswrapper[4904]: E1205 20:13:18.681305 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91" Dec 05 20:13:18 crc kubenswrapper[4904]: E1205 20:13:18.681504 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:13:18 crc kubenswrapper[4904]: E1205 20:13:18.682188 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.682634 4904 scope.go:117] "RemoveContainer" containerID="0056a74367d0e3cfa5e2d8715316dcb9d0f1fe4f0a9be8f8c29671005bba59d4" Dec 05 20:13:18 crc kubenswrapper[4904]: E1205 20:13:18.682891 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-dsvd6_openshift-ovn-kubernetes(55fbdf03-712c-4abc-9847-225fe63052e3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.764278 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.764338 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.764432 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.764455 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.764467 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:18Z","lastTransitionTime":"2025-12-05T20:13:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.867672 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.867769 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.867793 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.867831 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:18 crc kubenswrapper[4904]: I1205 20:13:18.867852 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:18Z","lastTransitionTime":"2025-12-05T20:13:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 05 20:13:19 crc kubenswrapper[4904]: I1205 20:13:19.475125 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91-metrics-certs\") pod \"network-metrics-daemon-d8xkk\" (UID: \"fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91\") " pod="openshift-multus/network-metrics-daemon-d8xkk"
Dec 05 20:13:19 crc kubenswrapper[4904]: E1205 20:13:19.475400 4904 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 05 20:13:19 crc kubenswrapper[4904]: E1205 20:13:19.475524 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91-metrics-certs podName:fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91 nodeName:}" failed. No retries permitted until 2025-12-05 20:14:23.475495164 +0000 UTC m=+162.286711313 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91-metrics-certs") pod "network-metrics-daemon-d8xkk" (UID: "fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91") : object "openshift-multus"/"metrics-daemon-secret" not registered
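The failed metrics-certs mount is not retried immediately; the operation is parked for 1m4s ("No retries permitted until ... durationBeforeRetry 1m4s"). A 64s pause is exactly what a doubling per-operation back-off produces after several consecutive failures. The 500ms floor and roughly 2m ceiling below are assumptions modeled on the kubelet volume manager's usual defaults, not values printed in this log:

```go
package main

import (
	"fmt"
	"time"
)

// Assumed defaults for the kubelet's pending-operations back-off:
// start at 500ms, double per failure, cap at about 2m2s. Illustrative only.
const (
	initialRetry = 500 * time.Millisecond
	maxRetry     = 2*time.Minute + 2*time.Second
)

func durationBeforeRetry(failures int) time.Duration {
	d := initialRetry
	for i := 1; i < failures; i++ {
		d *= 2
		if d > maxRetry {
			return maxRetry
		}
	}
	return d
}

func main() {
	for f := 1; f <= 9; f++ {
		// failures=8 -> 1m4s, the delay reported for the metrics-certs mount.
		fmt.Printf("failure %d -> retry in %v\n", f, durationBeforeRetry(f))
	}
}
```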
Dec 05 20:13:20 crc kubenswrapper[4904]: I1205 20:13:20.680793 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 20:13:20 crc kubenswrapper[4904]: I1205 20:13:20.680805 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 20:13:20 crc kubenswrapper[4904]: I1205 20:13:20.680852 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 20:13:20 crc kubenswrapper[4904]: I1205 20:13:20.680828 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk"
Dec 05 20:13:20 crc kubenswrapper[4904]: E1205 20:13:20.681433 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 20:13:20 crc kubenswrapper[4904]: E1205 20:13:20.681796 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 20:13:20 crc kubenswrapper[4904]: E1205 20:13:20.681969 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91"
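Both the NotReady heartbeats and the pod sync errors above trace back to one condition: /etc/kubernetes/cni/net.d/ contains no CNI network configuration, because the crash-looping ovnkube-controller never writes the OVN-Kubernetes config. Until a file appears there, the runtime reports NetworkReady=false and no new pod sandbox can be wired up. A rough sketch of that directory check; the accepted extensions follow common libcni conventions and are an assumption, not the exact CRI-O logic:

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// cniConfigPresent reports whether dir contains at least one CNI network
// configuration file. The extension list (.conf, .conflist, .json) mirrors
// the usual libcni convention; this is a sketch, not the runtime's code.
func cniConfigPresent(dir string) (bool, error) {
	entries, err := os.ReadDir(dir)
	if err != nil {
		return false, err
	}
	for _, e := range entries {
		if e.IsDir() {
			continue
		}
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			return true, nil
		}
	}
	return false, nil
}

func main() {
	ok, err := cniConfigPresent("/etc/kubernetes/cni/net.d")
	if err != nil {
		fmt.Println("cannot read CNI config dir:", err)
		return
	}
	if !ok {
		// This is the state the log is stuck in: NetworkReady=false.
		fmt.Println("no CNI configuration file found; network plugin not ready")
	}
}
```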
pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91" Dec 05 20:13:20 crc kubenswrapper[4904]: E1205 20:13:20.682882 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:13:20 crc kubenswrapper[4904]: I1205 20:13:20.727669 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:20 crc kubenswrapper[4904]: I1205 20:13:20.727726 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:20 crc kubenswrapper[4904]: I1205 20:13:20.727740 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:20 crc kubenswrapper[4904]: I1205 20:13:20.727760 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:20 crc kubenswrapper[4904]: I1205 20:13:20.727777 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:20Z","lastTransitionTime":"2025-12-05T20:13:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:20 crc kubenswrapper[4904]: I1205 20:13:20.830750 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:20 crc kubenswrapper[4904]: I1205 20:13:20.830796 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:20 crc kubenswrapper[4904]: I1205 20:13:20.830807 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:20 crc kubenswrapper[4904]: I1205 20:13:20.830824 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:20 crc kubenswrapper[4904]: I1205 20:13:20.830835 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:20Z","lastTransitionTime":"2025-12-05T20:13:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 05 20:13:21 crc kubenswrapper[4904]: I1205 20:13:21.655766 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:13:21 crc kubenswrapper[4904]: I1205 20:13:21.655809 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:13:21 crc kubenswrapper[4904]: I1205 20:13:21.655820 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:13:21 crc kubenswrapper[4904]: I1205 20:13:21.655878 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:13:21 crc kubenswrapper[4904]: I1205 20:13:21.655891 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:21Z","lastTransitionTime":"2025-12-05T20:13:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
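From 20:13:21 the status_manager entries that follow add a second, independent failure: every pod status patch is rejected because the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 presents a serving certificate that expired on 2025-08-24T17:21:41Z, while the node clock reads 2025-12-05, consistent with a cluster whose certificates aged out while it was offline. The NotAfter comparison that the TLS handshake fails can be reproduced with crypto/x509; the certificate path below is a placeholder, not a path from the log:

```go
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
	"time"
)

func main() {
	// Placeholder path: point this at the webhook's serving certificate.
	raw, err := os.ReadFile("/path/to/webhook-serving-cert.pem")
	if err != nil {
		fmt.Println("read cert:", err)
		return
	}
	block, _ := pem.Decode(raw)
	if block == nil {
		fmt.Println("no PEM block found")
		return
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		fmt.Println("parse cert:", err)
		return
	}
	now := time.Now()
	// The handshake in the log failed exactly this comparison: current time
	// 2025-12-05T20:13:21Z is after NotAfter 2025-08-24T17:21:41Z.
	if now.After(cert.NotAfter) {
		fmt.Printf("certificate expired: now %s is after NotAfter %s\n",
			now.UTC().Format(time.RFC3339), cert.NotAfter.UTC().Format(time.RFC3339))
	} else {
		fmt.Println("certificate is within its validity window")
	}
}
```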
Has your network provider started?"} Dec 05 20:13:21 crc kubenswrapper[4904]: I1205 20:13:21.700111 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:21Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:21 crc kubenswrapper[4904]: I1205 20:13:21.720710 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gfzvv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7010629bd83e0fb435b59dd47808f31c7694e158d77048ed22a54668a8b2712d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e5e16893370ed8543ecf33c4b2373bda50a424810470178390a553f2755323d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:12:48Z\\\",\\\"message\\\":\\\"2025-12-05T20:12:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8ac1f239-7e8e-4064-b3f6-179ec832c336\\\\n2025-12-05T20:12:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8ac1f239-7e8e-4064-b3f6-179ec832c336 to /host/opt/cni/bin/\\\\n2025-12-05T20:12:03Z [verbose] multus-daemon started\\\\n2025-12-05T20:12:03Z [verbose] Readiness Indicator file check\\\\n2025-12-05T20:12:48Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6gph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gfzvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:21Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:21 crc kubenswrapper[4904]: I1205 20:13:21.738925 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:21Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:21 crc kubenswrapper[4904]: I1205 20:13:21.755960 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8v9t9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78ec080e-d24e-458b-8622-465dd74773a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c927c7ee8c52a2f1bcca785070514c72b8eafa9cd7d4f58d77fd80f1dee6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj68j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\
\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8v9t9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:21Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:21 crc kubenswrapper[4904]: I1205 20:13:21.758185 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:21 crc kubenswrapper[4904]: I1205 20:13:21.758233 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:21 crc kubenswrapper[4904]: I1205 20:13:21.758250 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:21 crc kubenswrapper[4904]: I1205 20:13:21.758274 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:21 crc kubenswrapper[4904]: I1205 20:13:21.758292 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:21Z","lastTransitionTime":"2025-12-05T20:13:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:21 crc kubenswrapper[4904]: I1205 20:13:21.772269 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cc24b64-e25f-4b55-9123-295388685e7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa5a898deb51018e4a1196537853e1d8fda8121a2b87d6d3ebb4ab04ad6a0507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08368507f1a5c9ee1b93c9a2b111939cbe36a99f87758d0de78b146309871438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-476kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffd2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:21Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:21 crc kubenswrapper[4904]: I1205 20:13:21.799272 4904 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-52vmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ef1f1c8-4ced-4af5-80b0-404e1f6f8796\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://140e572b1d15ec9aec68ea9d5b436e7ec2a1e886ed70b12c5ed011a12aa7646a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d3391fbe15e6fb579ff87e3f77478481d5afc221f8d831870343f0611bc440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d4de09c32ad0a39c1af4ed1a33c7884e17394a487794a738a52bc8ca6c41b44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36fee8ff3492bd035b429ae0bf972aee65f4229764e591d2f8cee04c4ca6821a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a93841abfc118f4431196949d70661aa628e9fba7624dde43138b4f24cbe2072\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70d2e0be6c7f9323eb0e67dbd29decf95a1603e98c7b2246e97c6871e2310f2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292e9624aeb54e607935705202d1301522db8ee3045ae8dd92f26024ca952e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49m2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-52vmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:21Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:21 crc kubenswrapper[4904]: I1205 20:13:21.813791 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d8xkk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7fjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7fjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d8xkk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:21Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:21 crc kubenswrapper[4904]: I1205 20:13:21.832224 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf1f4c63-efa9-43dd-9fd4-56f617de9b74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dc00b34dfd0d80ded26bfcc542e84550c50fe8dfabb2eab1f8b39c0402930ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbeab9029a6d82f443afa458697942e9ce20465077f1adefc0123dfab2ea76a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22659d9c71fc60ad0348f1031cb52b72af9f93c69c11c04c140aece5e90efb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fce601ec5b7e54b14668b59d90721a107f392ed41a54f1a5b0446d48fbdb50f\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fce601ec5b7e54b14668b59d90721a107f392ed41a54f1a5b0446d48fbdb50f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:21Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:21 crc kubenswrapper[4904]: I1205 20:13:21.847936 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83c328b0-0fa7-480b-bf28-075f3ce5db83\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36e445d3f09be60c5ad19cef1b1690e64bafa28df150dc57f95c520d57840e54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edab3d954e07cd7d8c0a3ad643b636d5d1382b32f805b370cd79c8462d9f279e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edab3d954e07cd7d8c0a3ad643b636d5d1382b32f805b370cd79c8462d9f279e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:21Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:21 crc kubenswrapper[4904]: I1205 20:13:21.860570 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:21 crc kubenswrapper[4904]: I1205 20:13:21.860637 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:21 crc kubenswrapper[4904]: I1205 20:13:21.860654 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:21 crc kubenswrapper[4904]: I1205 20:13:21.860684 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:21 crc kubenswrapper[4904]: I1205 20:13:21.860703 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:21Z","lastTransitionTime":"2025-12-05T20:13:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:21 crc kubenswrapper[4904]: I1205 20:13:21.870934 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"519e4f29-5762-4a86-9f31-eae681598fd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a972523cbe1d769fb13c365f030eab38907c5f877d667e034c62c5f3a0e43316\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"(now=2025-12-05 20:12:01.575139975 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575358 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764965521\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764965521\\\\\\\\\\\\\\\" (2025-12-05 19:12:01 +0000 UTC to 2026-12-05 19:12:01 +0000 UTC (now=2025-12-05 20:12:01.57533306 +0000 UTC))\\\\\\\"\\\\nI1205 20:12:01.575383 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 20:12:01.575413 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 20:12:01.575451 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575485 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 20:12:01.575526 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-341598907/tls.crt::/tmp/serving-cert-341598907/tls.key\\\\\\\"\\\\nI1205 20:12:01.575666 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1205 20:12:01.575882 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 20:12:01.575895 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 20:12:01.576170 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 20:12:01.576169 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 20:12:01.576203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 20:12:01.576188 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:21Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:21 crc kubenswrapper[4904]: I1205 20:13:21.890200 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ba4034f-d40d-4c5d-a52b-c9895a8a7f0b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://073f99fe786948da31c584468c3562984559623b590ba77c7925b79a1bb6d973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fb5cbfee6c3da72b94b14ed486036e65308444c961b583e691cc89e604a11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec935cce3c02816ac1f6162ca3a1701cd06dd930d3024ff958a4d92cf1dc0821\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736f67bbeab4ebc7e144d6d4af34487d18c3c8362425e45a9e0969b61d896666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:21Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:21 crc kubenswrapper[4904]: I1205 20:13:21.908820 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:21Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:21 crc kubenswrapper[4904]: I1205 20:13:21.925975 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t96qs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5f4b5ef-d8d0-4591-a19c-a8347f74e833\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6624b0a035b3190da055f2ec82cb45b59268d035933912d9b87a20fd4c1cf128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9wkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b9a9e4321317cf95ec97bae86da81152ecba3d49e499d83e113d89e8b997247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2
099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9wkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t96qs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:21Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:21 crc kubenswrapper[4904]: I1205 20:13:21.947411 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89575271b3bd2ad2c040d9e6b00f221610af97e8dd8b120a2956b75f8bbdd237\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-05T20:13:21Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:21 crc kubenswrapper[4904]: I1205 20:13:21.960683 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aa73309acebd7dcba36d2990a8cf1f6ef1fbf78086a1455b158945063e1da3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b0deb0769ed0ae3221e161c0aab68192a5d5d1f511ccfaf8c6ba2f01334de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:21Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:21 crc kubenswrapper[4904]: I1205 20:13:21.964400 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:21 crc 
kubenswrapper[4904]: I1205 20:13:21.964461 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:21 crc kubenswrapper[4904]: I1205 20:13:21.964481 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:21 crc kubenswrapper[4904]: I1205 20:13:21.964509 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:21 crc kubenswrapper[4904]: I1205 20:13:21.964529 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:21Z","lastTransitionTime":"2025-12-05T20:13:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:21 crc kubenswrapper[4904]: I1205 20:13:21.973182 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0577f397a9f6d93347cbc4c8693307a1df70636365bb481d910f29aa70a3147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:21Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:21 crc kubenswrapper[4904]: I1205 20:13:21.984154 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-67k68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ff8a0c-1191-4afd-8bc7-1b18fac7e568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cf77c02028027dc6d07f225e1f2352cd58a2a96280315e23f43b50e7c719cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzqtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-67k68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:21Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.004133 4904 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55fbdf03-712c-4abc-9847-225fe63052e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0056a74367d0e3cfa5e2d8715316dcb9d0f1fe4f0a9be8f8c29671005bba59d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0056a74367d0e3cfa5e2d8715316dcb9d0f1fe4f0a9be8f8c29671005bba59d4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:13:05Z\\\",\\\"message\\\":\\\"ces.LB{Name:\\\\\\\"Service_openshift-dns-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.174\\\\\\\", Port:9393, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 20:13:05.358749 7009 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns-operator/metrics]} name:Service_openshift-dns-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.174:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d937b3b3-82c3-4791-9a66-41b9fed53e9d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:13:05.358788 7009 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:13:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dsvd6_openshift-ovn-kubernetes(55fbdf03-712c-4abc-9847-225fe63052e3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d7kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:12:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dsvd6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:22Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.068287 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.068347 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.068364 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.068388 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.068405 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:22Z","lastTransitionTime":"2025-12-05T20:13:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.170656 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.170699 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.170711 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.170728 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.170740 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:22Z","lastTransitionTime":"2025-12-05T20:13:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.273089 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.273131 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.273144 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.273160 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.273172 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:22Z","lastTransitionTime":"2025-12-05T20:13:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.376046 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.376111 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.376122 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.376138 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.376149 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:22Z","lastTransitionTime":"2025-12-05T20:13:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.478742 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.478881 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.478908 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.478940 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.478964 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:22Z","lastTransitionTime":"2025-12-05T20:13:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.581778 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.581823 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.581835 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.581877 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.581890 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:22Z","lastTransitionTime":"2025-12-05T20:13:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.681283 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.681365 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.681377 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.681304 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:13:22 crc kubenswrapper[4904]: E1205 20:13:22.681673 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:13:22 crc kubenswrapper[4904]: E1205 20:13:22.681727 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:13:22 crc kubenswrapper[4904]: E1205 20:13:22.681471 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:13:22 crc kubenswrapper[4904]: E1205 20:13:22.681844 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91" Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.683810 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.683855 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.683866 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.683918 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.683932 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:22Z","lastTransitionTime":"2025-12-05T20:13:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.786864 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.786922 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.786940 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.786964 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.786980 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:22Z","lastTransitionTime":"2025-12-05T20:13:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.889910 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.889991 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.890029 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.890099 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.890124 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:22Z","lastTransitionTime":"2025-12-05T20:13:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.993815 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.993874 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.993894 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.993925 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:22 crc kubenswrapper[4904]: I1205 20:13:22.993950 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:22Z","lastTransitionTime":"2025-12-05T20:13:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:23 crc kubenswrapper[4904]: I1205 20:13:23.096600 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:23 crc kubenswrapper[4904]: I1205 20:13:23.096667 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:23 crc kubenswrapper[4904]: I1205 20:13:23.096686 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:23 crc kubenswrapper[4904]: I1205 20:13:23.096718 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:23 crc kubenswrapper[4904]: I1205 20:13:23.096739 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:23Z","lastTransitionTime":"2025-12-05T20:13:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:23 crc kubenswrapper[4904]: I1205 20:13:23.199371 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:23 crc kubenswrapper[4904]: I1205 20:13:23.199412 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:23 crc kubenswrapper[4904]: I1205 20:13:23.199423 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:23 crc kubenswrapper[4904]: I1205 20:13:23.199439 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:23 crc kubenswrapper[4904]: I1205 20:13:23.199452 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:23Z","lastTransitionTime":"2025-12-05T20:13:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:23 crc kubenswrapper[4904]: I1205 20:13:23.302134 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:23 crc kubenswrapper[4904]: I1205 20:13:23.302175 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:23 crc kubenswrapper[4904]: I1205 20:13:23.302186 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:23 crc kubenswrapper[4904]: I1205 20:13:23.302204 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:23 crc kubenswrapper[4904]: I1205 20:13:23.302215 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:23Z","lastTransitionTime":"2025-12-05T20:13:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:23 crc kubenswrapper[4904]: I1205 20:13:23.405132 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:23 crc kubenswrapper[4904]: I1205 20:13:23.405172 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:23 crc kubenswrapper[4904]: I1205 20:13:23.405180 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:23 crc kubenswrapper[4904]: I1205 20:13:23.405195 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:23 crc kubenswrapper[4904]: I1205 20:13:23.405205 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:23Z","lastTransitionTime":"2025-12-05T20:13:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:23 crc kubenswrapper[4904]: I1205 20:13:23.507502 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:23 crc kubenswrapper[4904]: I1205 20:13:23.507574 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:23 crc kubenswrapper[4904]: I1205 20:13:23.507609 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:23 crc kubenswrapper[4904]: I1205 20:13:23.507653 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:23 crc kubenswrapper[4904]: I1205 20:13:23.507681 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:23Z","lastTransitionTime":"2025-12-05T20:13:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:23 crc kubenswrapper[4904]: I1205 20:13:23.609895 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:23 crc kubenswrapper[4904]: I1205 20:13:23.609923 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:23 crc kubenswrapper[4904]: I1205 20:13:23.609931 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:23 crc kubenswrapper[4904]: I1205 20:13:23.609947 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:23 crc kubenswrapper[4904]: I1205 20:13:23.609963 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:23Z","lastTransitionTime":"2025-12-05T20:13:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:23 crc kubenswrapper[4904]: I1205 20:13:23.712445 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:23 crc kubenswrapper[4904]: I1205 20:13:23.712522 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:23 crc kubenswrapper[4904]: I1205 20:13:23.712540 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:23 crc kubenswrapper[4904]: I1205 20:13:23.712566 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:23 crc kubenswrapper[4904]: I1205 20:13:23.712583 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:23Z","lastTransitionTime":"2025-12-05T20:13:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:23 crc kubenswrapper[4904]: I1205 20:13:23.815442 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:23 crc kubenswrapper[4904]: I1205 20:13:23.815523 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:23 crc kubenswrapper[4904]: I1205 20:13:23.815545 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:23 crc kubenswrapper[4904]: I1205 20:13:23.815578 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:23 crc kubenswrapper[4904]: I1205 20:13:23.815602 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:23Z","lastTransitionTime":"2025-12-05T20:13:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:23 crc kubenswrapper[4904]: I1205 20:13:23.918554 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:23 crc kubenswrapper[4904]: I1205 20:13:23.918642 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:23 crc kubenswrapper[4904]: I1205 20:13:23.918665 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:23 crc kubenswrapper[4904]: I1205 20:13:23.918695 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:23 crc kubenswrapper[4904]: I1205 20:13:23.918713 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:23Z","lastTransitionTime":"2025-12-05T20:13:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.021815 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.021875 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.021913 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.021955 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.021977 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:24Z","lastTransitionTime":"2025-12-05T20:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.125953 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.125999 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.126008 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.126022 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.126033 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:24Z","lastTransitionTime":"2025-12-05T20:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.228807 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.228855 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.228866 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.228884 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.228897 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:24Z","lastTransitionTime":"2025-12-05T20:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.330989 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.331051 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.331098 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.331127 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.331145 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:24Z","lastTransitionTime":"2025-12-05T20:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.434254 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.434334 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.434359 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.434385 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.434406 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:24Z","lastTransitionTime":"2025-12-05T20:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.537203 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.537249 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.537259 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.537272 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.537281 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:24Z","lastTransitionTime":"2025-12-05T20:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.640267 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.640330 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.640349 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.640373 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.640391 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:24Z","lastTransitionTime":"2025-12-05T20:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.680946 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.680977 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.680977 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.681048 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:13:24 crc kubenswrapper[4904]: E1205 20:13:24.681212 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:13:24 crc kubenswrapper[4904]: E1205 20:13:24.681341 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:13:24 crc kubenswrapper[4904]: E1205 20:13:24.681588 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91" Dec 05 20:13:24 crc kubenswrapper[4904]: E1205 20:13:24.681705 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.743775 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.743842 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.743859 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.743883 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.743901 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:24Z","lastTransitionTime":"2025-12-05T20:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.846903 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.846945 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.846957 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.846975 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.846986 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:24Z","lastTransitionTime":"2025-12-05T20:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.950340 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.950404 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.950427 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.950454 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:24 crc kubenswrapper[4904]: I1205 20:13:24.950480 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:24Z","lastTransitionTime":"2025-12-05T20:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.052745 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.052812 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.052828 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.052851 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.052867 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:25Z","lastTransitionTime":"2025-12-05T20:13:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.156231 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.156288 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.156305 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.156327 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.156343 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:25Z","lastTransitionTime":"2025-12-05T20:13:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.258807 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.258855 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.258866 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.258880 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.258890 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:25Z","lastTransitionTime":"2025-12-05T20:13:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.361691 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.361735 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.361743 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.361759 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.361768 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:25Z","lastTransitionTime":"2025-12-05T20:13:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.464231 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.464382 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.464411 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.464443 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.464468 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:25Z","lastTransitionTime":"2025-12-05T20:13:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.568324 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.568392 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.568418 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.568452 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.568476 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:25Z","lastTransitionTime":"2025-12-05T20:13:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.670471 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.670505 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.670514 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.670527 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.670537 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:25Z","lastTransitionTime":"2025-12-05T20:13:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.772216 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.772249 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.772256 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.772269 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.772280 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:25Z","lastTransitionTime":"2025-12-05T20:13:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.875665 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.875763 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.875790 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.875819 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.875841 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:25Z","lastTransitionTime":"2025-12-05T20:13:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.978602 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.978661 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.978675 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.978697 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:25 crc kubenswrapper[4904]: I1205 20:13:25.978711 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:25Z","lastTransitionTime":"2025-12-05T20:13:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.081102 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.081136 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.081146 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.081159 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.081169 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:26Z","lastTransitionTime":"2025-12-05T20:13:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.182519 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.182572 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.182588 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.182608 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.182624 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:26Z","lastTransitionTime":"2025-12-05T20:13:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.285259 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.285296 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.285336 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.285352 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.285365 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:26Z","lastTransitionTime":"2025-12-05T20:13:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.387742 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.387793 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.387805 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.387823 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.387836 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:26Z","lastTransitionTime":"2025-12-05T20:13:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.491050 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.491127 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.491141 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.491157 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.491167 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:26Z","lastTransitionTime":"2025-12-05T20:13:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.593163 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.593244 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.593267 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.593295 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.593315 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:26Z","lastTransitionTime":"2025-12-05T20:13:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.680668 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.680682 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.680691 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.680711 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:13:26 crc kubenswrapper[4904]: E1205 20:13:26.681200 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:13:26 crc kubenswrapper[4904]: E1205 20:13:26.681282 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:13:26 crc kubenswrapper[4904]: E1205 20:13:26.681303 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:13:26 crc kubenswrapper[4904]: E1205 20:13:26.681358 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.696436 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.696494 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.696515 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.696543 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.696565 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:26Z","lastTransitionTime":"2025-12-05T20:13:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.705643 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.785634 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.785720 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.785741 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.785843 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.785888 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:26Z","lastTransitionTime":"2025-12-05T20:13:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:26 crc kubenswrapper[4904]: E1205 20:13:26.804303 4904 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"54b04eb7-ddef-4ce3-9daa-6d051611390c\\\",\\\"systemUUID\\\":\\\"cfb4a8d1-bf0d-4f7a-95e4-63a42b6fb559\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:26Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.808621 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.808650 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.808661 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.808677 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.808689 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:26Z","lastTransitionTime":"2025-12-05T20:13:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:26 crc kubenswrapper[4904]: E1205 20:13:26.828806 4904 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"54b04eb7-ddef-4ce3-9daa-6d051611390c\\\",\\\"systemUUID\\\":\\\"cfb4a8d1-bf0d-4f7a-95e4-63a42b6fb559\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:26Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.833611 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.833643 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.833654 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.833671 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.833682 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:26Z","lastTransitionTime":"2025-12-05T20:13:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:26 crc kubenswrapper[4904]: E1205 20:13:26.854538 4904 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"54b04eb7-ddef-4ce3-9daa-6d051611390c\\\",\\\"systemUUID\\\":\\\"cfb4a8d1-bf0d-4f7a-95e4-63a42b6fb559\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:26Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.859672 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.859701 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.859712 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.859727 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.859738 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:26Z","lastTransitionTime":"2025-12-05T20:13:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:26 crc kubenswrapper[4904]: E1205 20:13:26.881586 4904 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"54b04eb7-ddef-4ce3-9daa-6d051611390c\\\",\\\"systemUUID\\\":\\\"cfb4a8d1-bf0d-4f7a-95e4-63a42b6fb559\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:26Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.886880 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.886957 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.886969 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.886994 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.887007 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:26Z","lastTransitionTime":"2025-12-05T20:13:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:26 crc kubenswrapper[4904]: E1205 20:13:26.906843 4904 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:13:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:13:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"54b04eb7-ddef-4ce3-9daa-6d051611390c\\\",\\\"systemUUID\\\":\\\"cfb4a8d1-bf0d-4f7a-95e4-63a42b6fb559\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:13:26Z is after 2025-08-24T17:21:41Z" Dec 05 20:13:26 crc kubenswrapper[4904]: E1205 20:13:26.907010 4904 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.909546 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.909598 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.909610 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.909635 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:26 crc kubenswrapper[4904]: I1205 20:13:26.909651 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:26Z","lastTransitionTime":"2025-12-05T20:13:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.013171 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.013238 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.013257 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.013281 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.013299 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:27Z","lastTransitionTime":"2025-12-05T20:13:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.116520 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.116594 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.116613 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.116639 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.116657 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:27Z","lastTransitionTime":"2025-12-05T20:13:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.220702 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.220773 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.220786 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.220808 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.220826 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:27Z","lastTransitionTime":"2025-12-05T20:13:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.323699 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.323767 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.323789 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.323813 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.323829 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:27Z","lastTransitionTime":"2025-12-05T20:13:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.425967 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.426012 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.426023 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.426045 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.426100 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:27Z","lastTransitionTime":"2025-12-05T20:13:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.528319 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.528359 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.528367 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.528381 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.528392 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:27Z","lastTransitionTime":"2025-12-05T20:13:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.630413 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.630465 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.630493 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.630513 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.630525 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:27Z","lastTransitionTime":"2025-12-05T20:13:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.733441 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.733490 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.733501 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.733517 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.733530 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:27Z","lastTransitionTime":"2025-12-05T20:13:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.835832 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.835886 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.835898 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.835916 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.835930 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:27Z","lastTransitionTime":"2025-12-05T20:13:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.938395 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.938471 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.938492 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.938516 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:27 crc kubenswrapper[4904]: I1205 20:13:27.938532 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:27Z","lastTransitionTime":"2025-12-05T20:13:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.041593 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.041641 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.041652 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.041669 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.041680 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:28Z","lastTransitionTime":"2025-12-05T20:13:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.144522 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.144597 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.144619 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.144646 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.144664 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:28Z","lastTransitionTime":"2025-12-05T20:13:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.247991 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.248052 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.248129 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.248158 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.248178 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:28Z","lastTransitionTime":"2025-12-05T20:13:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.350140 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.350225 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.350253 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.350284 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.350308 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:28Z","lastTransitionTime":"2025-12-05T20:13:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.453788 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.453888 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.453909 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.453936 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.453954 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:28Z","lastTransitionTime":"2025-12-05T20:13:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.556703 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.556768 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.556784 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.556804 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.556820 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:28Z","lastTransitionTime":"2025-12-05T20:13:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.659482 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.659530 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.659545 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.659568 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.659584 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:28Z","lastTransitionTime":"2025-12-05T20:13:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.680863 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:13:28 crc kubenswrapper[4904]: E1205 20:13:28.680990 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.681049 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:13:28 crc kubenswrapper[4904]: E1205 20:13:28.681127 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.681167 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:13:28 crc kubenswrapper[4904]: E1205 20:13:28.681209 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91" Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.681352 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:13:28 crc kubenswrapper[4904]: E1205 20:13:28.681542 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.762143 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.762201 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.762218 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.762242 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.762259 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:28Z","lastTransitionTime":"2025-12-05T20:13:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.865000 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.865051 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.865108 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.865149 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.865171 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:28Z","lastTransitionTime":"2025-12-05T20:13:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.967504 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.967540 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.967551 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.967566 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:28 crc kubenswrapper[4904]: I1205 20:13:28.967576 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:28Z","lastTransitionTime":"2025-12-05T20:13:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.069802 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.069914 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.069934 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.069955 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.069971 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:29Z","lastTransitionTime":"2025-12-05T20:13:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.172899 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.172967 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.172984 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.173017 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.173037 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:29Z","lastTransitionTime":"2025-12-05T20:13:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.275210 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.275254 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.275263 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.275279 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.275290 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:29Z","lastTransitionTime":"2025-12-05T20:13:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.377527 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.377596 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.377614 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.377640 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.377658 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:29Z","lastTransitionTime":"2025-12-05T20:13:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.480391 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.480464 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.480489 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.480566 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.480592 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:29Z","lastTransitionTime":"2025-12-05T20:13:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.584350 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.584426 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.584461 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.584497 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.584514 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:29Z","lastTransitionTime":"2025-12-05T20:13:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.687227 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.687260 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.687270 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.687286 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.687297 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:29Z","lastTransitionTime":"2025-12-05T20:13:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.789516 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.789565 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.789581 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.789604 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.789626 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:29Z","lastTransitionTime":"2025-12-05T20:13:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.891935 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.892240 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.892352 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.892505 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.892607 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:29Z","lastTransitionTime":"2025-12-05T20:13:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.995571 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.995608 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.995616 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.995645 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:29 crc kubenswrapper[4904]: I1205 20:13:29.995657 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:29Z","lastTransitionTime":"2025-12-05T20:13:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.097718 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.097794 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.097819 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.097853 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.097877 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:30Z","lastTransitionTime":"2025-12-05T20:13:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.200230 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.200260 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.200268 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.200281 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.200290 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:30Z","lastTransitionTime":"2025-12-05T20:13:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.302731 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.302772 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.302785 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.302801 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.302814 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:30Z","lastTransitionTime":"2025-12-05T20:13:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.405936 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.406019 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.406047 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.406136 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.406152 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:30Z","lastTransitionTime":"2025-12-05T20:13:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.509599 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.509669 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.509697 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.509726 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.509746 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:30Z","lastTransitionTime":"2025-12-05T20:13:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.611958 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.612011 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.612027 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.612048 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.612093 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:30Z","lastTransitionTime":"2025-12-05T20:13:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.680284 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.680320 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.680353 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.680374 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:13:30 crc kubenswrapper[4904]: E1205 20:13:30.680460 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:13:30 crc kubenswrapper[4904]: E1205 20:13:30.680544 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91" Dec 05 20:13:30 crc kubenswrapper[4904]: E1205 20:13:30.680630 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:13:30 crc kubenswrapper[4904]: E1205 20:13:30.680720 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.715232 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.715307 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.715320 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.715342 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.715356 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:30Z","lastTransitionTime":"2025-12-05T20:13:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.818230 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.818287 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.818304 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.818326 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.818343 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:30Z","lastTransitionTime":"2025-12-05T20:13:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.920935 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.920977 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.920987 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.921004 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:30 crc kubenswrapper[4904]: I1205 20:13:30.921015 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:30Z","lastTransitionTime":"2025-12-05T20:13:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.024342 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.024430 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.024452 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.024484 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.024510 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:31Z","lastTransitionTime":"2025-12-05T20:13:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.127611 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.127674 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.127715 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.127752 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.127776 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:31Z","lastTransitionTime":"2025-12-05T20:13:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.230244 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.230308 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.230327 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.230352 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.230370 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:31Z","lastTransitionTime":"2025-12-05T20:13:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.333423 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.333475 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.333492 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.333513 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.333532 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:31Z","lastTransitionTime":"2025-12-05T20:13:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.436477 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.436536 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.436554 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.436576 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.436594 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:31Z","lastTransitionTime":"2025-12-05T20:13:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.539551 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.539915 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.540096 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.540267 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.540409 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:31Z","lastTransitionTime":"2025-12-05T20:13:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.643578 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.643918 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.644033 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.644170 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.644273 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:31Z","lastTransitionTime":"2025-12-05T20:13:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.746944 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.746976 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.746987 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.747001 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.747012 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:31Z","lastTransitionTime":"2025-12-05T20:13:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.763054 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-67k68" podStartSLOduration=90.763031499 podStartE2EDuration="1m30.763031499s" podCreationTimestamp="2025-12-05 20:12:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:13:31.763008098 +0000 UTC m=+110.574224247" watchObservedRunningTime="2025-12-05 20:13:31.763031499 +0000 UTC m=+110.574247618" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.810451 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=5.810429578 podStartE2EDuration="5.810429578s" podCreationTimestamp="2025-12-05 20:13:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:13:31.810367556 +0000 UTC m=+110.621583685" watchObservedRunningTime="2025-12-05 20:13:31.810429578 +0000 UTC m=+110.621645687" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.843499 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-gfzvv" podStartSLOduration=90.843480372 podStartE2EDuration="1m30.843480372s" podCreationTimestamp="2025-12-05 20:12:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:13:31.842475064 +0000 UTC m=+110.653691183" watchObservedRunningTime="2025-12-05 20:13:31.843480372 +0000 UTC m=+110.654696491" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.849092 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.849120 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.849133 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.849148 4904 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeNotReady" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.849159 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:31Z","lastTransitionTime":"2025-12-05T20:13:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.879290 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-8v9t9" podStartSLOduration=90.879268854 podStartE2EDuration="1m30.879268854s" podCreationTimestamp="2025-12-05 20:12:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:13:31.868557581 +0000 UTC m=+110.679773690" watchObservedRunningTime="2025-12-05 20:13:31.879268854 +0000 UTC m=+110.690484983" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.896437 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podStartSLOduration=90.896418139 podStartE2EDuration="1m30.896418139s" podCreationTimestamp="2025-12-05 20:12:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:13:31.879977354 +0000 UTC m=+110.691193483" watchObservedRunningTime="2025-12-05 20:13:31.896418139 +0000 UTC m=+110.707634248" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.910389 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-52vmw" podStartSLOduration=90.910367453 podStartE2EDuration="1m30.910367453s" podCreationTimestamp="2025-12-05 20:12:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:13:31.897235072 +0000 UTC m=+110.708451191" watchObservedRunningTime="2025-12-05 20:13:31.910367453 +0000 UTC m=+110.721583562" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.939317 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=63.939299151 podStartE2EDuration="1m3.939299151s" podCreationTimestamp="2025-12-05 20:12:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:13:31.92969459 +0000 UTC m=+110.740910699" watchObservedRunningTime="2025-12-05 20:13:31.939299151 +0000 UTC m=+110.750515260" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.950620 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.950643 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.950651 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.950663 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" 
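
In the pod_startup_latency_tracker.go records above and below, the podStartSLOduration figure is plain arithmetic over the logged fields: in these records it equals watchObservedRunningTime minus podCreationTimestamp (podStartE2EDuration is the same value rendered as a Go duration). The repeated setters.go records likewise embed the node's Ready condition as inline JSON. A minimal sketch that reproduces both from values copied out of the surrounding records — this is illustrative parsing of the log text, not a kubelet API, and the condition message is truncated here:

    import json
    from datetime import datetime, timezone

    # Timestamps copied from the node-ca-67k68 record below.
    pod_creation     = "2025-12-05 20:12:01 +0000 UTC"            # podCreationTimestamp
    observed_running = "2025-12-05 20:13:31.763031499 +0000 UTC"  # watchObservedRunningTime

    def parse(ts: str) -> datetime:
        # Trim the Go-style " +0000 UTC" suffix; datetime carries microseconds only,
        # so truncate the nanosecond fraction to six digits before parsing.
        ts = ts.replace(" +0000 UTC", "")
        if "." in ts:
            head, frac = ts.split(".")
            return datetime.strptime(head + "." + frac[:6],
                                     "%Y-%m-%d %H:%M:%S.%f").replace(tzinfo=timezone.utc)
        return datetime.strptime(ts, "%Y-%m-%d %H:%M:%S").replace(tzinfo=timezone.utc)

    print((parse(observed_running) - parse(pod_creation)).total_seconds())
    # 90.763031 -> logged as podStartSLOduration=90.763031499 ("1m30.763031499s")

    # The Ready condition embedded in a setters.go record is ordinary JSON after "condition=".
    record = 'setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false ..."}'
    cond = json.loads(record.split("condition=", 1)[1])
    print(cond["status"], cond["reason"])     # False KubeletNotReady

The same subtraction applied to the etcd-crc record below (podCreationTimestamp 20:13:26, watchObservedRunningTime 20:13:31.810429578) yields its 5.810429578s figure.
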
Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.950672 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:31Z","lastTransitionTime":"2025-12-05T20:13:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.955722 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=18.955709515 podStartE2EDuration="18.955709515s" podCreationTimestamp="2025-12-05 20:13:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:13:31.939666672 +0000 UTC m=+110.750882781" watchObservedRunningTime="2025-12-05 20:13:31.955709515 +0000 UTC m=+110.766925624" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.970628 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=90.970612357 podStartE2EDuration="1m30.970612357s" podCreationTimestamp="2025-12-05 20:12:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:13:31.956434066 +0000 UTC m=+110.767650185" watchObservedRunningTime="2025-12-05 20:13:31.970612357 +0000 UTC m=+110.781828466" Dec 05 20:13:31 crc kubenswrapper[4904]: I1205 20:13:31.983030 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=83.983012727 podStartE2EDuration="1m23.983012727s" podCreationTimestamp="2025-12-05 20:12:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:13:31.971023908 +0000 UTC m=+110.782240017" watchObservedRunningTime="2025-12-05 20:13:31.983012727 +0000 UTC m=+110.794228836" Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.053981 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.054017 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.054028 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.054043 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.054073 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:32Z","lastTransitionTime":"2025-12-05T20:13:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.157351 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.157394 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.157403 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.157418 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.157427 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:32Z","lastTransitionTime":"2025-12-05T20:13:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.260017 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.260075 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.260085 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.260101 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.260112 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:32Z","lastTransitionTime":"2025-12-05T20:13:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.364906 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.364955 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.364965 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.364983 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.364994 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:32Z","lastTransitionTime":"2025-12-05T20:13:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.467993 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.468095 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.468117 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.468146 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.468164 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:32Z","lastTransitionTime":"2025-12-05T20:13:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.572324 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.572822 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.572904 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.572968 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.572981 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:32Z","lastTransitionTime":"2025-12-05T20:13:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.676391 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.676448 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.676465 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.676488 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.676505 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:32Z","lastTransitionTime":"2025-12-05T20:13:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.680666 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.680743 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.680700 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:13:32 crc kubenswrapper[4904]: E1205 20:13:32.680872 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:13:32 crc kubenswrapper[4904]: E1205 20:13:32.680996 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91" Dec 05 20:13:32 crc kubenswrapper[4904]: E1205 20:13:32.681183 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.681591 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:13:32 crc kubenswrapper[4904]: E1205 20:13:32.681939 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.779534 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.779602 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.779627 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.779657 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.779679 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:32Z","lastTransitionTime":"2025-12-05T20:13:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.883001 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.884157 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.884486 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.884632 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.884759 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:32Z","lastTransitionTime":"2025-12-05T20:13:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.988532 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.988597 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.988621 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.988651 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:32 crc kubenswrapper[4904]: I1205 20:13:32.988675 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:32Z","lastTransitionTime":"2025-12-05T20:13:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:33 crc kubenswrapper[4904]: I1205 20:13:33.092516 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:33 crc kubenswrapper[4904]: I1205 20:13:33.092578 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:33 crc kubenswrapper[4904]: I1205 20:13:33.092601 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:33 crc kubenswrapper[4904]: I1205 20:13:33.092629 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:33 crc kubenswrapper[4904]: I1205 20:13:33.092651 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:33Z","lastTransitionTime":"2025-12-05T20:13:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:33 crc kubenswrapper[4904]: I1205 20:13:33.195601 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:33 crc kubenswrapper[4904]: I1205 20:13:33.195664 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:33 crc kubenswrapper[4904]: I1205 20:13:33.195683 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:33 crc kubenswrapper[4904]: I1205 20:13:33.195709 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:33 crc kubenswrapper[4904]: I1205 20:13:33.195727 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:33Z","lastTransitionTime":"2025-12-05T20:13:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:33 crc kubenswrapper[4904]: I1205 20:13:33.298284 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:33 crc kubenswrapper[4904]: I1205 20:13:33.298350 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:33 crc kubenswrapper[4904]: I1205 20:13:33.298369 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:33 crc kubenswrapper[4904]: I1205 20:13:33.298396 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:33 crc kubenswrapper[4904]: I1205 20:13:33.298415 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:33Z","lastTransitionTime":"2025-12-05T20:13:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:33 crc kubenswrapper[4904]: I1205 20:13:33.401172 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:33 crc kubenswrapper[4904]: I1205 20:13:33.401222 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:33 crc kubenswrapper[4904]: I1205 20:13:33.401233 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:33 crc kubenswrapper[4904]: I1205 20:13:33.401251 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:33 crc kubenswrapper[4904]: I1205 20:13:33.401262 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:33Z","lastTransitionTime":"2025-12-05T20:13:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:33 crc kubenswrapper[4904]: I1205 20:13:33.504418 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:33 crc kubenswrapper[4904]: I1205 20:13:33.504505 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:33 crc kubenswrapper[4904]: I1205 20:13:33.504532 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:33 crc kubenswrapper[4904]: I1205 20:13:33.504565 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:33 crc kubenswrapper[4904]: I1205 20:13:33.504590 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:33Z","lastTransitionTime":"2025-12-05T20:13:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:33 crc kubenswrapper[4904]: I1205 20:13:33.607515 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:33 crc kubenswrapper[4904]: I1205 20:13:33.607549 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:33 crc kubenswrapper[4904]: I1205 20:13:33.607558 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:33 crc kubenswrapper[4904]: I1205 20:13:33.607573 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:33 crc kubenswrapper[4904]: I1205 20:13:33.607586 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:33Z","lastTransitionTime":"2025-12-05T20:13:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:33 crc kubenswrapper[4904]: I1205 20:13:33.682543 4904 scope.go:117] "RemoveContainer" containerID="0056a74367d0e3cfa5e2d8715316dcb9d0f1fe4f0a9be8f8c29671005bba59d4" Dec 05 20:13:33 crc kubenswrapper[4904]: E1205 20:13:33.682849 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-dsvd6_openshift-ovn-kubernetes(55fbdf03-712c-4abc-9847-225fe63052e3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" Dec 05 20:13:33 crc kubenswrapper[4904]: I1205 20:13:33.710668 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:33 crc kubenswrapper[4904]: I1205 20:13:33.710738 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:33 crc kubenswrapper[4904]: I1205 20:13:33.710759 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:33 crc kubenswrapper[4904]: I1205 20:13:33.710788 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:33 crc kubenswrapper[4904]: I1205 20:13:33.710811 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:33Z","lastTransitionTime":"2025-12-05T20:13:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:33 crc kubenswrapper[4904]: I1205 20:13:33.814571 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:33 crc kubenswrapper[4904]: I1205 20:13:33.814814 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:33 crc kubenswrapper[4904]: I1205 20:13:33.814851 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:33 crc kubenswrapper[4904]: I1205 20:13:33.814881 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:33 crc kubenswrapper[4904]: I1205 20:13:33.814910 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:33Z","lastTransitionTime":"2025-12-05T20:13:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:33 crc kubenswrapper[4904]: I1205 20:13:33.918250 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:33 crc kubenswrapper[4904]: I1205 20:13:33.918287 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:33 crc kubenswrapper[4904]: I1205 20:13:33.918298 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:33 crc kubenswrapper[4904]: I1205 20:13:33.918313 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:33 crc kubenswrapper[4904]: I1205 20:13:33.918324 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:33Z","lastTransitionTime":"2025-12-05T20:13:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.020740 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.020790 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.020803 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.020819 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.020829 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:34Z","lastTransitionTime":"2025-12-05T20:13:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.123987 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.124037 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.124054 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.124106 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.124123 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:34Z","lastTransitionTime":"2025-12-05T20:13:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.226841 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.226913 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.226975 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.227005 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.227045 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:34Z","lastTransitionTime":"2025-12-05T20:13:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.329934 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.330005 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.330030 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.330090 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.330109 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:34Z","lastTransitionTime":"2025-12-05T20:13:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.433342 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.433418 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.433435 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.433467 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.433487 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:34Z","lastTransitionTime":"2025-12-05T20:13:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.536419 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.536495 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.536513 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.536538 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.536557 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:34Z","lastTransitionTime":"2025-12-05T20:13:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.639814 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.640208 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.640226 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.640249 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.640266 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:34Z","lastTransitionTime":"2025-12-05T20:13:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.680879 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.680948 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.680904 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.680886 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:13:34 crc kubenswrapper[4904]: E1205 20:13:34.681028 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:13:34 crc kubenswrapper[4904]: E1205 20:13:34.681210 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91" Dec 05 20:13:34 crc kubenswrapper[4904]: E1205 20:13:34.681391 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:13:34 crc kubenswrapper[4904]: E1205 20:13:34.681506 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.743605 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.743661 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.743677 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.743702 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.743720 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:34Z","lastTransitionTime":"2025-12-05T20:13:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.847207 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.847301 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.847325 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.847355 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.847383 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:34Z","lastTransitionTime":"2025-12-05T20:13:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.950556 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.950622 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.950643 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.950691 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:34 crc kubenswrapper[4904]: I1205 20:13:34.950721 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:34Z","lastTransitionTime":"2025-12-05T20:13:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.054450 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.054531 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.054556 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.054590 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.054611 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:35Z","lastTransitionTime":"2025-12-05T20:13:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.160807 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.160871 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.160884 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.160902 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.160914 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:35Z","lastTransitionTime":"2025-12-05T20:13:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.264316 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.264377 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.264394 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.264420 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.264437 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:35Z","lastTransitionTime":"2025-12-05T20:13:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.366625 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.366664 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.366676 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.366691 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.366704 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:35Z","lastTransitionTime":"2025-12-05T20:13:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.368905 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gfzvv_5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea/kube-multus/1.log" Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.369316 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gfzvv_5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea/kube-multus/0.log" Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.369357 4904 generic.go:334] "Generic (PLEG): container finished" podID="5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea" containerID="7010629bd83e0fb435b59dd47808f31c7694e158d77048ed22a54668a8b2712d" exitCode=1 Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.369386 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gfzvv" event={"ID":"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea","Type":"ContainerDied","Data":"7010629bd83e0fb435b59dd47808f31c7694e158d77048ed22a54668a8b2712d"} Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.369418 4904 scope.go:117] "RemoveContainer" containerID="6e5e16893370ed8543ecf33c4b2373bda50a424810470178390a553f2755323d" Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.369810 4904 scope.go:117] "RemoveContainer" containerID="7010629bd83e0fb435b59dd47808f31c7694e158d77048ed22a54668a8b2712d" Dec 05 20:13:35 crc kubenswrapper[4904]: E1205 20:13:35.369968 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-gfzvv_openshift-multus(5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea)\"" pod="openshift-multus/multus-gfzvv" podUID="5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea" Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.396421 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t96qs" podStartSLOduration=93.396400244 podStartE2EDuration="1m33.396400244s" podCreationTimestamp="2025-12-05 20:12:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:13:31.994232834 +0000 UTC m=+110.805448943" watchObservedRunningTime="2025-12-05 20:13:35.396400244 +0000 UTC m=+114.207616353" Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.469888 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.469946 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.469964 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.469986 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.470008 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:35Z","lastTransitionTime":"2025-12-05T20:13:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.572851 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.572883 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.572893 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.572907 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.572918 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:35Z","lastTransitionTime":"2025-12-05T20:13:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.675497 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.675534 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.675547 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.675563 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.675574 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:35Z","lastTransitionTime":"2025-12-05T20:13:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.776896 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.776932 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.776942 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.776956 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.776966 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:35Z","lastTransitionTime":"2025-12-05T20:13:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.878697 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.878722 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.878731 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.878743 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.878753 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:35Z","lastTransitionTime":"2025-12-05T20:13:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.981071 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.981105 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.981116 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.981129 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:35 crc kubenswrapper[4904]: I1205 20:13:35.981140 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:35Z","lastTransitionTime":"2025-12-05T20:13:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.083517 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.083586 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.083607 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.083637 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.083655 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:36Z","lastTransitionTime":"2025-12-05T20:13:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.185907 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.185960 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.185972 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.185991 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.186004 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:36Z","lastTransitionTime":"2025-12-05T20:13:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.287883 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.287928 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.287943 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.287960 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.287970 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:36Z","lastTransitionTime":"2025-12-05T20:13:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.374628 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gfzvv_5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea/kube-multus/1.log" Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.390611 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.390658 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.390671 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.390691 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.390707 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:36Z","lastTransitionTime":"2025-12-05T20:13:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.493624 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.493669 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.493683 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.493700 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.493712 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:36Z","lastTransitionTime":"2025-12-05T20:13:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.596111 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.596151 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.596162 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.596178 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.596190 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:36Z","lastTransitionTime":"2025-12-05T20:13:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.703522 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:13:36 crc kubenswrapper[4904]: E1205 20:13:36.703678 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.703903 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:13:36 crc kubenswrapper[4904]: E1205 20:13:36.703967 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.704121 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:13:36 crc kubenswrapper[4904]: E1205 20:13:36.704179 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.704661 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:13:36 crc kubenswrapper[4904]: E1205 20:13:36.705699 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91" Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.707014 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.707096 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.707107 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.707121 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.707133 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:36Z","lastTransitionTime":"2025-12-05T20:13:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.810199 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.810249 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.810265 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.810286 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.810305 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:36Z","lastTransitionTime":"2025-12-05T20:13:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.912496 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.912559 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.912572 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.912589 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.912601 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:36Z","lastTransitionTime":"2025-12-05T20:13:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.967276 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.967342 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.967357 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.967378 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:13:36 crc kubenswrapper[4904]: I1205 20:13:36.967394 4904 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:13:36Z","lastTransitionTime":"2025-12-05T20:13:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:13:37 crc kubenswrapper[4904]: I1205 20:13:37.019281 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-cg64m"] Dec 05 20:13:37 crc kubenswrapper[4904]: I1205 20:13:37.019660 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cg64m" Dec 05 20:13:37 crc kubenswrapper[4904]: I1205 20:13:37.021336 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 05 20:13:37 crc kubenswrapper[4904]: I1205 20:13:37.021607 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 05 20:13:37 crc kubenswrapper[4904]: I1205 20:13:37.021781 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 05 20:13:37 crc kubenswrapper[4904]: I1205 20:13:37.023003 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 05 20:13:37 crc kubenswrapper[4904]: I1205 20:13:37.108803 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/ed897bd9-654c-4c4a-ad8d-575a2f4fddb7-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-cg64m\" (UID: \"ed897bd9-654c-4c4a-ad8d-575a2f4fddb7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cg64m" Dec 05 20:13:37 crc kubenswrapper[4904]: I1205 20:13:37.108862 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed897bd9-654c-4c4a-ad8d-575a2f4fddb7-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-cg64m\" (UID: \"ed897bd9-654c-4c4a-ad8d-575a2f4fddb7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cg64m" Dec 05 20:13:37 crc kubenswrapper[4904]: I1205 20:13:37.109150 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/ed897bd9-654c-4c4a-ad8d-575a2f4fddb7-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-cg64m\" (UID: \"ed897bd9-654c-4c4a-ad8d-575a2f4fddb7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cg64m" Dec 05 20:13:37 crc kubenswrapper[4904]: I1205 20:13:37.109237 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed897bd9-654c-4c4a-ad8d-575a2f4fddb7-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-cg64m\" (UID: \"ed897bd9-654c-4c4a-ad8d-575a2f4fddb7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cg64m" Dec 05 20:13:37 crc kubenswrapper[4904]: I1205 20:13:37.109286 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ed897bd9-654c-4c4a-ad8d-575a2f4fddb7-service-ca\") pod \"cluster-version-operator-5c965bbfc6-cg64m\" (UID: \"ed897bd9-654c-4c4a-ad8d-575a2f4fddb7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cg64m" Dec 05 20:13:37 crc kubenswrapper[4904]: I1205 20:13:37.209912 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/ed897bd9-654c-4c4a-ad8d-575a2f4fddb7-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-cg64m\" (UID: \"ed897bd9-654c-4c4a-ad8d-575a2f4fddb7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cg64m" Dec 05 20:13:37 crc 
kubenswrapper[4904]: I1205 20:13:37.209969 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed897bd9-654c-4c4a-ad8d-575a2f4fddb7-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-cg64m\" (UID: \"ed897bd9-654c-4c4a-ad8d-575a2f4fddb7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cg64m" Dec 05 20:13:37 crc kubenswrapper[4904]: I1205 20:13:37.209987 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ed897bd9-654c-4c4a-ad8d-575a2f4fddb7-service-ca\") pod \"cluster-version-operator-5c965bbfc6-cg64m\" (UID: \"ed897bd9-654c-4c4a-ad8d-575a2f4fddb7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cg64m" Dec 05 20:13:37 crc kubenswrapper[4904]: I1205 20:13:37.210006 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/ed897bd9-654c-4c4a-ad8d-575a2f4fddb7-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-cg64m\" (UID: \"ed897bd9-654c-4c4a-ad8d-575a2f4fddb7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cg64m" Dec 05 20:13:37 crc kubenswrapper[4904]: I1205 20:13:37.210031 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed897bd9-654c-4c4a-ad8d-575a2f4fddb7-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-cg64m\" (UID: \"ed897bd9-654c-4c4a-ad8d-575a2f4fddb7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cg64m" Dec 05 20:13:37 crc kubenswrapper[4904]: I1205 20:13:37.210053 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/ed897bd9-654c-4c4a-ad8d-575a2f4fddb7-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-cg64m\" (UID: \"ed897bd9-654c-4c4a-ad8d-575a2f4fddb7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cg64m" Dec 05 20:13:37 crc kubenswrapper[4904]: I1205 20:13:37.210356 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/ed897bd9-654c-4c4a-ad8d-575a2f4fddb7-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-cg64m\" (UID: \"ed897bd9-654c-4c4a-ad8d-575a2f4fddb7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cg64m" Dec 05 20:13:37 crc kubenswrapper[4904]: I1205 20:13:37.211365 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ed897bd9-654c-4c4a-ad8d-575a2f4fddb7-service-ca\") pod \"cluster-version-operator-5c965bbfc6-cg64m\" (UID: \"ed897bd9-654c-4c4a-ad8d-575a2f4fddb7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cg64m" Dec 05 20:13:37 crc kubenswrapper[4904]: I1205 20:13:37.221417 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed897bd9-654c-4c4a-ad8d-575a2f4fddb7-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-cg64m\" (UID: \"ed897bd9-654c-4c4a-ad8d-575a2f4fddb7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cg64m" Dec 05 20:13:37 crc kubenswrapper[4904]: I1205 20:13:37.239255 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/ed897bd9-654c-4c4a-ad8d-575a2f4fddb7-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-cg64m\" (UID: \"ed897bd9-654c-4c4a-ad8d-575a2f4fddb7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cg64m" Dec 05 20:13:37 crc kubenswrapper[4904]: I1205 20:13:37.342571 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cg64m" Dec 05 20:13:37 crc kubenswrapper[4904]: I1205 20:13:37.380719 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cg64m" event={"ID":"ed897bd9-654c-4c4a-ad8d-575a2f4fddb7","Type":"ContainerStarted","Data":"2fe94ee003d55fd64e7197c6462225f07604f6687e474dd45b66a461316aae02"} Dec 05 20:13:38 crc kubenswrapper[4904]: I1205 20:13:38.386437 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cg64m" event={"ID":"ed897bd9-654c-4c4a-ad8d-575a2f4fddb7","Type":"ContainerStarted","Data":"f4d7b50a1d25b676ddeeea1c343f6ef05b39a3d4dada7c06c447b90e1d559213"} Dec 05 20:13:38 crc kubenswrapper[4904]: I1205 20:13:38.680764 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:13:38 crc kubenswrapper[4904]: I1205 20:13:38.680861 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:13:38 crc kubenswrapper[4904]: I1205 20:13:38.681463 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:13:38 crc kubenswrapper[4904]: I1205 20:13:38.681734 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:13:38 crc kubenswrapper[4904]: E1205 20:13:38.681972 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:13:38 crc kubenswrapper[4904]: E1205 20:13:38.681730 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:13:38 crc kubenswrapper[4904]: E1205 20:13:38.682450 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91" Dec 05 20:13:38 crc kubenswrapper[4904]: E1205 20:13:38.682625 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:13:40 crc kubenswrapper[4904]: I1205 20:13:40.680947 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:13:40 crc kubenswrapper[4904]: I1205 20:13:40.681033 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:13:40 crc kubenswrapper[4904]: I1205 20:13:40.680971 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:13:40 crc kubenswrapper[4904]: I1205 20:13:40.680965 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:13:40 crc kubenswrapper[4904]: E1205 20:13:40.681189 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:13:40 crc kubenswrapper[4904]: E1205 20:13:40.681327 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91" Dec 05 20:13:40 crc kubenswrapper[4904]: E1205 20:13:40.681494 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:13:40 crc kubenswrapper[4904]: E1205 20:13:40.681578 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:13:41 crc kubenswrapper[4904]: E1205 20:13:41.670150 4904 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 05 20:13:41 crc kubenswrapper[4904]: E1205 20:13:41.743835 4904 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 05 20:13:42 crc kubenswrapper[4904]: I1205 20:13:42.681026 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:13:42 crc kubenswrapper[4904]: I1205 20:13:42.681175 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:13:42 crc kubenswrapper[4904]: E1205 20:13:42.681232 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:13:42 crc kubenswrapper[4904]: I1205 20:13:42.681180 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:13:42 crc kubenswrapper[4904]: I1205 20:13:42.681292 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:13:42 crc kubenswrapper[4904]: E1205 20:13:42.681491 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:13:42 crc kubenswrapper[4904]: E1205 20:13:42.681578 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:13:42 crc kubenswrapper[4904]: E1205 20:13:42.681723 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91" Dec 05 20:13:44 crc kubenswrapper[4904]: I1205 20:13:44.680630 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:13:44 crc kubenswrapper[4904]: I1205 20:13:44.680703 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:13:44 crc kubenswrapper[4904]: I1205 20:13:44.680631 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:13:44 crc kubenswrapper[4904]: E1205 20:13:44.680782 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:13:44 crc kubenswrapper[4904]: E1205 20:13:44.680915 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:13:44 crc kubenswrapper[4904]: I1205 20:13:44.680962 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:13:44 crc kubenswrapper[4904]: E1205 20:13:44.680998 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:13:44 crc kubenswrapper[4904]: E1205 20:13:44.681205 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91" Dec 05 20:13:46 crc kubenswrapper[4904]: I1205 20:13:46.680937 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:13:46 crc kubenswrapper[4904]: I1205 20:13:46.680939 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:13:46 crc kubenswrapper[4904]: I1205 20:13:46.680948 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:13:46 crc kubenswrapper[4904]: I1205 20:13:46.681147 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:13:46 crc kubenswrapper[4904]: E1205 20:13:46.681273 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:13:46 crc kubenswrapper[4904]: E1205 20:13:46.681437 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:13:46 crc kubenswrapper[4904]: E1205 20:13:46.681556 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91" Dec 05 20:13:46 crc kubenswrapper[4904]: I1205 20:13:46.681581 4904 scope.go:117] "RemoveContainer" containerID="0056a74367d0e3cfa5e2d8715316dcb9d0f1fe4f0a9be8f8c29671005bba59d4" Dec 05 20:13:46 crc kubenswrapper[4904]: E1205 20:13:46.681595 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:13:46 crc kubenswrapper[4904]: E1205 20:13:46.745265 4904 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Dec 05 20:13:47 crc kubenswrapper[4904]: I1205 20:13:47.419044 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dsvd6_55fbdf03-712c-4abc-9847-225fe63052e3/ovnkube-controller/3.log" Dec 05 20:13:47 crc kubenswrapper[4904]: I1205 20:13:47.421706 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" event={"ID":"55fbdf03-712c-4abc-9847-225fe63052e3","Type":"ContainerStarted","Data":"d8695f6e6d6a985b8d6141ad8b8c6bf4378c06581847d74a6e10b310cfd13f85"} Dec 05 20:13:47 crc kubenswrapper[4904]: I1205 20:13:47.422530 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:13:47 crc kubenswrapper[4904]: I1205 20:13:47.447779 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cg64m" podStartSLOduration=106.447760272 podStartE2EDuration="1m46.447760272s" podCreationTimestamp="2025-12-05 20:12:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:13:38.408823956 +0000 UTC m=+117.220040105" watchObservedRunningTime="2025-12-05 20:13:47.447760272 +0000 UTC m=+126.258976381" Dec 05 20:13:47 crc kubenswrapper[4904]: I1205 20:13:47.447947 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-d8xkk"] Dec 05 20:13:47 crc kubenswrapper[4904]: I1205 20:13:47.448046 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:13:47 crc kubenswrapper[4904]: E1205 20:13:47.448152 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91" Dec 05 20:13:47 crc kubenswrapper[4904]: I1205 20:13:47.456518 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" podStartSLOduration=106.45649834 podStartE2EDuration="1m46.45649834s" podCreationTimestamp="2025-12-05 20:12:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:13:47.456243422 +0000 UTC m=+126.267459581" watchObservedRunningTime="2025-12-05 20:13:47.45649834 +0000 UTC m=+126.267714449" Dec 05 20:13:48 crc kubenswrapper[4904]: I1205 20:13:48.680699 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:13:48 crc kubenswrapper[4904]: I1205 20:13:48.680724 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:13:48 crc kubenswrapper[4904]: I1205 20:13:48.680978 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:13:48 crc kubenswrapper[4904]: I1205 20:13:48.681218 4904 scope.go:117] "RemoveContainer" containerID="7010629bd83e0fb435b59dd47808f31c7694e158d77048ed22a54668a8b2712d" Dec 05 20:13:48 crc kubenswrapper[4904]: I1205 20:13:48.681453 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:13:48 crc kubenswrapper[4904]: E1205 20:13:48.681620 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91" Dec 05 20:13:48 crc kubenswrapper[4904]: E1205 20:13:48.681668 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:13:48 crc kubenswrapper[4904]: E1205 20:13:48.681740 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:13:48 crc kubenswrapper[4904]: E1205 20:13:48.681812 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:13:49 crc kubenswrapper[4904]: I1205 20:13:49.431656 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gfzvv_5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea/kube-multus/1.log" Dec 05 20:13:49 crc kubenswrapper[4904]: I1205 20:13:49.431707 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gfzvv" event={"ID":"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea","Type":"ContainerStarted","Data":"a724f454d77b9b67af0b65f96e13eb70ffb479606ba3d7aa571916c68b1e2f03"} Dec 05 20:13:50 crc kubenswrapper[4904]: I1205 20:13:50.680829 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:13:50 crc kubenswrapper[4904]: E1205 20:13:50.681803 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:13:50 crc kubenswrapper[4904]: I1205 20:13:50.681395 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:13:50 crc kubenswrapper[4904]: E1205 20:13:50.682049 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:13:50 crc kubenswrapper[4904]: I1205 20:13:50.681502 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:13:50 crc kubenswrapper[4904]: I1205 20:13:50.681346 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:13:50 crc kubenswrapper[4904]: E1205 20:13:50.682322 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:13:50 crc kubenswrapper[4904]: E1205 20:13:50.682504 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d8xkk" podUID="fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91" Dec 05 20:13:52 crc kubenswrapper[4904]: I1205 20:13:52.681293 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:13:52 crc kubenswrapper[4904]: I1205 20:13:52.681473 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:13:52 crc kubenswrapper[4904]: I1205 20:13:52.681309 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:13:52 crc kubenswrapper[4904]: I1205 20:13:52.682124 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:13:52 crc kubenswrapper[4904]: I1205 20:13:52.685761 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 05 20:13:52 crc kubenswrapper[4904]: I1205 20:13:52.685891 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 05 20:13:52 crc kubenswrapper[4904]: I1205 20:13:52.686458 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 05 20:13:52 crc kubenswrapper[4904]: I1205 20:13:52.686691 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 05 20:13:52 crc kubenswrapper[4904]: I1205 20:13:52.686469 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 05 20:13:52 crc kubenswrapper[4904]: I1205 20:13:52.686495 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.165538 4904 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.208035 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-l928g"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.208597 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-l928g" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.209914 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cncqb"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.210566 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-cncqb" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.212509 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.213380 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.214228 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9rm4d"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.214654 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9rm4d" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.215015 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.217795 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.218333 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.218499 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.218554 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.219043 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.219406 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.220375 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.220414 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.220476 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.220697 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-j9gnc"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.220835 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.221259 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.221283 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.221336 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j9gnc" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.221720 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-4bbwg"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.222017 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-4bbwg" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.222086 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.223565 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-484jg"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.224272 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-pdnb5"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.224743 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-484jg" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.225086 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-sbrpp"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.224763 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-pdnb5" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.225690 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-sbrpp" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.226885 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.232128 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.232560 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.232911 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.233090 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.233213 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.233245 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.233244 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.233260 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.233361 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.233586 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 
20:13:57.233686 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.234141 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.234500 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.235082 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.235201 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.235294 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.235311 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.235694 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.235853 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.236051 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.236189 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.236298 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.236415 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.236537 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.236653 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.239376 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fmxl9"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.253120 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.258833 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-8p482"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.272783 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fmxl9" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.277977 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-8p482" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.280570 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-jz8jl"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.280907 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-vw7kx"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.281273 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vw7kx" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.281516 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jz8jl" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.281640 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-bvmh2"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.282472 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-bvmh2" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.282713 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.285952 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.287282 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.287509 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.287527 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.287726 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.290001 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8x2kf"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.290741 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8x2kf" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.291163 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.291328 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.291541 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.292820 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.292958 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.293140 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.293273 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.293383 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-cpsd4"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.293419 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.293622 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.293751 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.293982 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.294027 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.294146 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cpsd4" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.293992 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.294142 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-92b57"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.294194 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.294421 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.294455 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.294468 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.294501 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.294506 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.294516 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.294535 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.295882 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.294539 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.294577 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.296022 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-92b57" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.296038 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.296217 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.296346 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.296395 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.294586 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.294617 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.294660 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.294713 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.294766 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.295015 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.295118 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.298398 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.298816 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-672dp"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.299408 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sbdkl"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.299741 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sbdkl" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.300093 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-672dp" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.300285 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t22nt"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.300640 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t22nt" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.301348 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-tkgxl"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.301829 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.301961 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-tkgxl" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.302039 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.302048 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-d5xch"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.302251 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.302459 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.302582 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.302603 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d5xch" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.302692 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.306041 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.306306 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.306455 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.307031 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.312443 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-479fl"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.313549 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.314344 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.321345 4904 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-image-registry/image-registry-697d97f7c8-hdk86"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.321898 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.322232 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.335929 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b02e39c5-31b4-4444-a500-cd7cbe327bec-console-serving-cert\") pod \"console-f9d7485db-pdnb5\" (UID: \"b02e39c5-31b4-4444-a500-cd7cbe327bec\") " pod="openshift-console/console-f9d7485db-pdnb5" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.335970 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcw4c\" (UniqueName: \"kubernetes.io/projected/834a438d-9f7d-4707-b790-8ce136081f7c-kube-api-access-gcw4c\") pod \"controller-manager-879f6c89f-cncqb\" (UID: \"834a438d-9f7d-4707-b790-8ce136081f7c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cncqb" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.335992 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c6c9df4a-394c-4a6c-9132-5fefe0ed672d-node-pullsecrets\") pod \"apiserver-76f77b778f-bvmh2\" (UID: \"c6c9df4a-394c-4a6c-9132-5fefe0ed672d\") " pod="openshift-apiserver/apiserver-76f77b778f-bvmh2" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.336011 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7kc7\" (UniqueName: \"kubernetes.io/projected/9032a5cf-b023-4e91-bc39-52fdd93472ca-kube-api-access-j7kc7\") pod \"cluster-samples-operator-665b6dd947-9rm4d\" (UID: \"9032a5cf-b023-4e91-bc39-52fdd93472ca\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9rm4d" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.338746 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.339045 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/9b9f12dc-0021-4479-9b7e-60da4e3f27b0-machine-approver-tls\") pod \"machine-approver-56656f9798-jz8jl\" (UID: \"9b9f12dc-0021-4479-9b7e-60da4e3f27b0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jz8jl" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.339139 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb4b4c10-971e-4766-b632-9f710ec547a6-config\") pod \"route-controller-manager-6576b87f9c-j9gnc\" (UID: \"eb4b4c10-971e-4766-b632-9f710ec547a6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j9gnc" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.339176 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qntm8\" (UniqueName: 
\"kubernetes.io/projected/9b9f12dc-0021-4479-9b7e-60da4e3f27b0-kube-api-access-qntm8\") pod \"machine-approver-56656f9798-jz8jl\" (UID: \"9b9f12dc-0021-4479-9b7e-60da4e3f27b0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jz8jl" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.339202 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eb4b4c10-971e-4766-b632-9f710ec547a6-client-ca\") pod \"route-controller-manager-6576b87f9c-j9gnc\" (UID: \"eb4b4c10-971e-4766-b632-9f710ec547a6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j9gnc" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.339252 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c6c9df4a-394c-4a6c-9132-5fefe0ed672d-audit\") pod \"apiserver-76f77b778f-bvmh2\" (UID: \"c6c9df4a-394c-4a6c-9132-5fefe0ed672d\") " pod="openshift-apiserver/apiserver-76f77b778f-bvmh2" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.339294 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f0af025-acfb-4cfa-b413-6159067b8269-serving-cert\") pod \"etcd-operator-b45778765-8p482\" (UID: \"6f0af025-acfb-4cfa-b413-6159067b8269\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8p482" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.339321 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/954b7859-3b75-48f2-b1ed-d70932da35ba-serving-cert\") pod \"openshift-config-operator-7777fb866f-sbrpp\" (UID: \"954b7859-3b75-48f2-b1ed-d70932da35ba\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-sbrpp" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.339341 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dm6j\" (UniqueName: \"kubernetes.io/projected/dd1d7a59-ae42-455f-9617-9fa7eb152ce9-kube-api-access-9dm6j\") pod \"console-operator-58897d9998-4bbwg\" (UID: \"dd1d7a59-ae42-455f-9617-9fa7eb152ce9\") " pod="openshift-console-operator/console-operator-58897d9998-4bbwg" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.339371 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1d4d3dc1-19e8-4648-907c-ede5dd5e107e-audit-dir\") pod \"apiserver-7bbb656c7d-vw7kx\" (UID: \"1d4d3dc1-19e8-4648-907c-ede5dd5e107e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vw7kx" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.339398 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11ca5edb-7664-4e63-a9e8-46f270623ad2-config\") pod \"machine-api-operator-5694c8668f-l928g\" (UID: \"11ca5edb-7664-4e63-a9e8-46f270623ad2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-l928g" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.339422 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c6c9df4a-394c-4a6c-9132-5fefe0ed672d-etcd-serving-ca\") pod 
\"apiserver-76f77b778f-bvmh2\" (UID: \"c6c9df4a-394c-4a6c-9132-5fefe0ed672d\") " pod="openshift-apiserver/apiserver-76f77b778f-bvmh2" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.339439 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb68w\" (UniqueName: \"kubernetes.io/projected/c6c9df4a-394c-4a6c-9132-5fefe0ed672d-kube-api-access-zb68w\") pod \"apiserver-76f77b778f-bvmh2\" (UID: \"c6c9df4a-394c-4a6c-9132-5fefe0ed672d\") " pod="openshift-apiserver/apiserver-76f77b778f-bvmh2" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.339464 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28306b3f-ee7a-48be-905b-3ba0e6d7ef74-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-fmxl9\" (UID: \"28306b3f-ee7a-48be-905b-3ba0e6d7ef74\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fmxl9" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.339485 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd1d7a59-ae42-455f-9617-9fa7eb152ce9-config\") pod \"console-operator-58897d9998-4bbwg\" (UID: \"dd1d7a59-ae42-455f-9617-9fa7eb152ce9\") " pod="openshift-console-operator/console-operator-58897d9998-4bbwg" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.339507 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmpq6\" (UniqueName: \"kubernetes.io/projected/11ca5edb-7664-4e63-a9e8-46f270623ad2-kube-api-access-rmpq6\") pod \"machine-api-operator-5694c8668f-l928g\" (UID: \"11ca5edb-7664-4e63-a9e8-46f270623ad2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-l928g" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.339528 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9032a5cf-b023-4e91-bc39-52fdd93472ca-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-9rm4d\" (UID: \"9032a5cf-b023-4e91-bc39-52fdd93472ca\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9rm4d" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.339548 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfkvt\" (UniqueName: \"kubernetes.io/projected/954b7859-3b75-48f2-b1ed-d70932da35ba-kube-api-access-cfkvt\") pod \"openshift-config-operator-7777fb866f-sbrpp\" (UID: \"954b7859-3b75-48f2-b1ed-d70932da35ba\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-sbrpp" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.339580 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f0af025-acfb-4cfa-b413-6159067b8269-config\") pod \"etcd-operator-b45778765-8p482\" (UID: \"6f0af025-acfb-4cfa-b413-6159067b8269\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8p482" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.339611 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9mrn\" (UniqueName: \"kubernetes.io/projected/214b71b9-85d3-4be2-be71-311e9097f89b-kube-api-access-h9mrn\") pod 
\"dns-operator-744455d44c-484jg\" (UID: \"214b71b9-85d3-4be2-be71-311e9097f89b\") " pod="openshift-dns-operator/dns-operator-744455d44c-484jg" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.339644 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b02e39c5-31b4-4444-a500-cd7cbe327bec-trusted-ca-bundle\") pod \"console-f9d7485db-pdnb5\" (UID: \"b02e39c5-31b4-4444-a500-cd7cbe327bec\") " pod="openshift-console/console-f9d7485db-pdnb5" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.339662 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c6c9df4a-394c-4a6c-9132-5fefe0ed672d-image-import-ca\") pod \"apiserver-76f77b778f-bvmh2\" (UID: \"c6c9df4a-394c-4a6c-9132-5fefe0ed672d\") " pod="openshift-apiserver/apiserver-76f77b778f-bvmh2" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.339683 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b02e39c5-31b4-4444-a500-cd7cbe327bec-console-oauth-config\") pod \"console-f9d7485db-pdnb5\" (UID: \"b02e39c5-31b4-4444-a500-cd7cbe327bec\") " pod="openshift-console/console-f9d7485db-pdnb5" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.339703 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9b9f12dc-0021-4479-9b7e-60da4e3f27b0-auth-proxy-config\") pod \"machine-approver-56656f9798-jz8jl\" (UID: \"9b9f12dc-0021-4479-9b7e-60da4e3f27b0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jz8jl" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.339751 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1d4d3dc1-19e8-4648-907c-ede5dd5e107e-encryption-config\") pod \"apiserver-7bbb656c7d-vw7kx\" (UID: \"1d4d3dc1-19e8-4648-907c-ede5dd5e107e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vw7kx" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.339783 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28306b3f-ee7a-48be-905b-3ba0e6d7ef74-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-fmxl9\" (UID: \"28306b3f-ee7a-48be-905b-3ba0e6d7ef74\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fmxl9" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.339804 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd1d7a59-ae42-455f-9617-9fa7eb152ce9-serving-cert\") pod \"console-operator-58897d9998-4bbwg\" (UID: \"dd1d7a59-ae42-455f-9617-9fa7eb152ce9\") " pod="openshift-console-operator/console-operator-58897d9998-4bbwg" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.339827 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhnpg\" (UniqueName: \"kubernetes.io/projected/eb4b4c10-971e-4766-b632-9f710ec547a6-kube-api-access-fhnpg\") pod \"route-controller-manager-6576b87f9c-j9gnc\" (UID: \"eb4b4c10-971e-4766-b632-9f710ec547a6\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j9gnc" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.339849 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b02e39c5-31b4-4444-a500-cd7cbe327bec-console-config\") pod \"console-f9d7485db-pdnb5\" (UID: \"b02e39c5-31b4-4444-a500-cd7cbe327bec\") " pod="openshift-console/console-f9d7485db-pdnb5" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.339869 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b02e39c5-31b4-4444-a500-cd7cbe327bec-service-ca\") pod \"console-f9d7485db-pdnb5\" (UID: \"b02e39c5-31b4-4444-a500-cd7cbe327bec\") " pod="openshift-console/console-f9d7485db-pdnb5" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.339885 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/11ca5edb-7664-4e63-a9e8-46f270623ad2-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-l928g\" (UID: \"11ca5edb-7664-4e63-a9e8-46f270623ad2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-l928g" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.339904 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6c9df4a-394c-4a6c-9132-5fefe0ed672d-serving-cert\") pod \"apiserver-76f77b778f-bvmh2\" (UID: \"c6c9df4a-394c-4a6c-9132-5fefe0ed672d\") " pod="openshift-apiserver/apiserver-76f77b778f-bvmh2" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.339924 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6c9df4a-394c-4a6c-9132-5fefe0ed672d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-bvmh2\" (UID: \"c6c9df4a-394c-4a6c-9132-5fefe0ed672d\") " pod="openshift-apiserver/apiserver-76f77b778f-bvmh2" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.339946 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1d4d3dc1-19e8-4648-907c-ede5dd5e107e-etcd-client\") pod \"apiserver-7bbb656c7d-vw7kx\" (UID: \"1d4d3dc1-19e8-4648-907c-ede5dd5e107e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vw7kx" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.339964 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcwfp\" (UniqueName: \"kubernetes.io/projected/28306b3f-ee7a-48be-905b-3ba0e6d7ef74-kube-api-access-wcwfp\") pod \"openshift-controller-manager-operator-756b6f6bc6-fmxl9\" (UID: \"28306b3f-ee7a-48be-905b-3ba0e6d7ef74\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fmxl9" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.339987 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d4d3dc1-19e8-4648-907c-ede5dd5e107e-serving-cert\") pod \"apiserver-7bbb656c7d-vw7kx\" (UID: \"1d4d3dc1-19e8-4648-907c-ede5dd5e107e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vw7kx" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 
20:13:57.340008 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c6c9df4a-394c-4a6c-9132-5fefe0ed672d-encryption-config\") pod \"apiserver-76f77b778f-bvmh2\" (UID: \"c6c9df4a-394c-4a6c-9132-5fefe0ed672d\") " pod="openshift-apiserver/apiserver-76f77b778f-bvmh2" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.340030 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/834a438d-9f7d-4707-b790-8ce136081f7c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-cncqb\" (UID: \"834a438d-9f7d-4707-b790-8ce136081f7c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cncqb" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.340075 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b02e39c5-31b4-4444-a500-cd7cbe327bec-oauth-serving-cert\") pod \"console-f9d7485db-pdnb5\" (UID: \"b02e39c5-31b4-4444-a500-cd7cbe327bec\") " pod="openshift-console/console-f9d7485db-pdnb5" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.340093 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbjcj\" (UniqueName: \"kubernetes.io/projected/6f0af025-acfb-4cfa-b413-6159067b8269-kube-api-access-wbjcj\") pod \"etcd-operator-b45778765-8p482\" (UID: \"6f0af025-acfb-4cfa-b413-6159067b8269\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8p482" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.340132 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c6c9df4a-394c-4a6c-9132-5fefe0ed672d-etcd-client\") pod \"apiserver-76f77b778f-bvmh2\" (UID: \"c6c9df4a-394c-4a6c-9132-5fefe0ed672d\") " pod="openshift-apiserver/apiserver-76f77b778f-bvmh2" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.340161 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjvzp\" (UniqueName: \"kubernetes.io/projected/1d4d3dc1-19e8-4648-907c-ede5dd5e107e-kube-api-access-jjvzp\") pod \"apiserver-7bbb656c7d-vw7kx\" (UID: \"1d4d3dc1-19e8-4648-907c-ede5dd5e107e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vw7kx" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.340189 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b9f12dc-0021-4479-9b7e-60da4e3f27b0-config\") pod \"machine-approver-56656f9798-jz8jl\" (UID: \"9b9f12dc-0021-4479-9b7e-60da4e3f27b0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jz8jl" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.340410 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1d4d3dc1-19e8-4648-907c-ede5dd5e107e-audit-policies\") pod \"apiserver-7bbb656c7d-vw7kx\" (UID: \"1d4d3dc1-19e8-4648-907c-ede5dd5e107e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vw7kx" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.340435 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/834a438d-9f7d-4707-b790-8ce136081f7c-client-ca\") pod \"controller-manager-879f6c89f-cncqb\" (UID: \"834a438d-9f7d-4707-b790-8ce136081f7c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cncqb" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.340457 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckplz\" (UniqueName: \"kubernetes.io/projected/b02e39c5-31b4-4444-a500-cd7cbe327bec-kube-api-access-ckplz\") pod \"console-f9d7485db-pdnb5\" (UID: \"b02e39c5-31b4-4444-a500-cd7cbe327bec\") " pod="openshift-console/console-f9d7485db-pdnb5" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.340472 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d4d3dc1-19e8-4648-907c-ede5dd5e107e-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-vw7kx\" (UID: \"1d4d3dc1-19e8-4648-907c-ede5dd5e107e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vw7kx" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.340506 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/11ca5edb-7664-4e63-a9e8-46f270623ad2-images\") pod \"machine-api-operator-5694c8668f-l928g\" (UID: \"11ca5edb-7664-4e63-a9e8-46f270623ad2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-l928g" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.340522 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dd1d7a59-ae42-455f-9617-9fa7eb152ce9-trusted-ca\") pod \"console-operator-58897d9998-4bbwg\" (UID: \"dd1d7a59-ae42-455f-9617-9fa7eb152ce9\") " pod="openshift-console-operator/console-operator-58897d9998-4bbwg" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.340541 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/834a438d-9f7d-4707-b790-8ce136081f7c-serving-cert\") pod \"controller-manager-879f6c89f-cncqb\" (UID: \"834a438d-9f7d-4707-b790-8ce136081f7c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cncqb" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.340566 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/214b71b9-85d3-4be2-be71-311e9097f89b-metrics-tls\") pod \"dns-operator-744455d44c-484jg\" (UID: \"214b71b9-85d3-4be2-be71-311e9097f89b\") " pod="openshift-dns-operator/dns-operator-744455d44c-484jg" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.340587 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/954b7859-3b75-48f2-b1ed-d70932da35ba-available-featuregates\") pod \"openshift-config-operator-7777fb866f-sbrpp\" (UID: \"954b7859-3b75-48f2-b1ed-d70932da35ba\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-sbrpp" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.340608 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1d4d3dc1-19e8-4648-907c-ede5dd5e107e-etcd-serving-ca\") pod 
\"apiserver-7bbb656c7d-vw7kx\" (UID: \"1d4d3dc1-19e8-4648-907c-ede5dd5e107e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vw7kx" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.340623 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6f0af025-acfb-4cfa-b413-6159067b8269-etcd-client\") pod \"etcd-operator-b45778765-8p482\" (UID: \"6f0af025-acfb-4cfa-b413-6159067b8269\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8p482" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.340661 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb4b4c10-971e-4766-b632-9f710ec547a6-serving-cert\") pod \"route-controller-manager-6576b87f9c-j9gnc\" (UID: \"eb4b4c10-971e-4766-b632-9f710ec547a6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j9gnc" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.340684 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6f0af025-acfb-4cfa-b413-6159067b8269-etcd-ca\") pod \"etcd-operator-b45778765-8p482\" (UID: \"6f0af025-acfb-4cfa-b413-6159067b8269\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8p482" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.340773 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6f0af025-acfb-4cfa-b413-6159067b8269-etcd-service-ca\") pod \"etcd-operator-b45778765-8p482\" (UID: \"6f0af025-acfb-4cfa-b413-6159067b8269\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8p482" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.340793 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/834a438d-9f7d-4707-b790-8ce136081f7c-config\") pod \"controller-manager-879f6c89f-cncqb\" (UID: \"834a438d-9f7d-4707-b790-8ce136081f7c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cncqb" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.340815 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6c9df4a-394c-4a6c-9132-5fefe0ed672d-config\") pod \"apiserver-76f77b778f-bvmh2\" (UID: \"c6c9df4a-394c-4a6c-9132-5fefe0ed672d\") " pod="openshift-apiserver/apiserver-76f77b778f-bvmh2" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.340883 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c6c9df4a-394c-4a6c-9132-5fefe0ed672d-audit-dir\") pod \"apiserver-76f77b778f-bvmh2\" (UID: \"c6c9df4a-394c-4a6c-9132-5fefe0ed672d\") " pod="openshift-apiserver/apiserver-76f77b778f-bvmh2" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.342305 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-479fl" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.343562 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.345701 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-hpnlq"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.347990 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-hpnlq" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.356170 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.356805 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.357699 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5srbq"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.358269 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-5srbq" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.358325 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.359038 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jg5fm"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.359392 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jg5fm" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.359970 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-zq5jf"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.360300 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-zq5jf" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.361208 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zdlcn"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.361543 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zdlcn" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.363043 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vrtsk"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.363741 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vrtsk" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.363940 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-qmfqn"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.364486 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-qmfqn" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.366175 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wszp4"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.366744 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wszp4" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.367081 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cjzcx"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.367767 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cjzcx" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.368442 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416080-82bgg"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.368999 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-82bgg" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.370554 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-jw92f"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.371103 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-jw92f" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.371577 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.371723 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-s87dr"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.372500 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-s87dr" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.377104 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-fzxff"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.377851 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-888hj"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.378016 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fzxff" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.379188 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jnrj8"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.381210 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-888hj" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.381936 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cncqb"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.382024 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jnrj8" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.382601 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9rm4d"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.385757 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-j9gnc"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.388531 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-l928g"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.389911 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-92b57"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.391401 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8x2kf"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.391575 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.392727 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-4bbwg"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.394004 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-c67m2"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.394636 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-c67m2" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.395407 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fmxl9"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.398034 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-888hj"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.399279 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-8p482"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.400566 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-672dp"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.402073 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-vw7kx"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.403393 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-cpsd4"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.404705 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-pdnb5"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.406255 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-hpnlq"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.407664 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-bvmh2"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.408963 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5srbq"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.410565 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.411493 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-484jg"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.413164 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-sbrpp"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.413781 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-d5xch"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.415036 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vrtsk"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.416099 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sbdkl"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.426713 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-zq5jf"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.428845 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-brbpq"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.429711 4904 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-brbpq" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.430423 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cjzcx"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.430525 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.433726 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-c67m2"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.435080 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-479fl"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.437421 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t22nt"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.439118 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jnrj8"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.440107 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-jw92f"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.441864 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zdlcn"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.443684 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4589bc8e-2864-4d7e-a049-a3bf264bb997-stats-auth\") pod \"router-default-5444994796-tkgxl\" (UID: \"4589bc8e-2864-4d7e-a049-a3bf264bb997\") " pod="openshift-ingress/router-default-5444994796-tkgxl" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.443712 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/95df99ca-9a0a-465c-96b6-968c0a01235a-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-qmfqn\" (UID: \"95df99ca-9a0a-465c-96b6-968c0a01235a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-qmfqn" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.443736 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/214b71b9-85d3-4be2-be71-311e9097f89b-metrics-tls\") pod \"dns-operator-744455d44c-484jg\" (UID: \"214b71b9-85d3-4be2-be71-311e9097f89b\") " pod="openshift-dns-operator/dns-operator-744455d44c-484jg" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.443752 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/954b7859-3b75-48f2-b1ed-d70932da35ba-available-featuregates\") pod \"openshift-config-operator-7777fb866f-sbrpp\" (UID: \"954b7859-3b75-48f2-b1ed-d70932da35ba\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-sbrpp" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.443770 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-w5rf5\" (UniqueName: \"kubernetes.io/projected/93be2498-3695-49d6-8bf4-ebea6010e925-kube-api-access-w5rf5\") pod \"ingress-operator-5b745b69d9-cpsd4\" (UID: \"93be2498-3695-49d6-8bf4-ebea6010e925\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cpsd4" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.443787 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3cb54914-08dc-4044-af38-9375669d5a2a-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-t22nt\" (UID: \"3cb54914-08dc-4044-af38-9375669d5a2a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t22nt" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.443804 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1d4d3dc1-19e8-4648-907c-ede5dd5e107e-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-vw7kx\" (UID: \"1d4d3dc1-19e8-4648-907c-ede5dd5e107e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vw7kx" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.443821 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6f0af025-acfb-4cfa-b413-6159067b8269-etcd-client\") pod \"etcd-operator-b45778765-8p482\" (UID: \"6f0af025-acfb-4cfa-b413-6159067b8269\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8p482" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.443836 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb4b4c10-971e-4766-b632-9f710ec547a6-serving-cert\") pod \"route-controller-manager-6576b87f9c-j9gnc\" (UID: \"eb4b4c10-971e-4766-b632-9f710ec547a6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j9gnc" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.443851 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6f0af025-acfb-4cfa-b413-6159067b8269-etcd-ca\") pod \"etcd-operator-b45778765-8p482\" (UID: \"6f0af025-acfb-4cfa-b413-6159067b8269\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8p482" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.443871 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6f0af025-acfb-4cfa-b413-6159067b8269-etcd-service-ca\") pod \"etcd-operator-b45778765-8p482\" (UID: \"6f0af025-acfb-4cfa-b413-6159067b8269\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8p482" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.443887 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f55d8\" (UniqueName: \"kubernetes.io/projected/4589bc8e-2864-4d7e-a049-a3bf264bb997-kube-api-access-f55d8\") pod \"router-default-5444994796-tkgxl\" (UID: \"4589bc8e-2864-4d7e-a049-a3bf264bb997\") " pod="openshift-ingress/router-default-5444994796-tkgxl" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.443902 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/834a438d-9f7d-4707-b790-8ce136081f7c-config\") pod \"controller-manager-879f6c89f-cncqb\" (UID: 
\"834a438d-9f7d-4707-b790-8ce136081f7c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cncqb" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.443917 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6c9df4a-394c-4a6c-9132-5fefe0ed672d-config\") pod \"apiserver-76f77b778f-bvmh2\" (UID: \"c6c9df4a-394c-4a6c-9132-5fefe0ed672d\") " pod="openshift-apiserver/apiserver-76f77b778f-bvmh2" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.443933 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzbtm\" (UniqueName: \"kubernetes.io/projected/95df99ca-9a0a-465c-96b6-968c0a01235a-kube-api-access-rzbtm\") pod \"multus-admission-controller-857f4d67dd-qmfqn\" (UID: \"95df99ca-9a0a-465c-96b6-968c0a01235a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-qmfqn" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.443953 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c6c9df4a-394c-4a6c-9132-5fefe0ed672d-audit-dir\") pod \"apiserver-76f77b778f-bvmh2\" (UID: \"c6c9df4a-394c-4a6c-9132-5fefe0ed672d\") " pod="openshift-apiserver/apiserver-76f77b778f-bvmh2" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.443969 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b02e39c5-31b4-4444-a500-cd7cbe327bec-console-serving-cert\") pod \"console-f9d7485db-pdnb5\" (UID: \"b02e39c5-31b4-4444-a500-cd7cbe327bec\") " pod="openshift-console/console-f9d7485db-pdnb5" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.443983 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcw4c\" (UniqueName: \"kubernetes.io/projected/834a438d-9f7d-4707-b790-8ce136081f7c-kube-api-access-gcw4c\") pod \"controller-manager-879f6c89f-cncqb\" (UID: \"834a438d-9f7d-4707-b790-8ce136081f7c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cncqb" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.443998 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c6c9df4a-394c-4a6c-9132-5fefe0ed672d-node-pullsecrets\") pod \"apiserver-76f77b778f-bvmh2\" (UID: \"c6c9df4a-394c-4a6c-9132-5fefe0ed672d\") " pod="openshift-apiserver/apiserver-76f77b778f-bvmh2" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444013 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7kc7\" (UniqueName: \"kubernetes.io/projected/9032a5cf-b023-4e91-bc39-52fdd93472ca-kube-api-access-j7kc7\") pod \"cluster-samples-operator-665b6dd947-9rm4d\" (UID: \"9032a5cf-b023-4e91-bc39-52fdd93472ca\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9rm4d" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444029 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/9b9f12dc-0021-4479-9b7e-60da4e3f27b0-machine-approver-tls\") pod \"machine-approver-56656f9798-jz8jl\" (UID: \"9b9f12dc-0021-4479-9b7e-60da4e3f27b0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jz8jl" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444044 4904 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb4b4c10-971e-4766-b632-9f710ec547a6-config\") pod \"route-controller-manager-6576b87f9c-j9gnc\" (UID: \"eb4b4c10-971e-4766-b632-9f710ec547a6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j9gnc" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444076 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f64ln\" (UniqueName: \"kubernetes.io/projected/6303238b-f708-4f64-bacb-6a68e7a14425-kube-api-access-f64ln\") pod \"olm-operator-6b444d44fb-vrtsk\" (UID: \"6303238b-f708-4f64-bacb-6a68e7a14425\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vrtsk" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444094 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qntm8\" (UniqueName: \"kubernetes.io/projected/9b9f12dc-0021-4479-9b7e-60da4e3f27b0-kube-api-access-qntm8\") pod \"machine-approver-56656f9798-jz8jl\" (UID: \"9b9f12dc-0021-4479-9b7e-60da4e3f27b0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jz8jl" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444109 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eb4b4c10-971e-4766-b632-9f710ec547a6-client-ca\") pod \"route-controller-manager-6576b87f9c-j9gnc\" (UID: \"eb4b4c10-971e-4766-b632-9f710ec547a6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j9gnc" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444151 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c6c9df4a-394c-4a6c-9132-5fefe0ed672d-audit\") pod \"apiserver-76f77b778f-bvmh2\" (UID: \"c6c9df4a-394c-4a6c-9132-5fefe0ed672d\") " pod="openshift-apiserver/apiserver-76f77b778f-bvmh2" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444180 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6303238b-f708-4f64-bacb-6a68e7a14425-srv-cert\") pod \"olm-operator-6b444d44fb-vrtsk\" (UID: \"6303238b-f708-4f64-bacb-6a68e7a14425\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vrtsk" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444210 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dm6j\" (UniqueName: \"kubernetes.io/projected/dd1d7a59-ae42-455f-9617-9fa7eb152ce9-kube-api-access-9dm6j\") pod \"console-operator-58897d9998-4bbwg\" (UID: \"dd1d7a59-ae42-455f-9617-9fa7eb152ce9\") " pod="openshift-console-operator/console-operator-58897d9998-4bbwg" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444239 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f0af025-acfb-4cfa-b413-6159067b8269-serving-cert\") pod \"etcd-operator-b45778765-8p482\" (UID: \"6f0af025-acfb-4cfa-b413-6159067b8269\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8p482" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444260 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/954b7859-3b75-48f2-b1ed-d70932da35ba-serving-cert\") pod 
\"openshift-config-operator-7777fb866f-sbrpp\" (UID: \"954b7859-3b75-48f2-b1ed-d70932da35ba\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-sbrpp" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444280 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/93be2498-3695-49d6-8bf4-ebea6010e925-metrics-tls\") pod \"ingress-operator-5b745b69d9-cpsd4\" (UID: \"93be2498-3695-49d6-8bf4-ebea6010e925\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cpsd4" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444296 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/93be2498-3695-49d6-8bf4-ebea6010e925-trusted-ca\") pod \"ingress-operator-5b745b69d9-cpsd4\" (UID: \"93be2498-3695-49d6-8bf4-ebea6010e925\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cpsd4" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444312 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1d4d3dc1-19e8-4648-907c-ede5dd5e107e-audit-dir\") pod \"apiserver-7bbb656c7d-vw7kx\" (UID: \"1d4d3dc1-19e8-4648-907c-ede5dd5e107e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vw7kx" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444328 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11ca5edb-7664-4e63-a9e8-46f270623ad2-config\") pod \"machine-api-operator-5694c8668f-l928g\" (UID: \"11ca5edb-7664-4e63-a9e8-46f270623ad2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-l928g" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444344 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd1d7a59-ae42-455f-9617-9fa7eb152ce9-config\") pod \"console-operator-58897d9998-4bbwg\" (UID: \"dd1d7a59-ae42-455f-9617-9fa7eb152ce9\") " pod="openshift-console-operator/console-operator-58897d9998-4bbwg" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444366 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c6c9df4a-394c-4a6c-9132-5fefe0ed672d-etcd-serving-ca\") pod \"apiserver-76f77b778f-bvmh2\" (UID: \"c6c9df4a-394c-4a6c-9132-5fefe0ed672d\") " pod="openshift-apiserver/apiserver-76f77b778f-bvmh2" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444383 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb68w\" (UniqueName: \"kubernetes.io/projected/c6c9df4a-394c-4a6c-9132-5fefe0ed672d-kube-api-access-zb68w\") pod \"apiserver-76f77b778f-bvmh2\" (UID: \"c6c9df4a-394c-4a6c-9132-5fefe0ed672d\") " pod="openshift-apiserver/apiserver-76f77b778f-bvmh2" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444400 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bd0c7fa-69da-45b6-8008-8fc080ca4ba6-config\") pod \"kube-controller-manager-operator-78b949d7b-jnrj8\" (UID: \"6bd0c7fa-69da-45b6-8008-8fc080ca4ba6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jnrj8" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444416 
4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28306b3f-ee7a-48be-905b-3ba0e6d7ef74-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-fmxl9\" (UID: \"28306b3f-ee7a-48be-905b-3ba0e6d7ef74\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fmxl9" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444434 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmpq6\" (UniqueName: \"kubernetes.io/projected/11ca5edb-7664-4e63-a9e8-46f270623ad2-kube-api-access-rmpq6\") pod \"machine-api-operator-5694c8668f-l928g\" (UID: \"11ca5edb-7664-4e63-a9e8-46f270623ad2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-l928g" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444450 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9032a5cf-b023-4e91-bc39-52fdd93472ca-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-9rm4d\" (UID: \"9032a5cf-b023-4e91-bc39-52fdd93472ca\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9rm4d" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444466 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfkvt\" (UniqueName: \"kubernetes.io/projected/954b7859-3b75-48f2-b1ed-d70932da35ba-kube-api-access-cfkvt\") pod \"openshift-config-operator-7777fb866f-sbrpp\" (UID: \"954b7859-3b75-48f2-b1ed-d70932da35ba\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-sbrpp" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444481 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f0af025-acfb-4cfa-b413-6159067b8269-config\") pod \"etcd-operator-b45778765-8p482\" (UID: \"6f0af025-acfb-4cfa-b413-6159067b8269\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8p482" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444495 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9mrn\" (UniqueName: \"kubernetes.io/projected/214b71b9-85d3-4be2-be71-311e9097f89b-kube-api-access-h9mrn\") pod \"dns-operator-744455d44c-484jg\" (UID: \"214b71b9-85d3-4be2-be71-311e9097f89b\") " pod="openshift-dns-operator/dns-operator-744455d44c-484jg" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444510 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b02e39c5-31b4-4444-a500-cd7cbe327bec-console-oauth-config\") pod \"console-f9d7485db-pdnb5\" (UID: \"b02e39c5-31b4-4444-a500-cd7cbe327bec\") " pod="openshift-console/console-f9d7485db-pdnb5" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444537 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b02e39c5-31b4-4444-a500-cd7cbe327bec-trusted-ca-bundle\") pod \"console-f9d7485db-pdnb5\" (UID: \"b02e39c5-31b4-4444-a500-cd7cbe327bec\") " pod="openshift-console/console-f9d7485db-pdnb5" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444551 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/c6c9df4a-394c-4a6c-9132-5fefe0ed672d-image-import-ca\") pod \"apiserver-76f77b778f-bvmh2\" (UID: \"c6c9df4a-394c-4a6c-9132-5fefe0ed672d\") " pod="openshift-apiserver/apiserver-76f77b778f-bvmh2" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444568 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5bpx\" (UniqueName: \"kubernetes.io/projected/67c5324d-bb68-4bbc-a0f6-e0452d6f4155-kube-api-access-z5bpx\") pod \"migrator-59844c95c7-fzxff\" (UID: \"67c5324d-bb68-4bbc-a0f6-e0452d6f4155\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fzxff" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444585 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9b9f12dc-0021-4479-9b7e-60da4e3f27b0-auth-proxy-config\") pod \"machine-approver-56656f9798-jz8jl\" (UID: \"9b9f12dc-0021-4479-9b7e-60da4e3f27b0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jz8jl" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444600 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bd0c7fa-69da-45b6-8008-8fc080ca4ba6-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-jnrj8\" (UID: \"6bd0c7fa-69da-45b6-8008-8fc080ca4ba6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jnrj8" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444615 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1d4d3dc1-19e8-4648-907c-ede5dd5e107e-encryption-config\") pod \"apiserver-7bbb656c7d-vw7kx\" (UID: \"1d4d3dc1-19e8-4648-907c-ede5dd5e107e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vw7kx" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444631 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28306b3f-ee7a-48be-905b-3ba0e6d7ef74-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-fmxl9\" (UID: \"28306b3f-ee7a-48be-905b-3ba0e6d7ef74\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fmxl9" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444646 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd1d7a59-ae42-455f-9617-9fa7eb152ce9-serving-cert\") pod \"console-operator-58897d9998-4bbwg\" (UID: \"dd1d7a59-ae42-455f-9617-9fa7eb152ce9\") " pod="openshift-console-operator/console-operator-58897d9998-4bbwg" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444660 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhnpg\" (UniqueName: \"kubernetes.io/projected/eb4b4c10-971e-4766-b632-9f710ec547a6-kube-api-access-fhnpg\") pod \"route-controller-manager-6576b87f9c-j9gnc\" (UID: \"eb4b4c10-971e-4766-b632-9f710ec547a6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j9gnc" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444674 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b02e39c5-31b4-4444-a500-cd7cbe327bec-console-config\") pod 
\"console-f9d7485db-pdnb5\" (UID: \"b02e39c5-31b4-4444-a500-cd7cbe327bec\") " pod="openshift-console/console-f9d7485db-pdnb5" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444687 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b02e39c5-31b4-4444-a500-cd7cbe327bec-service-ca\") pod \"console-f9d7485db-pdnb5\" (UID: \"b02e39c5-31b4-4444-a500-cd7cbe327bec\") " pod="openshift-console/console-f9d7485db-pdnb5" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444702 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/11ca5edb-7664-4e63-a9e8-46f270623ad2-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-l928g\" (UID: \"11ca5edb-7664-4e63-a9e8-46f270623ad2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-l928g" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444716 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6c9df4a-394c-4a6c-9132-5fefe0ed672d-serving-cert\") pod \"apiserver-76f77b778f-bvmh2\" (UID: \"c6c9df4a-394c-4a6c-9132-5fefe0ed672d\") " pod="openshift-apiserver/apiserver-76f77b778f-bvmh2" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444731 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6c9df4a-394c-4a6c-9132-5fefe0ed672d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-bvmh2\" (UID: \"c6c9df4a-394c-4a6c-9132-5fefe0ed672d\") " pod="openshift-apiserver/apiserver-76f77b778f-bvmh2" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444747 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcwfp\" (UniqueName: \"kubernetes.io/projected/28306b3f-ee7a-48be-905b-3ba0e6d7ef74-kube-api-access-wcwfp\") pod \"openshift-controller-manager-operator-756b6f6bc6-fmxl9\" (UID: \"28306b3f-ee7a-48be-905b-3ba0e6d7ef74\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fmxl9" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444761 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1d4d3dc1-19e8-4648-907c-ede5dd5e107e-etcd-client\") pod \"apiserver-7bbb656c7d-vw7kx\" (UID: \"1d4d3dc1-19e8-4648-907c-ede5dd5e107e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vw7kx" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444776 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d4d3dc1-19e8-4648-907c-ede5dd5e107e-serving-cert\") pod \"apiserver-7bbb656c7d-vw7kx\" (UID: \"1d4d3dc1-19e8-4648-907c-ede5dd5e107e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vw7kx" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444792 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/93be2498-3695-49d6-8bf4-ebea6010e925-bound-sa-token\") pod \"ingress-operator-5b745b69d9-cpsd4\" (UID: \"93be2498-3695-49d6-8bf4-ebea6010e925\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cpsd4" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444808 4904 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/834a438d-9f7d-4707-b790-8ce136081f7c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-cncqb\" (UID: \"834a438d-9f7d-4707-b790-8ce136081f7c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cncqb" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444823 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c6c9df4a-394c-4a6c-9132-5fefe0ed672d-encryption-config\") pod \"apiserver-76f77b778f-bvmh2\" (UID: \"c6c9df4a-394c-4a6c-9132-5fefe0ed672d\") " pod="openshift-apiserver/apiserver-76f77b778f-bvmh2" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444837 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6bd0c7fa-69da-45b6-8008-8fc080ca4ba6-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-jnrj8\" (UID: \"6bd0c7fa-69da-45b6-8008-8fc080ca4ba6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jnrj8" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444852 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cb54914-08dc-4044-af38-9375669d5a2a-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-t22nt\" (UID: \"3cb54914-08dc-4044-af38-9375669d5a2a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t22nt" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444868 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln9q9\" (UniqueName: \"kubernetes.io/projected/3cb54914-08dc-4044-af38-9375669d5a2a-kube-api-access-ln9q9\") pod \"kube-storage-version-migrator-operator-b67b599dd-t22nt\" (UID: \"3cb54914-08dc-4044-af38-9375669d5a2a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t22nt" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444887 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b02e39c5-31b4-4444-a500-cd7cbe327bec-oauth-serving-cert\") pod \"console-f9d7485db-pdnb5\" (UID: \"b02e39c5-31b4-4444-a500-cd7cbe327bec\") " pod="openshift-console/console-f9d7485db-pdnb5" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444904 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbjcj\" (UniqueName: \"kubernetes.io/projected/6f0af025-acfb-4cfa-b413-6159067b8269-kube-api-access-wbjcj\") pod \"etcd-operator-b45778765-8p482\" (UID: \"6f0af025-acfb-4cfa-b413-6159067b8269\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8p482" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444929 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4589bc8e-2864-4d7e-a049-a3bf264bb997-default-certificate\") pod \"router-default-5444994796-tkgxl\" (UID: \"4589bc8e-2864-4d7e-a049-a3bf264bb997\") " pod="openshift-ingress/router-default-5444994796-tkgxl" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444945 4904 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6303238b-f708-4f64-bacb-6a68e7a14425-profile-collector-cert\") pod \"olm-operator-6b444d44fb-vrtsk\" (UID: \"6303238b-f708-4f64-bacb-6a68e7a14425\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vrtsk" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444959 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c6c9df4a-394c-4a6c-9132-5fefe0ed672d-etcd-client\") pod \"apiserver-76f77b778f-bvmh2\" (UID: \"c6c9df4a-394c-4a6c-9132-5fefe0ed672d\") " pod="openshift-apiserver/apiserver-76f77b778f-bvmh2" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444975 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjvzp\" (UniqueName: \"kubernetes.io/projected/1d4d3dc1-19e8-4648-907c-ede5dd5e107e-kube-api-access-jjvzp\") pod \"apiserver-7bbb656c7d-vw7kx\" (UID: \"1d4d3dc1-19e8-4648-907c-ede5dd5e107e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vw7kx" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.444990 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b9f12dc-0021-4479-9b7e-60da4e3f27b0-config\") pod \"machine-approver-56656f9798-jz8jl\" (UID: \"9b9f12dc-0021-4479-9b7e-60da4e3f27b0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jz8jl" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.445005 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1d4d3dc1-19e8-4648-907c-ede5dd5e107e-audit-policies\") pod \"apiserver-7bbb656c7d-vw7kx\" (UID: \"1d4d3dc1-19e8-4648-907c-ede5dd5e107e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vw7kx" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.445020 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/834a438d-9f7d-4707-b790-8ce136081f7c-client-ca\") pod \"controller-manager-879f6c89f-cncqb\" (UID: \"834a438d-9f7d-4707-b790-8ce136081f7c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cncqb" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.445036 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckplz\" (UniqueName: \"kubernetes.io/projected/b02e39c5-31b4-4444-a500-cd7cbe327bec-kube-api-access-ckplz\") pod \"console-f9d7485db-pdnb5\" (UID: \"b02e39c5-31b4-4444-a500-cd7cbe327bec\") " pod="openshift-console/console-f9d7485db-pdnb5" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.445050 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d4d3dc1-19e8-4648-907c-ede5dd5e107e-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-vw7kx\" (UID: \"1d4d3dc1-19e8-4648-907c-ede5dd5e107e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vw7kx" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.445091 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/11ca5edb-7664-4e63-a9e8-46f270623ad2-images\") pod \"machine-api-operator-5694c8668f-l928g\" (UID: \"11ca5edb-7664-4e63-a9e8-46f270623ad2\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-l928g" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.445105 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dd1d7a59-ae42-455f-9617-9fa7eb152ce9-trusted-ca\") pod \"console-operator-58897d9998-4bbwg\" (UID: \"dd1d7a59-ae42-455f-9617-9fa7eb152ce9\") " pod="openshift-console-operator/console-operator-58897d9998-4bbwg" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.445119 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/834a438d-9f7d-4707-b790-8ce136081f7c-serving-cert\") pod \"controller-manager-879f6c89f-cncqb\" (UID: \"834a438d-9f7d-4707-b790-8ce136081f7c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cncqb" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.445134 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4589bc8e-2864-4d7e-a049-a3bf264bb997-service-ca-bundle\") pod \"router-default-5444994796-tkgxl\" (UID: \"4589bc8e-2864-4d7e-a049-a3bf264bb997\") " pod="openshift-ingress/router-default-5444994796-tkgxl" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.445151 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4589bc8e-2864-4d7e-a049-a3bf264bb997-metrics-certs\") pod \"router-default-5444994796-tkgxl\" (UID: \"4589bc8e-2864-4d7e-a049-a3bf264bb997\") " pod="openshift-ingress/router-default-5444994796-tkgxl" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.446751 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-fzxff"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.447123 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1d4d3dc1-19e8-4648-907c-ede5dd5e107e-audit-dir\") pod \"apiserver-7bbb656c7d-vw7kx\" (UID: \"1d4d3dc1-19e8-4648-907c-ede5dd5e107e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vw7kx" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.447643 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/954b7859-3b75-48f2-b1ed-d70932da35ba-available-featuregates\") pod \"openshift-config-operator-7777fb866f-sbrpp\" (UID: \"954b7859-3b75-48f2-b1ed-d70932da35ba\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-sbrpp" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.447789 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/834a438d-9f7d-4707-b790-8ce136081f7c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-cncqb\" (UID: \"834a438d-9f7d-4707-b790-8ce136081f7c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cncqb" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.448013 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-qmfqn"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.448205 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/eb4b4c10-971e-4766-b632-9f710ec547a6-client-ca\") pod \"route-controller-manager-6576b87f9c-j9gnc\" (UID: \"eb4b4c10-971e-4766-b632-9f710ec547a6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j9gnc" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.448421 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b02e39c5-31b4-4444-a500-cd7cbe327bec-oauth-serving-cert\") pod \"console-f9d7485db-pdnb5\" (UID: \"b02e39c5-31b4-4444-a500-cd7cbe327bec\") " pod="openshift-console/console-f9d7485db-pdnb5" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.448508 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b02e39c5-31b4-4444-a500-cd7cbe327bec-trusted-ca-bundle\") pod \"console-f9d7485db-pdnb5\" (UID: \"b02e39c5-31b4-4444-a500-cd7cbe327bec\") " pod="openshift-console/console-f9d7485db-pdnb5" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.448972 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1d4d3dc1-19e8-4648-907c-ede5dd5e107e-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-vw7kx\" (UID: \"1d4d3dc1-19e8-4648-907c-ede5dd5e107e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vw7kx" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.449150 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c6c9df4a-394c-4a6c-9132-5fefe0ed672d-audit\") pod \"apiserver-76f77b778f-bvmh2\" (UID: \"c6c9df4a-394c-4a6c-9132-5fefe0ed672d\") " pod="openshift-apiserver/apiserver-76f77b778f-bvmh2" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.449551 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6c9df4a-394c-4a6c-9132-5fefe0ed672d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-bvmh2\" (UID: \"c6c9df4a-394c-4a6c-9132-5fefe0ed672d\") " pod="openshift-apiserver/apiserver-76f77b778f-bvmh2" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.449666 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c6c9df4a-394c-4a6c-9132-5fefe0ed672d-image-import-ca\") pod \"apiserver-76f77b778f-bvmh2\" (UID: \"c6c9df4a-394c-4a6c-9132-5fefe0ed672d\") " pod="openshift-apiserver/apiserver-76f77b778f-bvmh2" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.449773 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f0af025-acfb-4cfa-b413-6159067b8269-config\") pod \"etcd-operator-b45778765-8p482\" (UID: \"6f0af025-acfb-4cfa-b413-6159067b8269\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8p482" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.450401 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9b9f12dc-0021-4479-9b7e-60da4e3f27b0-auth-proxy-config\") pod \"machine-approver-56656f9798-jz8jl\" (UID: \"9b9f12dc-0021-4479-9b7e-60da4e3f27b0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jz8jl" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.450694 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/c6c9df4a-394c-4a6c-9132-5fefe0ed672d-serving-cert\") pod \"apiserver-76f77b778f-bvmh2\" (UID: \"c6c9df4a-394c-4a6c-9132-5fefe0ed672d\") " pod="openshift-apiserver/apiserver-76f77b778f-bvmh2" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.451215 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c6c9df4a-394c-4a6c-9132-5fefe0ed672d-encryption-config\") pod \"apiserver-76f77b778f-bvmh2\" (UID: \"c6c9df4a-394c-4a6c-9132-5fefe0ed672d\") " pod="openshift-apiserver/apiserver-76f77b778f-bvmh2" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.451437 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b02e39c5-31b4-4444-a500-cd7cbe327bec-console-config\") pod \"console-f9d7485db-pdnb5\" (UID: \"b02e39c5-31b4-4444-a500-cd7cbe327bec\") " pod="openshift-console/console-f9d7485db-pdnb5" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.451495 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b02e39c5-31b4-4444-a500-cd7cbe327bec-console-oauth-config\") pod \"console-f9d7485db-pdnb5\" (UID: \"b02e39c5-31b4-4444-a500-cd7cbe327bec\") " pod="openshift-console/console-f9d7485db-pdnb5" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.451852 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b9f12dc-0021-4479-9b7e-60da4e3f27b0-config\") pod \"machine-approver-56656f9798-jz8jl\" (UID: \"9b9f12dc-0021-4479-9b7e-60da4e3f27b0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jz8jl" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.452170 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.452244 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b02e39c5-31b4-4444-a500-cd7cbe327bec-service-ca\") pod \"console-f9d7485db-pdnb5\" (UID: \"b02e39c5-31b4-4444-a500-cd7cbe327bec\") " pod="openshift-console/console-f9d7485db-pdnb5" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.452356 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1d4d3dc1-19e8-4648-907c-ede5dd5e107e-audit-policies\") pod \"apiserver-7bbb656c7d-vw7kx\" (UID: \"1d4d3dc1-19e8-4648-907c-ede5dd5e107e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vw7kx" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.452460 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11ca5edb-7664-4e63-a9e8-46f270623ad2-config\") pod \"machine-api-operator-5694c8668f-l928g\" (UID: \"11ca5edb-7664-4e63-a9e8-46f270623ad2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-l928g" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.452862 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f0af025-acfb-4cfa-b413-6159067b8269-serving-cert\") pod \"etcd-operator-b45778765-8p482\" (UID: \"6f0af025-acfb-4cfa-b413-6159067b8269\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8p482" Dec 05 20:13:57 crc 
kubenswrapper[4904]: I1205 20:13:57.453132 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1d4d3dc1-19e8-4648-907c-ede5dd5e107e-encryption-config\") pod \"apiserver-7bbb656c7d-vw7kx\" (UID: \"1d4d3dc1-19e8-4648-907c-ede5dd5e107e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vw7kx" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.453292 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd1d7a59-ae42-455f-9617-9fa7eb152ce9-config\") pod \"console-operator-58897d9998-4bbwg\" (UID: \"dd1d7a59-ae42-455f-9617-9fa7eb152ce9\") " pod="openshift-console-operator/console-operator-58897d9998-4bbwg" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.453346 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/834a438d-9f7d-4707-b790-8ce136081f7c-client-ca\") pod \"controller-manager-879f6c89f-cncqb\" (UID: \"834a438d-9f7d-4707-b790-8ce136081f7c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cncqb" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.453793 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28306b3f-ee7a-48be-905b-3ba0e6d7ef74-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-fmxl9\" (UID: \"28306b3f-ee7a-48be-905b-3ba0e6d7ef74\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fmxl9" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.453862 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c6c9df4a-394c-4a6c-9132-5fefe0ed672d-etcd-serving-ca\") pod \"apiserver-76f77b778f-bvmh2\" (UID: \"c6c9df4a-394c-4a6c-9132-5fefe0ed672d\") " pod="openshift-apiserver/apiserver-76f77b778f-bvmh2" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.453885 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d4d3dc1-19e8-4648-907c-ede5dd5e107e-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-vw7kx\" (UID: \"1d4d3dc1-19e8-4648-907c-ede5dd5e107e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vw7kx" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.454157 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c6c9df4a-394c-4a6c-9132-5fefe0ed672d-audit-dir\") pod \"apiserver-76f77b778f-bvmh2\" (UID: \"c6c9df4a-394c-4a6c-9132-5fefe0ed672d\") " pod="openshift-apiserver/apiserver-76f77b778f-bvmh2" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.454448 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hdk86"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.454492 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wszp4"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.454506 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416080-82bgg"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.454534 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/11ca5edb-7664-4e63-a9e8-46f270623ad2-images\") pod \"machine-api-operator-5694c8668f-l928g\" (UID: \"11ca5edb-7664-4e63-a9e8-46f270623ad2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-l928g" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.454773 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6f0af025-acfb-4cfa-b413-6159067b8269-etcd-ca\") pod \"etcd-operator-b45778765-8p482\" (UID: \"6f0af025-acfb-4cfa-b413-6159067b8269\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8p482" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.455089 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/11ca5edb-7664-4e63-a9e8-46f270623ad2-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-l928g\" (UID: \"11ca5edb-7664-4e63-a9e8-46f270623ad2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-l928g" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.455324 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6f0af025-acfb-4cfa-b413-6159067b8269-etcd-service-ca\") pod \"etcd-operator-b45778765-8p482\" (UID: \"6f0af025-acfb-4cfa-b413-6159067b8269\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8p482" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.455469 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dd1d7a59-ae42-455f-9617-9fa7eb152ce9-trusted-ca\") pod \"console-operator-58897d9998-4bbwg\" (UID: \"dd1d7a59-ae42-455f-9617-9fa7eb152ce9\") " pod="openshift-console-operator/console-operator-58897d9998-4bbwg" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.455478 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6f0af025-acfb-4cfa-b413-6159067b8269-etcd-client\") pod \"etcd-operator-b45778765-8p482\" (UID: \"6f0af025-acfb-4cfa-b413-6159067b8269\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8p482" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.455499 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb4b4c10-971e-4766-b632-9f710ec547a6-serving-cert\") pod \"route-controller-manager-6576b87f9c-j9gnc\" (UID: \"eb4b4c10-971e-4766-b632-9f710ec547a6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j9gnc" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.455545 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c6c9df4a-394c-4a6c-9132-5fefe0ed672d-node-pullsecrets\") pod \"apiserver-76f77b778f-bvmh2\" (UID: \"c6c9df4a-394c-4a6c-9132-5fefe0ed672d\") " pod="openshift-apiserver/apiserver-76f77b778f-bvmh2" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.456450 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd1d7a59-ae42-455f-9617-9fa7eb152ce9-serving-cert\") pod \"console-operator-58897d9998-4bbwg\" (UID: \"dd1d7a59-ae42-455f-9617-9fa7eb152ce9\") " pod="openshift-console-operator/console-operator-58897d9998-4bbwg" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.456709 4904 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jg5fm"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.456894 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/214b71b9-85d3-4be2-be71-311e9097f89b-metrics-tls\") pod \"dns-operator-744455d44c-484jg\" (UID: \"214b71b9-85d3-4be2-be71-311e9097f89b\") " pod="openshift-dns-operator/dns-operator-744455d44c-484jg" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.457397 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/834a438d-9f7d-4707-b790-8ce136081f7c-config\") pod \"controller-manager-879f6c89f-cncqb\" (UID: \"834a438d-9f7d-4707-b790-8ce136081f7c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cncqb" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.457682 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb4b4c10-971e-4766-b632-9f710ec547a6-config\") pod \"route-controller-manager-6576b87f9c-j9gnc\" (UID: \"eb4b4c10-971e-4766-b632-9f710ec547a6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j9gnc" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.457849 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28306b3f-ee7a-48be-905b-3ba0e6d7ef74-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-fmxl9\" (UID: \"28306b3f-ee7a-48be-905b-3ba0e6d7ef74\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fmxl9" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.458250 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/954b7859-3b75-48f2-b1ed-d70932da35ba-serving-cert\") pod \"openshift-config-operator-7777fb866f-sbrpp\" (UID: \"954b7859-3b75-48f2-b1ed-d70932da35ba\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-sbrpp" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.458636 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b02e39c5-31b4-4444-a500-cd7cbe327bec-console-serving-cert\") pod \"console-f9d7485db-pdnb5\" (UID: \"b02e39c5-31b4-4444-a500-cd7cbe327bec\") " pod="openshift-console/console-f9d7485db-pdnb5" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.459077 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1d4d3dc1-19e8-4648-907c-ede5dd5e107e-etcd-client\") pod \"apiserver-7bbb656c7d-vw7kx\" (UID: \"1d4d3dc1-19e8-4648-907c-ede5dd5e107e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vw7kx" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.459221 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-s87dr"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.459555 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/9b9f12dc-0021-4479-9b7e-60da4e3f27b0-machine-approver-tls\") pod \"machine-approver-56656f9798-jz8jl\" (UID: \"9b9f12dc-0021-4479-9b7e-60da4e3f27b0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jz8jl" Dec 05 
20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.459942 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c6c9df4a-394c-4a6c-9132-5fefe0ed672d-etcd-client\") pod \"apiserver-76f77b778f-bvmh2\" (UID: \"c6c9df4a-394c-4a6c-9132-5fefe0ed672d\") " pod="openshift-apiserver/apiserver-76f77b778f-bvmh2" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.460032 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d4d3dc1-19e8-4648-907c-ede5dd5e107e-serving-cert\") pod \"apiserver-7bbb656c7d-vw7kx\" (UID: \"1d4d3dc1-19e8-4648-907c-ede5dd5e107e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vw7kx" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.460031 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6c9df4a-394c-4a6c-9132-5fefe0ed672d-config\") pod \"apiserver-76f77b778f-bvmh2\" (UID: \"c6c9df4a-394c-4a6c-9132-5fefe0ed672d\") " pod="openshift-apiserver/apiserver-76f77b778f-bvmh2" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.460195 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/834a438d-9f7d-4707-b790-8ce136081f7c-serving-cert\") pod \"controller-manager-879f6c89f-cncqb\" (UID: \"834a438d-9f7d-4707-b790-8ce136081f7c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cncqb" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.460311 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9032a5cf-b023-4e91-bc39-52fdd93472ca-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-9rm4d\" (UID: \"9032a5cf-b023-4e91-bc39-52fdd93472ca\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9rm4d" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.460549 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-z6nb9"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.462545 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-z6nb9"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.462647 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-z6nb9" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.470790 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.490856 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.511656 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.519252 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-ptsz7"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.520275 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-ptsz7" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.527649 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-ptsz7"] Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.531210 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.546661 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/93be2498-3695-49d6-8bf4-ebea6010e925-metrics-tls\") pod \"ingress-operator-5b745b69d9-cpsd4\" (UID: \"93be2498-3695-49d6-8bf4-ebea6010e925\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cpsd4" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.546694 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/93be2498-3695-49d6-8bf4-ebea6010e925-trusted-ca\") pod \"ingress-operator-5b745b69d9-cpsd4\" (UID: \"93be2498-3695-49d6-8bf4-ebea6010e925\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cpsd4" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.546727 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bd0c7fa-69da-45b6-8008-8fc080ca4ba6-config\") pod \"kube-controller-manager-operator-78b949d7b-jnrj8\" (UID: \"6bd0c7fa-69da-45b6-8008-8fc080ca4ba6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jnrj8" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.546762 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5bpx\" (UniqueName: \"kubernetes.io/projected/67c5324d-bb68-4bbc-a0f6-e0452d6f4155-kube-api-access-z5bpx\") pod \"migrator-59844c95c7-fzxff\" (UID: \"67c5324d-bb68-4bbc-a0f6-e0452d6f4155\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fzxff" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.546778 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bd0c7fa-69da-45b6-8008-8fc080ca4ba6-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-jnrj8\" (UID: \"6bd0c7fa-69da-45b6-8008-8fc080ca4ba6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jnrj8" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.546811 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/93be2498-3695-49d6-8bf4-ebea6010e925-bound-sa-token\") pod \"ingress-operator-5b745b69d9-cpsd4\" (UID: \"93be2498-3695-49d6-8bf4-ebea6010e925\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cpsd4" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.546827 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6bd0c7fa-69da-45b6-8008-8fc080ca4ba6-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-jnrj8\" (UID: \"6bd0c7fa-69da-45b6-8008-8fc080ca4ba6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jnrj8" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.546842 4904 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cb54914-08dc-4044-af38-9375669d5a2a-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-t22nt\" (UID: \"3cb54914-08dc-4044-af38-9375669d5a2a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t22nt" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.547022 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ln9q9\" (UniqueName: \"kubernetes.io/projected/3cb54914-08dc-4044-af38-9375669d5a2a-kube-api-access-ln9q9\") pod \"kube-storage-version-migrator-operator-b67b599dd-t22nt\" (UID: \"3cb54914-08dc-4044-af38-9375669d5a2a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t22nt" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.547147 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4589bc8e-2864-4d7e-a049-a3bf264bb997-default-certificate\") pod \"router-default-5444994796-tkgxl\" (UID: \"4589bc8e-2864-4d7e-a049-a3bf264bb997\") " pod="openshift-ingress/router-default-5444994796-tkgxl" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.547173 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6303238b-f708-4f64-bacb-6a68e7a14425-profile-collector-cert\") pod \"olm-operator-6b444d44fb-vrtsk\" (UID: \"6303238b-f708-4f64-bacb-6a68e7a14425\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vrtsk" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.547213 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4589bc8e-2864-4d7e-a049-a3bf264bb997-service-ca-bundle\") pod \"router-default-5444994796-tkgxl\" (UID: \"4589bc8e-2864-4d7e-a049-a3bf264bb997\") " pod="openshift-ingress/router-default-5444994796-tkgxl" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.547229 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4589bc8e-2864-4d7e-a049-a3bf264bb997-metrics-certs\") pod \"router-default-5444994796-tkgxl\" (UID: \"4589bc8e-2864-4d7e-a049-a3bf264bb997\") " pod="openshift-ingress/router-default-5444994796-tkgxl" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.547245 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4589bc8e-2864-4d7e-a049-a3bf264bb997-stats-auth\") pod \"router-default-5444994796-tkgxl\" (UID: \"4589bc8e-2864-4d7e-a049-a3bf264bb997\") " pod="openshift-ingress/router-default-5444994796-tkgxl" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.547260 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/95df99ca-9a0a-465c-96b6-968c0a01235a-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-qmfqn\" (UID: \"95df99ca-9a0a-465c-96b6-968c0a01235a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-qmfqn" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.547276 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3cb54914-08dc-4044-af38-9375669d5a2a-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-t22nt\" (UID: \"3cb54914-08dc-4044-af38-9375669d5a2a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t22nt" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.547299 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5rf5\" (UniqueName: \"kubernetes.io/projected/93be2498-3695-49d6-8bf4-ebea6010e925-kube-api-access-w5rf5\") pod \"ingress-operator-5b745b69d9-cpsd4\" (UID: \"93be2498-3695-49d6-8bf4-ebea6010e925\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cpsd4" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.547330 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f55d8\" (UniqueName: \"kubernetes.io/projected/4589bc8e-2864-4d7e-a049-a3bf264bb997-kube-api-access-f55d8\") pod \"router-default-5444994796-tkgxl\" (UID: \"4589bc8e-2864-4d7e-a049-a3bf264bb997\") " pod="openshift-ingress/router-default-5444994796-tkgxl" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.547351 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzbtm\" (UniqueName: \"kubernetes.io/projected/95df99ca-9a0a-465c-96b6-968c0a01235a-kube-api-access-rzbtm\") pod \"multus-admission-controller-857f4d67dd-qmfqn\" (UID: \"95df99ca-9a0a-465c-96b6-968c0a01235a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-qmfqn" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.547390 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f64ln\" (UniqueName: \"kubernetes.io/projected/6303238b-f708-4f64-bacb-6a68e7a14425-kube-api-access-f64ln\") pod \"olm-operator-6b444d44fb-vrtsk\" (UID: \"6303238b-f708-4f64-bacb-6a68e7a14425\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vrtsk" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.547435 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6303238b-f708-4f64-bacb-6a68e7a14425-srv-cert\") pod \"olm-operator-6b444d44fb-vrtsk\" (UID: \"6303238b-f708-4f64-bacb-6a68e7a14425\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vrtsk" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.547651 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cb54914-08dc-4044-af38-9375669d5a2a-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-t22nt\" (UID: \"3cb54914-08dc-4044-af38-9375669d5a2a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t22nt" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.549026 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/93be2498-3695-49d6-8bf4-ebea6010e925-trusted-ca\") pod \"ingress-operator-5b745b69d9-cpsd4\" (UID: \"93be2498-3695-49d6-8bf4-ebea6010e925\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cpsd4" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.550039 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4589bc8e-2864-4d7e-a049-a3bf264bb997-default-certificate\") pod 
\"router-default-5444994796-tkgxl\" (UID: \"4589bc8e-2864-4d7e-a049-a3bf264bb997\") " pod="openshift-ingress/router-default-5444994796-tkgxl" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.550800 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.551610 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3cb54914-08dc-4044-af38-9375669d5a2a-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-t22nt\" (UID: \"3cb54914-08dc-4044-af38-9375669d5a2a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t22nt" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.553633 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/93be2498-3695-49d6-8bf4-ebea6010e925-metrics-tls\") pod \"ingress-operator-5b745b69d9-cpsd4\" (UID: \"93be2498-3695-49d6-8bf4-ebea6010e925\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cpsd4" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.555098 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4589bc8e-2864-4d7e-a049-a3bf264bb997-stats-auth\") pod \"router-default-5444994796-tkgxl\" (UID: \"4589bc8e-2864-4d7e-a049-a3bf264bb997\") " pod="openshift-ingress/router-default-5444994796-tkgxl" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.560194 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4589bc8e-2864-4d7e-a049-a3bf264bb997-metrics-certs\") pod \"router-default-5444994796-tkgxl\" (UID: \"4589bc8e-2864-4d7e-a049-a3bf264bb997\") " pod="openshift-ingress/router-default-5444994796-tkgxl" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.570856 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.579000 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4589bc8e-2864-4d7e-a049-a3bf264bb997-service-ca-bundle\") pod \"router-default-5444994796-tkgxl\" (UID: \"4589bc8e-2864-4d7e-a049-a3bf264bb997\") " pod="openshift-ingress/router-default-5444994796-tkgxl" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.591474 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.610864 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.630478 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.650883 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.671356 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.691609 4904 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.711195 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.730229 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.750916 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.770150 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.791269 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.810541 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.830920 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.858609 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.871752 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.892309 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.932189 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.951364 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.972695 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 05 20:13:57 crc kubenswrapper[4904]: I1205 20:13:57.998567 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.011193 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.032492 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.051050 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.071183 4904 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.091922 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.110867 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6303238b-f708-4f64-bacb-6a68e7a14425-profile-collector-cert\") pod \"olm-operator-6b444d44fb-vrtsk\" (UID: \"6303238b-f708-4f64-bacb-6a68e7a14425\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vrtsk" Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.111243 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.131375 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.152034 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.170575 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.190962 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.211628 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.231878 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.251179 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.261972 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6303238b-f708-4f64-bacb-6a68e7a14425-srv-cert\") pod \"olm-operator-6b444d44fb-vrtsk\" (UID: \"6303238b-f708-4f64-bacb-6a68e7a14425\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vrtsk" Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.271316 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.282442 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/95df99ca-9a0a-465c-96b6-968c0a01235a-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-qmfqn\" (UID: \"95df99ca-9a0a-465c-96b6-968c0a01235a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-qmfqn" Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.291139 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.311843 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 05 20:13:58 crc kubenswrapper[4904]: 
I1205 20:13:58.331327 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.351902 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.370036 4904 request.go:700] Waited for 1.002926989s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/secrets?fieldSelector=metadata.name%3Dopenshift-kube-scheduler-operator-dockercfg-qt55r&limit=500&resourceVersion=0 Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.372050 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.391357 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.411433 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.430821 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.451042 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.472033 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.491095 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.511626 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.531181 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 05 20:13:58 crc kubenswrapper[4904]: E1205 20:13:58.547589 4904 secret.go:188] Couldn't get secret openshift-kube-controller-manager-operator/kube-controller-manager-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 05 20:13:58 crc kubenswrapper[4904]: E1205 20:13:58.547626 4904 configmap.go:193] Couldn't get configMap openshift-kube-controller-manager-operator/kube-controller-manager-operator-config: failed to sync configmap cache: timed out waiting for the condition Dec 05 20:13:58 crc kubenswrapper[4904]: E1205 20:13:58.547678 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bd0c7fa-69da-45b6-8008-8fc080ca4ba6-serving-cert podName:6bd0c7fa-69da-45b6-8008-8fc080ca4ba6 nodeName:}" failed. No retries permitted until 2025-12-05 20:13:59.047655094 +0000 UTC m=+137.858871203 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/6bd0c7fa-69da-45b6-8008-8fc080ca4ba6-serving-cert") pod "kube-controller-manager-operator-78b949d7b-jnrj8" (UID: "6bd0c7fa-69da-45b6-8008-8fc080ca4ba6") : failed to sync secret cache: timed out waiting for the condition Dec 05 20:13:58 crc kubenswrapper[4904]: E1205 20:13:58.547723 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6bd0c7fa-69da-45b6-8008-8fc080ca4ba6-config podName:6bd0c7fa-69da-45b6-8008-8fc080ca4ba6 nodeName:}" failed. No retries permitted until 2025-12-05 20:13:59.047697475 +0000 UTC m=+137.858913675 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/6bd0c7fa-69da-45b6-8008-8fc080ca4ba6-config") pod "kube-controller-manager-operator-78b949d7b-jnrj8" (UID: "6bd0c7fa-69da-45b6-8008-8fc080ca4ba6") : failed to sync configmap cache: timed out waiting for the condition Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.551221 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.571626 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.590848 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.611176 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.630937 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.651445 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.692494 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.706099 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.711239 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.731748 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.752162 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.771655 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.802641 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.811039 4904 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.830254 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.851401 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.871110 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.890954 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.911446 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.932382 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.951653 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.970858 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Dec 05 20:13:58 crc kubenswrapper[4904]: I1205 20:13:58.990790 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.011738 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.051092 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.068996 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bd0c7fa-69da-45b6-8008-8fc080ca4ba6-config\") pod \"kube-controller-manager-operator-78b949d7b-jnrj8\" (UID: \"6bd0c7fa-69da-45b6-8008-8fc080ca4ba6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jnrj8"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.069372 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bd0c7fa-69da-45b6-8008-8fc080ca4ba6-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-jnrj8\" (UID: \"6bd0c7fa-69da-45b6-8008-8fc080ca4ba6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jnrj8"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.070414 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bd0c7fa-69da-45b6-8008-8fc080ca4ba6-config\") pod \"kube-controller-manager-operator-78b949d7b-jnrj8\" (UID: \"6bd0c7fa-69da-45b6-8008-8fc080ca4ba6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jnrj8"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.071049 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.074329 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bd0c7fa-69da-45b6-8008-8fc080ca4ba6-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-jnrj8\" (UID: \"6bd0c7fa-69da-45b6-8008-8fc080ca4ba6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jnrj8"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.091717 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.110769 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.131416 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.152235 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.172251 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.217954 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9mrn\" (UniqueName: \"kubernetes.io/projected/214b71b9-85d3-4be2-be71-311e9097f89b-kube-api-access-h9mrn\") pod \"dns-operator-744455d44c-484jg\" (UID: \"214b71b9-85d3-4be2-be71-311e9097f89b\") " pod="openshift-dns-operator/dns-operator-744455d44c-484jg"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.239618 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbjcj\" (UniqueName: \"kubernetes.io/projected/6f0af025-acfb-4cfa-b413-6159067b8269-kube-api-access-wbjcj\") pod \"etcd-operator-b45778765-8p482\" (UID: \"6f0af025-acfb-4cfa-b413-6159067b8269\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8p482"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.259970 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qntm8\" (UniqueName: \"kubernetes.io/projected/9b9f12dc-0021-4479-9b7e-60da4e3f27b0-kube-api-access-qntm8\") pod \"machine-approver-56656f9798-jz8jl\" (UID: \"9b9f12dc-0021-4479-9b7e-60da4e3f27b0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jz8jl"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.280641 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhnpg\" (UniqueName: \"kubernetes.io/projected/eb4b4c10-971e-4766-b632-9f710ec547a6-kube-api-access-fhnpg\") pod \"route-controller-manager-6576b87f9c-j9gnc\" (UID: \"eb4b4c10-971e-4766-b632-9f710ec547a6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j9gnc"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.297895 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcwfp\" (UniqueName: \"kubernetes.io/projected/28306b3f-ee7a-48be-905b-3ba0e6d7ef74-kube-api-access-wcwfp\") pod \"openshift-controller-manager-operator-756b6f6bc6-fmxl9\" (UID: \"28306b3f-ee7a-48be-905b-3ba0e6d7ef74\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fmxl9"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.308935 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dm6j\" (UniqueName: \"kubernetes.io/projected/dd1d7a59-ae42-455f-9617-9fa7eb152ce9-kube-api-access-9dm6j\") pod \"console-operator-58897d9998-4bbwg\" (UID: \"dd1d7a59-ae42-455f-9617-9fa7eb152ce9\") " pod="openshift-console-operator/console-operator-58897d9998-4bbwg"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.329707 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjvzp\" (UniqueName: \"kubernetes.io/projected/1d4d3dc1-19e8-4648-907c-ede5dd5e107e-kube-api-access-jjvzp\") pod \"apiserver-7bbb656c7d-vw7kx\" (UID: \"1d4d3dc1-19e8-4648-907c-ede5dd5e107e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vw7kx"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.346321 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckplz\" (UniqueName: \"kubernetes.io/projected/b02e39c5-31b4-4444-a500-cd7cbe327bec-kube-api-access-ckplz\") pod \"console-f9d7485db-pdnb5\" (UID: \"b02e39c5-31b4-4444-a500-cd7cbe327bec\") " pod="openshift-console/console-f9d7485db-pdnb5"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.374552 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb68w\" (UniqueName: \"kubernetes.io/projected/c6c9df4a-394c-4a6c-9132-5fefe0ed672d-kube-api-access-zb68w\") pod \"apiserver-76f77b778f-bvmh2\" (UID: \"c6c9df4a-394c-4a6c-9132-5fefe0ed672d\") " pod="openshift-apiserver/apiserver-76f77b778f-bvmh2"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.389839 4904 request.go:700] Waited for 1.933615396s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-config-operator/serviceaccounts/openshift-config-operator/token
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.398201 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7kc7\" (UniqueName: \"kubernetes.io/projected/9032a5cf-b023-4e91-bc39-52fdd93472ca-kube-api-access-j7kc7\") pod \"cluster-samples-operator-665b6dd947-9rm4d\" (UID: \"9032a5cf-b023-4e91-bc39-52fdd93472ca\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9rm4d"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.403617 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-8p482"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.413704 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfkvt\" (UniqueName: \"kubernetes.io/projected/954b7859-3b75-48f2-b1ed-d70932da35ba-kube-api-access-cfkvt\") pod \"openshift-config-operator-7777fb866f-sbrpp\" (UID: \"954b7859-3b75-48f2-b1ed-d70932da35ba\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-sbrpp"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.426476 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcw4c\" (UniqueName: \"kubernetes.io/projected/834a438d-9f7d-4707-b790-8ce136081f7c-kube-api-access-gcw4c\") pod \"controller-manager-879f6c89f-cncqb\" (UID: \"834a438d-9f7d-4707-b790-8ce136081f7c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cncqb"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.449456 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmpq6\" (UniqueName: \"kubernetes.io/projected/11ca5edb-7664-4e63-a9e8-46f270623ad2-kube-api-access-rmpq6\") pod \"machine-api-operator-5694c8668f-l928g\" (UID: \"11ca5edb-7664-4e63-a9e8-46f270623ad2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-l928g"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.450709 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j9gnc"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.451196 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.467014 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-4bbwg"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.476748 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.478310 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-484jg"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.491874 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-pdnb5"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.491889 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.504373 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-sbrpp"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.511372 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fmxl9"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.517564 4904 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.521857 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vw7kx"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.531207 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.531372 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jz8jl"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.537645 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-bvmh2"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.551486 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.596661 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6bd0c7fa-69da-45b6-8008-8fc080ca4ba6-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-jnrj8\" (UID: \"6bd0c7fa-69da-45b6-8008-8fc080ca4ba6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jnrj8"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.612819 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5bpx\" (UniqueName: \"kubernetes.io/projected/67c5324d-bb68-4bbc-a0f6-e0452d6f4155-kube-api-access-z5bpx\") pod \"migrator-59844c95c7-fzxff\" (UID: \"67c5324d-bb68-4bbc-a0f6-e0452d6f4155\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fzxff"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.625426 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-8p482"]
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.631507 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/93be2498-3695-49d6-8bf4-ebea6010e925-bound-sa-token\") pod \"ingress-operator-5b745b69d9-cpsd4\" (UID: \"93be2498-3695-49d6-8bf4-ebea6010e925\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cpsd4"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.633932 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-l928g"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.647972 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln9q9\" (UniqueName: \"kubernetes.io/projected/3cb54914-08dc-4044-af38-9375669d5a2a-kube-api-access-ln9q9\") pod \"kube-storage-version-migrator-operator-b67b599dd-t22nt\" (UID: \"3cb54914-08dc-4044-af38-9375669d5a2a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t22nt"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.657434 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-cncqb"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.665671 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5rf5\" (UniqueName: \"kubernetes.io/projected/93be2498-3695-49d6-8bf4-ebea6010e925-kube-api-access-w5rf5\") pod \"ingress-operator-5b745b69d9-cpsd4\" (UID: \"93be2498-3695-49d6-8bf4-ebea6010e925\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cpsd4"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.685984 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzbtm\" (UniqueName: \"kubernetes.io/projected/95df99ca-9a0a-465c-96b6-968c0a01235a-kube-api-access-rzbtm\") pod \"multus-admission-controller-857f4d67dd-qmfqn\" (UID: \"95df99ca-9a0a-465c-96b6-968c0a01235a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-qmfqn"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.686182 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9rm4d"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.715135 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f55d8\" (UniqueName: \"kubernetes.io/projected/4589bc8e-2864-4d7e-a049-a3bf264bb997-kube-api-access-f55d8\") pod \"router-default-5444994796-tkgxl\" (UID: \"4589bc8e-2864-4d7e-a049-a3bf264bb997\") " pod="openshift-ingress/router-default-5444994796-tkgxl"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.730148 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f64ln\" (UniqueName: \"kubernetes.io/projected/6303238b-f708-4f64-bacb-6a68e7a14425-kube-api-access-f64ln\") pod \"olm-operator-6b444d44fb-vrtsk\" (UID: \"6303238b-f708-4f64-bacb-6a68e7a14425\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vrtsk"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.751487 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vrtsk"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.751999 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-j9gnc"]
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.760251 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-qmfqn"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.779053 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7dr4\" (UniqueName: \"kubernetes.io/projected/efb530a0-f68c-4664-81bf-32871d3b8259-kube-api-access-d7dr4\") pod \"control-plane-machine-set-operator-78cbb6b69f-888hj\" (UID: \"efb530a0-f68c-4664-81bf-32871d3b8259\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-888hj"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.779106 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d8f5a146-90e5-47e2-a639-78a09eb00231-secret-volume\") pod \"collect-profiles-29416080-82bgg\" (UID: \"d8f5a146-90e5-47e2-a639-78a09eb00231\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-82bgg"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.779132 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d7b6266d-9700-4622-9878-86ccda978b95-proxy-tls\") pod \"machine-config-controller-84d6567774-d5xch\" (UID: \"d7b6266d-9700-4622-9878-86ccda978b95\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d5xch"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.779285 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c0a3c4fb-7c70-437e-8d05-7104879b59c9-auth-proxy-config\") pod \"machine-config-operator-74547568cd-479fl\" (UID: \"c0a3c4fb-7c70-437e-8d05-7104879b59c9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-479fl"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.779322 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-s87dr\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " pod="openshift-authentication/oauth-openshift-558db77b4-s87dr"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.779364 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcnb6\" (UniqueName: \"kubernetes.io/projected/64993a4f-0592-4a1e-93e5-def54e2868ac-kube-api-access-zcnb6\") pod \"packageserver-d55dfcdfc-zdlcn\" (UID: \"64993a4f-0592-4a1e-93e5-def54e2868ac\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zdlcn"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.779777 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5b1664bf-b83a-4582-8018-ec55e02a4068-ca-trust-extracted\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.779845 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsvkd\" (UniqueName: \"kubernetes.io/projected/6aa0efa0-bd09-4388-b42c-11550e28712e-kube-api-access-vsvkd\") pod \"marketplace-operator-79b997595-5srbq\" (UID: \"6aa0efa0-bd09-4388-b42c-11550e28712e\") " pod="openshift-marketplace/marketplace-operator-79b997595-5srbq"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.780029 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9c314cca-9c67-4653-aa53-cdef478fabc2-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-sbdkl\" (UID: \"9c314cca-9c67-4653-aa53-cdef478fabc2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sbdkl"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.780093 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-audit-policies\") pod \"oauth-openshift-558db77b4-s87dr\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " pod="openshift-authentication/oauth-openshift-558db77b4-s87dr"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.780124 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jlqp\" (UniqueName: \"kubernetes.io/projected/d5093d22-5bb5-4ec0-9b73-17d82af4afd7-kube-api-access-4jlqp\") pod \"package-server-manager-789f6589d5-cjzcx\" (UID: \"d5093d22-5bb5-4ec0-9b73-17d82af4afd7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cjzcx"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.780154 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-s87dr\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " pod="openshift-authentication/oauth-openshift-558db77b4-s87dr"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.780180 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5b1664bf-b83a-4582-8018-ec55e02a4068-bound-sa-token\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.780200 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d8f5a146-90e5-47e2-a639-78a09eb00231-config-volume\") pod \"collect-profiles-29416080-82bgg\" (UID: \"d8f5a146-90e5-47e2-a639-78a09eb00231\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-82bgg"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.780250 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/107b25d3-8a8d-4e82-9a93-d81251e8ff8d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wszp4\" (UID: \"107b25d3-8a8d-4e82-9a93-d81251e8ff8d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wszp4"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.780328 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqgpz\" (UniqueName: \"kubernetes.io/projected/cd549d80-f654-4dd1-888c-a0b02d0e3afc-kube-api-access-zqgpz\") pod \"service-ca-operator-777779d784-jw92f\" (UID: \"cd549d80-f654-4dd1-888c-a0b02d0e3afc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jw92f"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.780365 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/64993a4f-0592-4a1e-93e5-def54e2868ac-apiservice-cert\") pod \"packageserver-d55dfcdfc-zdlcn\" (UID: \"64993a4f-0592-4a1e-93e5-def54e2868ac\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zdlcn"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.780437 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tq5n\" (UniqueName: \"kubernetes.io/projected/fa95c779-64a5-468d-99b1-5f2307031409-kube-api-access-8tq5n\") pod \"openshift-apiserver-operator-796bbdcf4f-92b57\" (UID: \"fa95c779-64a5-468d-99b1-5f2307031409\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-92b57"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.780467 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q495h\" (UniqueName: \"kubernetes.io/projected/c0a3c4fb-7c70-437e-8d05-7104879b59c9-kube-api-access-q495h\") pod \"machine-config-operator-74547568cd-479fl\" (UID: \"c0a3c4fb-7c70-437e-8d05-7104879b59c9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-479fl"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.780954 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/107b25d3-8a8d-4e82-9a93-d81251e8ff8d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wszp4\" (UID: \"107b25d3-8a8d-4e82-9a93-d81251e8ff8d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wszp4"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.781124 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-s87dr\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " pod="openshift-authentication/oauth-openshift-558db77b4-s87dr"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.781180 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-s87dr\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " pod="openshift-authentication/oauth-openshift-558db77b4-s87dr"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.781212 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f11bb19b-830b-4041-88ad-72df248ff8d1-profile-collector-cert\") pod \"catalog-operator-68c6474976-jg5fm\" (UID: \"f11bb19b-830b-4041-88ad-72df248ff8d1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jg5fm"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.781319 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/efb530a0-f68c-4664-81bf-32871d3b8259-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-888hj\" (UID: \"efb530a0-f68c-4664-81bf-32871d3b8259\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-888hj"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.781403 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svtxh\" (UniqueName: \"kubernetes.io/projected/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-kube-api-access-svtxh\") pod \"oauth-openshift-558db77b4-s87dr\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " pod="openshift-authentication/oauth-openshift-558db77b4-s87dr"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.781449 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4702bfcc-3b70-4600-8e4f-9137016423a6-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-hpnlq\" (UID: \"4702bfcc-3b70-4600-8e4f-9137016423a6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hpnlq"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.781505 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.781529 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7wqc\" (UniqueName: \"kubernetes.io/projected/5b1664bf-b83a-4582-8018-ec55e02a4068-kube-api-access-s7wqc\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.781582 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-audit-dir\") pod \"oauth-openshift-558db77b4-s87dr\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " pod="openshift-authentication/oauth-openshift-558db77b4-s87dr"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.781682 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-s87dr\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " pod="openshift-authentication/oauth-openshift-558db77b4-s87dr"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.781726 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-s87dr\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " pod="openshift-authentication/oauth-openshift-558db77b4-s87dr"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.781742 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d61a48cd-8429-43a1-a7ef-50d57f4b397f-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-8x2kf\" (UID: \"d61a48cd-8429-43a1-a7ef-50d57f4b397f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8x2kf"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.781779 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-s87dr\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " pod="openshift-authentication/oauth-openshift-558db77b4-s87dr"
Dec 05 20:13:59 crc kubenswrapper[4904]: E1205 20:13:59.781982 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:00.281966093 +0000 UTC m=+139.093182282 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.782342 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4702bfcc-3b70-4600-8e4f-9137016423a6-config\") pod \"authentication-operator-69f744f599-hpnlq\" (UID: \"4702bfcc-3b70-4600-8e4f-9137016423a6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hpnlq"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.782421 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d61a48cd-8429-43a1-a7ef-50d57f4b397f-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-8x2kf\" (UID: \"d61a48cd-8429-43a1-a7ef-50d57f4b397f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8x2kf"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.782565 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6b6ea474-55bf-4540-a7b4-cdf769dd5d23-signing-key\") pod \"service-ca-9c57cc56f-zq5jf\" (UID: \"6b6ea474-55bf-4540-a7b4-cdf769dd5d23\") " pod="openshift-service-ca/service-ca-9c57cc56f-zq5jf"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.782588 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/107b25d3-8a8d-4e82-9a93-d81251e8ff8d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wszp4\" (UID: \"107b25d3-8a8d-4e82-9a93-d81251e8ff8d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wszp4"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.782629 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/64993a4f-0592-4a1e-93e5-def54e2868ac-tmpfs\") pod \"packageserver-d55dfcdfc-zdlcn\" (UID: \"64993a4f-0592-4a1e-93e5-def54e2868ac\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zdlcn"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.782667 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5b1664bf-b83a-4582-8018-ec55e02a4068-registry-tls\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.782705 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5b1664bf-b83a-4582-8018-ec55e02a4068-trusted-ca\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.782722 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd549d80-f654-4dd1-888c-a0b02d0e3afc-config\") pod \"service-ca-operator-777779d784-jw92f\" (UID: \"cd549d80-f654-4dd1-888c-a0b02d0e3afc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jw92f"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.782740 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6aa0efa0-bd09-4388-b42c-11550e28712e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-5srbq\" (UID: \"6aa0efa0-bd09-4388-b42c-11550e28712e\") " pod="openshift-marketplace/marketplace-operator-79b997595-5srbq"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.782777 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd549d80-f654-4dd1-888c-a0b02d0e3afc-serving-cert\") pod \"service-ca-operator-777779d784-jw92f\" (UID: \"cd549d80-f654-4dd1-888c-a0b02d0e3afc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jw92f"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.782803 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa95c779-64a5-468d-99b1-5f2307031409-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-92b57\" (UID: \"fa95c779-64a5-468d-99b1-5f2307031409\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-92b57"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.782819 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5b1664bf-b83a-4582-8018-ec55e02a4068-registry-certificates\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.782880 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4702bfcc-3b70-4600-8e4f-9137016423a6-serving-cert\") pod \"authentication-operator-69f744f599-hpnlq\" (UID: \"4702bfcc-3b70-4600-8e4f-9137016423a6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hpnlq"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.782941 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-s87dr\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " pod="openshift-authentication/oauth-openshift-558db77b4-s87dr"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.783114 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hltr7\" (UniqueName: \"kubernetes.io/projected/d8f5a146-90e5-47e2-a639-78a09eb00231-kube-api-access-hltr7\") pod \"collect-profiles-29416080-82bgg\" (UID: \"d8f5a146-90e5-47e2-a639-78a09eb00231\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-82bgg"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.783212 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c0a3c4fb-7c70-437e-8d05-7104879b59c9-proxy-tls\") pod \"machine-config-operator-74547568cd-479fl\" (UID: \"c0a3c4fb-7c70-437e-8d05-7104879b59c9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-479fl"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.783655 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5b1664bf-b83a-4582-8018-ec55e02a4068-installation-pull-secrets\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.783683 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4702bfcc-3b70-4600-8e4f-9137016423a6-service-ca-bundle\") pod \"authentication-operator-69f744f599-hpnlq\" (UID: \"4702bfcc-3b70-4600-8e4f-9137016423a6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hpnlq"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.783701 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcgcm\" (UniqueName: \"kubernetes.io/projected/6b6ea474-55bf-4540-a7b4-cdf769dd5d23-kube-api-access-lcgcm\") pod \"service-ca-9c57cc56f-zq5jf\" (UID: \"6b6ea474-55bf-4540-a7b4-cdf769dd5d23\") " pod="openshift-service-ca/service-ca-9c57cc56f-zq5jf"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.783766 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bppmw\" (UniqueName: \"kubernetes.io/projected/d7b6266d-9700-4622-9878-86ccda978b95-kube-api-access-bppmw\") pod \"machine-config-controller-84d6567774-d5xch\" (UID: \"d7b6266d-9700-4622-9878-86ccda978b95\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d5xch"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.783784 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/64993a4f-0592-4a1e-93e5-def54e2868ac-webhook-cert\") pod \"packageserver-d55dfcdfc-zdlcn\" (UID: \"64993a4f-0592-4a1e-93e5-def54e2868ac\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zdlcn"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.783806 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-s87dr\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " pod="openshift-authentication/oauth-openshift-558db77b4-s87dr"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.783822 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh4xh\" (UniqueName: \"kubernetes.io/projected/f11bb19b-830b-4041-88ad-72df248ff8d1-kube-api-access-rh4xh\") pod \"catalog-operator-68c6474976-jg5fm\" (UID: \"f11bb19b-830b-4041-88ad-72df248ff8d1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jg5fm"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.783859 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6aa0efa0-bd09-4388-b42c-11550e28712e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-5srbq\" (UID: \"6aa0efa0-bd09-4388-b42c-11550e28712e\") " pod="openshift-marketplace/marketplace-operator-79b997595-5srbq"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.783876 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9c314cca-9c67-4653-aa53-cdef478fabc2-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-sbdkl\" (UID: \"9c314cca-9c67-4653-aa53-cdef478fabc2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sbdkl"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.783892 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2szvw\" (UniqueName: \"kubernetes.io/projected/9c314cca-9c67-4653-aa53-cdef478fabc2-kube-api-access-2szvw\") pod \"cluster-image-registry-operator-dc59b4c8b-sbdkl\" (UID: \"9c314cca-9c67-4653-aa53-cdef478fabc2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sbdkl"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.783907 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f11bb19b-830b-4041-88ad-72df248ff8d1-srv-cert\") pod \"catalog-operator-68c6474976-jg5fm\" (UID: \"f11bb19b-830b-4041-88ad-72df248ff8d1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jg5fm"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.784473 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5093d22-5bb5-4ec0-9b73-17d82af4afd7-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-cjzcx\" (UID: \"d5093d22-5bb5-4ec0-9b73-17d82af4afd7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cjzcx"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.784564 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa95c779-64a5-468d-99b1-5f2307031409-config\") pod \"openshift-apiserver-operator-796bbdcf4f-92b57\" (UID: \"fa95c779-64a5-468d-99b1-5f2307031409\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-92b57"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.784843 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/9c314cca-9c67-4653-aa53-cdef478fabc2-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-sbdkl\" (UID: \"9c314cca-9c67-4653-aa53-cdef478fabc2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sbdkl"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.784893 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d61a48cd-8429-43a1-a7ef-50d57f4b397f-config\") pod \"kube-apiserver-operator-766d6c64bb-8x2kf\" (UID: \"d61a48cd-8429-43a1-a7ef-50d57f4b397f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8x2kf"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.785453 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-s87dr\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " pod="openshift-authentication/oauth-openshift-558db77b4-s87dr"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.785623 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d7b6266d-9700-4622-9878-86ccda978b95-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-d5xch\" (UID: \"d7b6266d-9700-4622-9878-86ccda978b95\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d5xch"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.785657 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-s87dr\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " pod="openshift-authentication/oauth-openshift-558db77b4-s87dr"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.785720 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp6nq\" (UniqueName: \"kubernetes.io/projected/4702bfcc-3b70-4600-8e4f-9137016423a6-kube-api-access-bp6nq\") pod \"authentication-operator-69f744f599-hpnlq\" (UID: \"4702bfcc-3b70-4600-8e4f-9137016423a6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hpnlq"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.785742 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rwpq\" (UniqueName: \"kubernetes.io/projected/a70ee695-4cd0-4ad2-926f-4850e19e480f-kube-api-access-4rwpq\") pod \"downloads-7954f5f757-672dp\" (UID: \"a70ee695-4cd0-4ad2-926f-4850e19e480f\") " pod="openshift-console/downloads-7954f5f757-672dp"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.785765 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6b6ea474-55bf-4540-a7b4-cdf769dd5d23-signing-cabundle\") pod \"service-ca-9c57cc56f-zq5jf\" (UID: \"6b6ea474-55bf-4540-a7b4-cdf769dd5d23\") " pod="openshift-service-ca/service-ca-9c57cc56f-zq5jf"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.785799 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c0a3c4fb-7c70-437e-8d05-7104879b59c9-images\") pod \"machine-config-operator-74547568cd-479fl\" (UID: \"c0a3c4fb-7c70-437e-8d05-7104879b59c9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-479fl"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.810116 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fzxff"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.824394 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jnrj8"
Dec 05 20:13:59 crc kubenswrapper[4904]: W1205 20:13:59.864636 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb4b4c10_971e_4766_b632_9f710ec547a6.slice/crio-a8cd5364e3c549b9a1db98a9f1a16e750633bd970396714d9bd256a95a371d22 WatchSource:0}: Error finding container a8cd5364e3c549b9a1db98a9f1a16e750633bd970396714d9bd256a95a371d22: Status 404 returned error can't find the container with id a8cd5364e3c549b9a1db98a9f1a16e750633bd970396714d9bd256a95a371d22
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.881548 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cpsd4"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.896267 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.896630 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6b6ea474-55bf-4540-a7b4-cdf769dd5d23-signing-cabundle\") pod \"service-ca-9c57cc56f-zq5jf\" (UID: \"6b6ea474-55bf-4540-a7b4-cdf769dd5d23\") " pod="openshift-service-ca/service-ca-9c57cc56f-zq5jf"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.896679 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c0a3c4fb-7c70-437e-8d05-7104879b59c9-images\") pod \"machine-config-operator-74547568cd-479fl\" (UID: \"c0a3c4fb-7c70-437e-8d05-7104879b59c9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-479fl"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.896713 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7dr4\" (UniqueName: \"kubernetes.io/projected/efb530a0-f68c-4664-81bf-32871d3b8259-kube-api-access-d7dr4\") pod \"control-plane-machine-set-operator-78cbb6b69f-888hj\" (UID: \"efb530a0-f68c-4664-81bf-32871d3b8259\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-888hj"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.896742 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d8f5a146-90e5-47e2-a639-78a09eb00231-secret-volume\") pod \"collect-profiles-29416080-82bgg\" (UID: \"d8f5a146-90e5-47e2-a639-78a09eb00231\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-82bgg"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.896761 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d7b6266d-9700-4622-9878-86ccda978b95-proxy-tls\") pod \"machine-config-controller-84d6567774-d5xch\" (UID: \"d7b6266d-9700-4622-9878-86ccda978b95\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d5xch"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.896809 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fe343c2a-87f6-45d4-a91d-3f86b9b5029b-registration-dir\") pod \"csi-hostpathplugin-ptsz7\" (UID: \"fe343c2a-87f6-45d4-a91d-3f86b9b5029b\") " pod="hostpath-provisioner/csi-hostpathplugin-ptsz7"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.896828 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c0a3c4fb-7c70-437e-8d05-7104879b59c9-auth-proxy-config\") pod \"machine-config-operator-74547568cd-479fl\" (UID: \"c0a3c4fb-7c70-437e-8d05-7104879b59c9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-479fl"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.896845 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-s87dr\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " pod="openshift-authentication/oauth-openshift-558db77b4-s87dr"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.896886 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcnb6\" (UniqueName: \"kubernetes.io/projected/64993a4f-0592-4a1e-93e5-def54e2868ac-kube-api-access-zcnb6\") pod \"packageserver-d55dfcdfc-zdlcn\" (UID: \"64993a4f-0592-4a1e-93e5-def54e2868ac\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zdlcn"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.896912 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5b1664bf-b83a-4582-8018-ec55e02a4068-ca-trust-extracted\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.896935 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsvkd\" (UniqueName: \"kubernetes.io/projected/6aa0efa0-bd09-4388-b42c-11550e28712e-kube-api-access-vsvkd\") pod \"marketplace-operator-79b997595-5srbq\" (UID: \"6aa0efa0-bd09-4388-b42c-11550e28712e\") " pod="openshift-marketplace/marketplace-operator-79b997595-5srbq"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.896968 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxnwc\" (UniqueName: \"kubernetes.io/projected/a9a77bf3-5405-4051-abed-850371e2cfcc-kube-api-access-fxnwc\") pod \"dns-default-z6nb9\" (UID: \"a9a77bf3-5405-4051-abed-850371e2cfcc\") " pod="openshift-dns/dns-default-z6nb9"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.897040 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9c314cca-9c67-4653-aa53-cdef478fabc2-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-sbdkl\" (UID: \"9c314cca-9c67-4653-aa53-cdef478fabc2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sbdkl"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.897087 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-audit-policies\") pod \"oauth-openshift-558db77b4-s87dr\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " pod="openshift-authentication/oauth-openshift-558db77b4-s87dr"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.897113 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/fe343c2a-87f6-45d4-a91d-3f86b9b5029b-plugins-dir\") pod \"csi-hostpathplugin-ptsz7\" (UID: \"fe343c2a-87f6-45d4-a91d-3f86b9b5029b\") " pod="hostpath-provisioner/csi-hostpathplugin-ptsz7"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.897137 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jlqp\" (UniqueName: \"kubernetes.io/projected/d5093d22-5bb5-4ec0-9b73-17d82af4afd7-kube-api-access-4jlqp\") pod \"package-server-manager-789f6589d5-cjzcx\" (UID: \"d5093d22-5bb5-4ec0-9b73-17d82af4afd7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cjzcx"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.897167 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-s87dr\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " pod="openshift-authentication/oauth-openshift-558db77b4-s87dr"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.897191 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9a77bf3-5405-4051-abed-850371e2cfcc-metrics-tls\") pod \"dns-default-z6nb9\" (UID: \"a9a77bf3-5405-4051-abed-850371e2cfcc\") " pod="openshift-dns/dns-default-z6nb9"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.897235 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d8f5a146-90e5-47e2-a639-78a09eb00231-config-volume\") pod \"collect-profiles-29416080-82bgg\" (UID: \"d8f5a146-90e5-47e2-a639-78a09eb00231\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-82bgg"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.897274 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5b1664bf-b83a-4582-8018-ec55e02a4068-bound-sa-token\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.897304 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/107b25d3-8a8d-4e82-9a93-d81251e8ff8d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wszp4\" (UID: \"107b25d3-8a8d-4e82-9a93-d81251e8ff8d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wszp4"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.897331 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqgpz\" (UniqueName: \"kubernetes.io/projected/cd549d80-f654-4dd1-888c-a0b02d0e3afc-kube-api-access-zqgpz\") pod \"service-ca-operator-777779d784-jw92f\" (UID: \"cd549d80-f654-4dd1-888c-a0b02d0e3afc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jw92f"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.897357 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/64993a4f-0592-4a1e-93e5-def54e2868ac-apiservice-cert\") pod \"packageserver-d55dfcdfc-zdlcn\" (UID: \"64993a4f-0592-4a1e-93e5-def54e2868ac\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zdlcn"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.897386 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tq5n\" (UniqueName: \"kubernetes.io/projected/fa95c779-64a5-468d-99b1-5f2307031409-kube-api-access-8tq5n\") pod \"openshift-apiserver-operator-796bbdcf4f-92b57\" (UID: \"fa95c779-64a5-468d-99b1-5f2307031409\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-92b57"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.897437 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q495h\" (UniqueName: \"kubernetes.io/projected/c0a3c4fb-7c70-437e-8d05-7104879b59c9-kube-api-access-q495h\") pod \"machine-config-operator-74547568cd-479fl\" (UID: \"c0a3c4fb-7c70-437e-8d05-7104879b59c9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-479fl"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.897464 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hpvk\" (UniqueName: \"kubernetes.io/projected/0b51cbb6-b3e4-454b-a58e-086eca25cb7d-kube-api-access-6hpvk\") pod \"machine-config-server-brbpq\" (UID: \"0b51cbb6-b3e4-454b-a58e-086eca25cb7d\") " pod="openshift-machine-config-operator/machine-config-server-brbpq"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.897490 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fe343c2a-87f6-45d4-a91d-3f86b9b5029b-socket-dir\") pod \"csi-hostpathplugin-ptsz7\" (UID: \"fe343c2a-87f6-45d4-a91d-3f86b9b5029b\") " pod="hostpath-provisioner/csi-hostpathplugin-ptsz7"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.897565 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/107b25d3-8a8d-4e82-9a93-d81251e8ff8d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wszp4\" (UID: \"107b25d3-8a8d-4e82-9a93-d81251e8ff8d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wszp4"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.897599 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-s87dr\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " pod="openshift-authentication/oauth-openshift-558db77b4-s87dr"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.897627 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0b51cbb6-b3e4-454b-a58e-086eca25cb7d-node-bootstrap-token\") pod \"machine-config-server-brbpq\" (UID: \"0b51cbb6-b3e4-454b-a58e-086eca25cb7d\") " pod="openshift-machine-config-operator/machine-config-server-brbpq"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.897658 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-s87dr\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " pod="openshift-authentication/oauth-openshift-558db77b4-s87dr"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.897683 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f11bb19b-830b-4041-88ad-72df248ff8d1-profile-collector-cert\") pod \"catalog-operator-68c6474976-jg5fm\" (UID: \"f11bb19b-830b-4041-88ad-72df248ff8d1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jg5fm"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.897714 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/efb530a0-f68c-4664-81bf-32871d3b8259-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-888hj\" (UID: \"efb530a0-f68c-4664-81bf-32871d3b8259\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-888hj"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.897750 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svtxh\" (UniqueName: \"kubernetes.io/projected/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-kube-api-access-svtxh\") pod \"oauth-openshift-558db77b4-s87dr\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " pod="openshift-authentication/oauth-openshift-558db77b4-s87dr"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.897779 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/fe343c2a-87f6-45d4-a91d-3f86b9b5029b-csi-data-dir\") pod \"csi-hostpathplugin-ptsz7\" (UID: \"fe343c2a-87f6-45d4-a91d-3f86b9b5029b\") " pod="hostpath-provisioner/csi-hostpathplugin-ptsz7"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.897804 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4702bfcc-3b70-4600-8e4f-9137016423a6-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-hpnlq\" (UID: \"4702bfcc-3b70-4600-8e4f-9137016423a6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hpnlq"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.897834 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7wqc\" (UniqueName: \"kubernetes.io/projected/5b1664bf-b83a-4582-8018-ec55e02a4068-kube-api-access-s7wqc\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.897862 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-audit-dir\") pod \"oauth-openshift-558db77b4-s87dr\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " pod="openshift-authentication/oauth-openshift-558db77b4-s87dr"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.897892 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-s87dr\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " pod="openshift-authentication/oauth-openshift-558db77b4-s87dr"
Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.897970 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-s87dr\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\")
" pod="openshift-authentication/oauth-openshift-558db77b4-s87dr" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.897995 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d61a48cd-8429-43a1-a7ef-50d57f4b397f-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-8x2kf\" (UID: \"d61a48cd-8429-43a1-a7ef-50d57f4b397f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8x2kf" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.898025 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/fe343c2a-87f6-45d4-a91d-3f86b9b5029b-mountpoint-dir\") pod \"csi-hostpathplugin-ptsz7\" (UID: \"fe343c2a-87f6-45d4-a91d-3f86b9b5029b\") " pod="hostpath-provisioner/csi-hostpathplugin-ptsz7" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.898097 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-s87dr\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " pod="openshift-authentication/oauth-openshift-558db77b4-s87dr" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.898924 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6b6ea474-55bf-4540-a7b4-cdf769dd5d23-signing-cabundle\") pod \"service-ca-9c57cc56f-zq5jf\" (UID: \"6b6ea474-55bf-4540-a7b4-cdf769dd5d23\") " pod="openshift-service-ca/service-ca-9c57cc56f-zq5jf" Dec 05 20:13:59 crc kubenswrapper[4904]: E1205 20:13:59.905526 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:14:00.405480953 +0000 UTC m=+139.216697062 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.906185 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f11bb19b-830b-4041-88ad-72df248ff8d1-profile-collector-cert\") pod \"catalog-operator-68c6474976-jg5fm\" (UID: \"f11bb19b-830b-4041-88ad-72df248ff8d1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jg5fm" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.907271 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/64993a4f-0592-4a1e-93e5-def54e2868ac-apiservice-cert\") pod \"packageserver-d55dfcdfc-zdlcn\" (UID: \"64993a4f-0592-4a1e-93e5-def54e2868ac\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zdlcn" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.907621 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-audit-policies\") pod \"oauth-openshift-558db77b4-s87dr\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " pod="openshift-authentication/oauth-openshift-558db77b4-s87dr" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.907653 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-audit-dir\") pod \"oauth-openshift-558db77b4-s87dr\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " pod="openshift-authentication/oauth-openshift-558db77b4-s87dr" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.907944 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5b1664bf-b83a-4582-8018-ec55e02a4068-ca-trust-extracted\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.908130 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c0a3c4fb-7c70-437e-8d05-7104879b59c9-images\") pod \"machine-config-operator-74547568cd-479fl\" (UID: \"c0a3c4fb-7c70-437e-8d05-7104879b59c9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-479fl" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.908713 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d8f5a146-90e5-47e2-a639-78a09eb00231-secret-volume\") pod \"collect-profiles-29416080-82bgg\" (UID: \"d8f5a146-90e5-47e2-a639-78a09eb00231\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-82bgg" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.909386 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t22nt" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.910171 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-s87dr\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " pod="openshift-authentication/oauth-openshift-558db77b4-s87dr" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.911827 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d61a48cd-8429-43a1-a7ef-50d57f4b397f-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-8x2kf\" (UID: \"d61a48cd-8429-43a1-a7ef-50d57f4b397f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8x2kf" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.912161 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-s87dr\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " pod="openshift-authentication/oauth-openshift-558db77b4-s87dr" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.912318 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d8f5a146-90e5-47e2-a639-78a09eb00231-config-volume\") pod \"collect-profiles-29416080-82bgg\" (UID: \"d8f5a146-90e5-47e2-a639-78a09eb00231\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-82bgg" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.912502 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c0a3c4fb-7c70-437e-8d05-7104879b59c9-auth-proxy-config\") pod \"machine-config-operator-74547568cd-479fl\" (UID: \"c0a3c4fb-7c70-437e-8d05-7104879b59c9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-479fl" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.916019 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d7b6266d-9700-4622-9878-86ccda978b95-proxy-tls\") pod \"machine-config-controller-84d6567774-d5xch\" (UID: \"d7b6266d-9700-4622-9878-86ccda978b95\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d5xch" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.917374 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-s87dr\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " pod="openshift-authentication/oauth-openshift-558db77b4-s87dr" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.918048 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-655m2\" (UniqueName: \"kubernetes.io/projected/fe343c2a-87f6-45d4-a91d-3f86b9b5029b-kube-api-access-655m2\") pod \"csi-hostpathplugin-ptsz7\" (UID: \"fe343c2a-87f6-45d4-a91d-3f86b9b5029b\") " pod="hostpath-provisioner/csi-hostpathplugin-ptsz7" Dec 
05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.918153 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4702bfcc-3b70-4600-8e4f-9137016423a6-config\") pod \"authentication-operator-69f744f599-hpnlq\" (UID: \"4702bfcc-3b70-4600-8e4f-9137016423a6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hpnlq" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.918288 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a9a77bf3-5405-4051-abed-850371e2cfcc-config-volume\") pod \"dns-default-z6nb9\" (UID: \"a9a77bf3-5405-4051-abed-850371e2cfcc\") " pod="openshift-dns/dns-default-z6nb9" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.918329 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d61a48cd-8429-43a1-a7ef-50d57f4b397f-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-8x2kf\" (UID: \"d61a48cd-8429-43a1-a7ef-50d57f4b397f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8x2kf" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.918350 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/64993a4f-0592-4a1e-93e5-def54e2868ac-tmpfs\") pod \"packageserver-d55dfcdfc-zdlcn\" (UID: \"64993a4f-0592-4a1e-93e5-def54e2868ac\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zdlcn" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.918391 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6b6ea474-55bf-4540-a7b4-cdf769dd5d23-signing-key\") pod \"service-ca-9c57cc56f-zq5jf\" (UID: \"6b6ea474-55bf-4540-a7b4-cdf769dd5d23\") " pod="openshift-service-ca/service-ca-9c57cc56f-zq5jf" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.918416 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/107b25d3-8a8d-4e82-9a93-d81251e8ff8d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wszp4\" (UID: \"107b25d3-8a8d-4e82-9a93-d81251e8ff8d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wszp4" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.920648 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5b1664bf-b83a-4582-8018-ec55e02a4068-registry-tls\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.920688 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5b1664bf-b83a-4582-8018-ec55e02a4068-trusted-ca\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.920719 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd549d80-f654-4dd1-888c-a0b02d0e3afc-config\") pod 
\"service-ca-operator-777779d784-jw92f\" (UID: \"cd549d80-f654-4dd1-888c-a0b02d0e3afc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jw92f" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.920771 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6aa0efa0-bd09-4388-b42c-11550e28712e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-5srbq\" (UID: \"6aa0efa0-bd09-4388-b42c-11550e28712e\") " pod="openshift-marketplace/marketplace-operator-79b997595-5srbq" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.920797 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd549d80-f654-4dd1-888c-a0b02d0e3afc-serving-cert\") pod \"service-ca-operator-777779d784-jw92f\" (UID: \"cd549d80-f654-4dd1-888c-a0b02d0e3afc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jw92f" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.920829 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa95c779-64a5-468d-99b1-5f2307031409-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-92b57\" (UID: \"fa95c779-64a5-468d-99b1-5f2307031409\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-92b57" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.920872 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5b1664bf-b83a-4582-8018-ec55e02a4068-registry-certificates\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.921311 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/64993a4f-0592-4a1e-93e5-def54e2868ac-tmpfs\") pod \"packageserver-d55dfcdfc-zdlcn\" (UID: \"64993a4f-0592-4a1e-93e5-def54e2868ac\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zdlcn" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.922105 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/107b25d3-8a8d-4e82-9a93-d81251e8ff8d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wszp4\" (UID: \"107b25d3-8a8d-4e82-9a93-d81251e8ff8d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wszp4" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.922462 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-tkgxl" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.923618 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4702bfcc-3b70-4600-8e4f-9137016423a6-serving-cert\") pod \"authentication-operator-69f744f599-hpnlq\" (UID: \"4702bfcc-3b70-4600-8e4f-9137016423a6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hpnlq" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.923660 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/0b51cbb6-b3e4-454b-a58e-086eca25cb7d-certs\") pod \"machine-config-server-brbpq\" (UID: \"0b51cbb6-b3e4-454b-a58e-086eca25cb7d\") " pod="openshift-machine-config-operator/machine-config-server-brbpq" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.923700 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-s87dr\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " pod="openshift-authentication/oauth-openshift-558db77b4-s87dr" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.923733 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hltr7\" (UniqueName: \"kubernetes.io/projected/d8f5a146-90e5-47e2-a639-78a09eb00231-kube-api-access-hltr7\") pod \"collect-profiles-29416080-82bgg\" (UID: \"d8f5a146-90e5-47e2-a639-78a09eb00231\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-82bgg" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.923759 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c0a3c4fb-7c70-437e-8d05-7104879b59c9-proxy-tls\") pod \"machine-config-operator-74547568cd-479fl\" (UID: \"c0a3c4fb-7c70-437e-8d05-7104879b59c9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-479fl" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.923810 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcgcm\" (UniqueName: \"kubernetes.io/projected/6b6ea474-55bf-4540-a7b4-cdf769dd5d23-kube-api-access-lcgcm\") pod \"service-ca-9c57cc56f-zq5jf\" (UID: \"6b6ea474-55bf-4540-a7b4-cdf769dd5d23\") " pod="openshift-service-ca/service-ca-9c57cc56f-zq5jf" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.923851 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5b1664bf-b83a-4582-8018-ec55e02a4068-installation-pull-secrets\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.923877 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4702bfcc-3b70-4600-8e4f-9137016423a6-service-ca-bundle\") pod \"authentication-operator-69f744f599-hpnlq\" (UID: \"4702bfcc-3b70-4600-8e4f-9137016423a6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hpnlq" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 
20:13:59.923906 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bppmw\" (UniqueName: \"kubernetes.io/projected/d7b6266d-9700-4622-9878-86ccda978b95-kube-api-access-bppmw\") pod \"machine-config-controller-84d6567774-d5xch\" (UID: \"d7b6266d-9700-4622-9878-86ccda978b95\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d5xch" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.923969 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/64993a4f-0592-4a1e-93e5-def54e2868ac-webhook-cert\") pod \"packageserver-d55dfcdfc-zdlcn\" (UID: \"64993a4f-0592-4a1e-93e5-def54e2868ac\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zdlcn" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.923995 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-s87dr\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " pod="openshift-authentication/oauth-openshift-558db77b4-s87dr" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.924022 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rh4xh\" (UniqueName: \"kubernetes.io/projected/f11bb19b-830b-4041-88ad-72df248ff8d1-kube-api-access-rh4xh\") pod \"catalog-operator-68c6474976-jg5fm\" (UID: \"f11bb19b-830b-4041-88ad-72df248ff8d1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jg5fm" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.924049 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2szvw\" (UniqueName: \"kubernetes.io/projected/9c314cca-9c67-4653-aa53-cdef478fabc2-kube-api-access-2szvw\") pod \"cluster-image-registry-operator-dc59b4c8b-sbdkl\" (UID: \"9c314cca-9c67-4653-aa53-cdef478fabc2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sbdkl" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.924579 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f11bb19b-830b-4041-88ad-72df248ff8d1-srv-cert\") pod \"catalog-operator-68c6474976-jg5fm\" (UID: \"f11bb19b-830b-4041-88ad-72df248ff8d1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jg5fm" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.924612 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6aa0efa0-bd09-4388-b42c-11550e28712e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-5srbq\" (UID: \"6aa0efa0-bd09-4388-b42c-11550e28712e\") " pod="openshift-marketplace/marketplace-operator-79b997595-5srbq" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.924638 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9c314cca-9c67-4653-aa53-cdef478fabc2-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-sbdkl\" (UID: \"9c314cca-9c67-4653-aa53-cdef478fabc2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sbdkl" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.924722 4904 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5093d22-5bb5-4ec0-9b73-17d82af4afd7-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-cjzcx\" (UID: \"d5093d22-5bb5-4ec0-9b73-17d82af4afd7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cjzcx" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.924752 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpmt6\" (UniqueName: \"kubernetes.io/projected/86e5e62d-13a5-42bf-91ad-759b8be371f8-kube-api-access-rpmt6\") pod \"ingress-canary-c67m2\" (UID: \"86e5e62d-13a5-42bf-91ad-759b8be371f8\") " pod="openshift-ingress-canary/ingress-canary-c67m2" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.924834 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa95c779-64a5-468d-99b1-5f2307031409-config\") pod \"openshift-apiserver-operator-796bbdcf4f-92b57\" (UID: \"fa95c779-64a5-468d-99b1-5f2307031409\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-92b57" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.924856 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/9c314cca-9c67-4653-aa53-cdef478fabc2-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-sbdkl\" (UID: \"9c314cca-9c67-4653-aa53-cdef478fabc2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sbdkl" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.924873 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d61a48cd-8429-43a1-a7ef-50d57f4b397f-config\") pod \"kube-apiserver-operator-766d6c64bb-8x2kf\" (UID: \"d61a48cd-8429-43a1-a7ef-50d57f4b397f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8x2kf" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.925101 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-s87dr\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " pod="openshift-authentication/oauth-openshift-558db77b4-s87dr" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.925129 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d7b6266d-9700-4622-9878-86ccda978b95-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-d5xch\" (UID: \"d7b6266d-9700-4622-9878-86ccda978b95\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d5xch" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.925149 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-s87dr\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " pod="openshift-authentication/oauth-openshift-558db77b4-s87dr" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.925167 4904 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bp6nq\" (UniqueName: \"kubernetes.io/projected/4702bfcc-3b70-4600-8e4f-9137016423a6-kube-api-access-bp6nq\") pod \"authentication-operator-69f744f599-hpnlq\" (UID: \"4702bfcc-3b70-4600-8e4f-9137016423a6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hpnlq" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.925186 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rwpq\" (UniqueName: \"kubernetes.io/projected/a70ee695-4cd0-4ad2-926f-4850e19e480f-kube-api-access-4rwpq\") pod \"downloads-7954f5f757-672dp\" (UID: \"a70ee695-4cd0-4ad2-926f-4850e19e480f\") " pod="openshift-console/downloads-7954f5f757-672dp" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.925205 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/86e5e62d-13a5-42bf-91ad-759b8be371f8-cert\") pod \"ingress-canary-c67m2\" (UID: \"86e5e62d-13a5-42bf-91ad-759b8be371f8\") " pod="openshift-ingress-canary/ingress-canary-c67m2" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.926143 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa95c779-64a5-468d-99b1-5f2307031409-config\") pod \"openshift-apiserver-operator-796bbdcf4f-92b57\" (UID: \"fa95c779-64a5-468d-99b1-5f2307031409\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-92b57" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.926166 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5b1664bf-b83a-4582-8018-ec55e02a4068-registry-certificates\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.926653 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d61a48cd-8429-43a1-a7ef-50d57f4b397f-config\") pod \"kube-apiserver-operator-766d6c64bb-8x2kf\" (UID: \"d61a48cd-8429-43a1-a7ef-50d57f4b397f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8x2kf" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.927459 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9c314cca-9c67-4653-aa53-cdef478fabc2-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-sbdkl\" (UID: \"9c314cca-9c67-4653-aa53-cdef478fabc2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sbdkl" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.927908 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-s87dr\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " pod="openshift-authentication/oauth-openshift-558db77b4-s87dr" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.928816 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6b6ea474-55bf-4540-a7b4-cdf769dd5d23-signing-key\") pod 
\"service-ca-9c57cc56f-zq5jf\" (UID: \"6b6ea474-55bf-4540-a7b4-cdf769dd5d23\") " pod="openshift-service-ca/service-ca-9c57cc56f-zq5jf" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.935544 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/107b25d3-8a8d-4e82-9a93-d81251e8ff8d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wszp4\" (UID: \"107b25d3-8a8d-4e82-9a93-d81251e8ff8d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wszp4" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.945441 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5093d22-5bb5-4ec0-9b73-17d82af4afd7-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-cjzcx\" (UID: \"d5093d22-5bb5-4ec0-9b73-17d82af4afd7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cjzcx" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.947126 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsvkd\" (UniqueName: \"kubernetes.io/projected/6aa0efa0-bd09-4388-b42c-11550e28712e-kube-api-access-vsvkd\") pod \"marketplace-operator-79b997595-5srbq\" (UID: \"6aa0efa0-bd09-4388-b42c-11550e28712e\") " pod="openshift-marketplace/marketplace-operator-79b997595-5srbq" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.947290 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d7b6266d-9700-4622-9878-86ccda978b95-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-d5xch\" (UID: \"d7b6266d-9700-4622-9878-86ccda978b95\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d5xch" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.947357 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-s87dr\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " pod="openshift-authentication/oauth-openshift-558db77b4-s87dr" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.947935 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-s87dr\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " pod="openshift-authentication/oauth-openshift-558db77b4-s87dr" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.948343 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/9c314cca-9c67-4653-aa53-cdef478fabc2-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-sbdkl\" (UID: \"9c314cca-9c67-4653-aa53-cdef478fabc2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sbdkl" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.949558 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-system-router-certs\") pod 
\"oauth-openshift-558db77b4-s87dr\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " pod="openshift-authentication/oauth-openshift-558db77b4-s87dr" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.950546 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd549d80-f654-4dd1-888c-a0b02d0e3afc-config\") pod \"service-ca-operator-777779d784-jw92f\" (UID: \"cd549d80-f654-4dd1-888c-a0b02d0e3afc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jw92f" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.951678 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6aa0efa0-bd09-4388-b42c-11550e28712e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-5srbq\" (UID: \"6aa0efa0-bd09-4388-b42c-11550e28712e\") " pod="openshift-marketplace/marketplace-operator-79b997595-5srbq" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.951730 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-s87dr\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " pod="openshift-authentication/oauth-openshift-558db77b4-s87dr" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.951835 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5b1664bf-b83a-4582-8018-ec55e02a4068-trusted-ca\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.952367 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-s87dr\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " pod="openshift-authentication/oauth-openshift-558db77b4-s87dr" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.952860 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c0a3c4fb-7c70-437e-8d05-7104879b59c9-proxy-tls\") pod \"machine-config-operator-74547568cd-479fl\" (UID: \"c0a3c4fb-7c70-437e-8d05-7104879b59c9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-479fl" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.953216 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/efb530a0-f68c-4664-81bf-32871d3b8259-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-888hj\" (UID: \"efb530a0-f68c-4664-81bf-32871d3b8259\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-888hj" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.953839 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5b1664bf-b83a-4582-8018-ec55e02a4068-registry-tls\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:13:59 crc 
kubenswrapper[4904]: I1205 20:13:59.959492 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd549d80-f654-4dd1-888c-a0b02d0e3afc-serving-cert\") pod \"service-ca-operator-777779d784-jw92f\" (UID: \"cd549d80-f654-4dd1-888c-a0b02d0e3afc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jw92f" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.961539 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6aa0efa0-bd09-4388-b42c-11550e28712e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-5srbq\" (UID: \"6aa0efa0-bd09-4388-b42c-11550e28712e\") " pod="openshift-marketplace/marketplace-operator-79b997595-5srbq" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.967845 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f11bb19b-830b-4041-88ad-72df248ff8d1-srv-cert\") pod \"catalog-operator-68c6474976-jg5fm\" (UID: \"f11bb19b-830b-4041-88ad-72df248ff8d1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jg5fm" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.968901 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jlqp\" (UniqueName: \"kubernetes.io/projected/d5093d22-5bb5-4ec0-9b73-17d82af4afd7-kube-api-access-4jlqp\") pod \"package-server-manager-789f6589d5-cjzcx\" (UID: \"d5093d22-5bb5-4ec0-9b73-17d82af4afd7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cjzcx" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.968207 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/64993a4f-0592-4a1e-93e5-def54e2868ac-webhook-cert\") pod \"packageserver-d55dfcdfc-zdlcn\" (UID: \"64993a4f-0592-4a1e-93e5-def54e2868ac\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zdlcn" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.975506 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa95c779-64a5-468d-99b1-5f2307031409-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-92b57\" (UID: \"fa95c779-64a5-468d-99b1-5f2307031409\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-92b57" Dec 05 20:13:59 crc kubenswrapper[4904]: I1205 20:13:59.978517 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5b1664bf-b83a-4582-8018-ec55e02a4068-installation-pull-secrets\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.019418 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-5srbq" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.023535 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q495h\" (UniqueName: \"kubernetes.io/projected/c0a3c4fb-7c70-437e-8d05-7104879b59c9-kube-api-access-q495h\") pod \"machine-config-operator-74547568cd-479fl\" (UID: \"c0a3c4fb-7c70-437e-8d05-7104879b59c9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-479fl" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.027550 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/86e5e62d-13a5-42bf-91ad-759b8be371f8-cert\") pod \"ingress-canary-c67m2\" (UID: \"86e5e62d-13a5-42bf-91ad-759b8be371f8\") " pod="openshift-ingress-canary/ingress-canary-c67m2" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.027607 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fe343c2a-87f6-45d4-a91d-3f86b9b5029b-registration-dir\") pod \"csi-hostpathplugin-ptsz7\" (UID: \"fe343c2a-87f6-45d4-a91d-3f86b9b5029b\") " pod="hostpath-provisioner/csi-hostpathplugin-ptsz7" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.027647 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxnwc\" (UniqueName: \"kubernetes.io/projected/a9a77bf3-5405-4051-abed-850371e2cfcc-kube-api-access-fxnwc\") pod \"dns-default-z6nb9\" (UID: \"a9a77bf3-5405-4051-abed-850371e2cfcc\") " pod="openshift-dns/dns-default-z6nb9" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.027679 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/fe343c2a-87f6-45d4-a91d-3f86b9b5029b-plugins-dir\") pod \"csi-hostpathplugin-ptsz7\" (UID: \"fe343c2a-87f6-45d4-a91d-3f86b9b5029b\") " pod="hostpath-provisioner/csi-hostpathplugin-ptsz7" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.027702 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9a77bf3-5405-4051-abed-850371e2cfcc-metrics-tls\") pod \"dns-default-z6nb9\" (UID: \"a9a77bf3-5405-4051-abed-850371e2cfcc\") " pod="openshift-dns/dns-default-z6nb9" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.027747 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hpvk\" (UniqueName: \"kubernetes.io/projected/0b51cbb6-b3e4-454b-a58e-086eca25cb7d-kube-api-access-6hpvk\") pod \"machine-config-server-brbpq\" (UID: \"0b51cbb6-b3e4-454b-a58e-086eca25cb7d\") " pod="openshift-machine-config-operator/machine-config-server-brbpq" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.027771 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fe343c2a-87f6-45d4-a91d-3f86b9b5029b-socket-dir\") pod \"csi-hostpathplugin-ptsz7\" (UID: \"fe343c2a-87f6-45d4-a91d-3f86b9b5029b\") " pod="hostpath-provisioner/csi-hostpathplugin-ptsz7" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.027804 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0b51cbb6-b3e4-454b-a58e-086eca25cb7d-node-bootstrap-token\") pod \"machine-config-server-brbpq\" (UID: 
\"0b51cbb6-b3e4-454b-a58e-086eca25cb7d\") " pod="openshift-machine-config-operator/machine-config-server-brbpq" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.027835 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/fe343c2a-87f6-45d4-a91d-3f86b9b5029b-csi-data-dir\") pod \"csi-hostpathplugin-ptsz7\" (UID: \"fe343c2a-87f6-45d4-a91d-3f86b9b5029b\") " pod="hostpath-provisioner/csi-hostpathplugin-ptsz7" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.027870 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.027914 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/fe343c2a-87f6-45d4-a91d-3f86b9b5029b-mountpoint-dir\") pod \"csi-hostpathplugin-ptsz7\" (UID: \"fe343c2a-87f6-45d4-a91d-3f86b9b5029b\") " pod="hostpath-provisioner/csi-hostpathplugin-ptsz7" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.027938 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-655m2\" (UniqueName: \"kubernetes.io/projected/fe343c2a-87f6-45d4-a91d-3f86b9b5029b-kube-api-access-655m2\") pod \"csi-hostpathplugin-ptsz7\" (UID: \"fe343c2a-87f6-45d4-a91d-3f86b9b5029b\") " pod="hostpath-provisioner/csi-hostpathplugin-ptsz7" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.027979 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a9a77bf3-5405-4051-abed-850371e2cfcc-config-volume\") pod \"dns-default-z6nb9\" (UID: \"a9a77bf3-5405-4051-abed-850371e2cfcc\") " pod="openshift-dns/dns-default-z6nb9" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.028044 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/0b51cbb6-b3e4-454b-a58e-086eca25cb7d-certs\") pod \"machine-config-server-brbpq\" (UID: \"0b51cbb6-b3e4-454b-a58e-086eca25cb7d\") " pod="openshift-machine-config-operator/machine-config-server-brbpq" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.028156 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpmt6\" (UniqueName: \"kubernetes.io/projected/86e5e62d-13a5-42bf-91ad-759b8be371f8-kube-api-access-rpmt6\") pod \"ingress-canary-c67m2\" (UID: \"86e5e62d-13a5-42bf-91ad-759b8be371f8\") " pod="openshift-ingress-canary/ingress-canary-c67m2" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.028539 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fe343c2a-87f6-45d4-a91d-3f86b9b5029b-socket-dir\") pod \"csi-hostpathplugin-ptsz7\" (UID: \"fe343c2a-87f6-45d4-a91d-3f86b9b5029b\") " pod="hostpath-provisioner/csi-hostpathplugin-ptsz7" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.028616 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/fe343c2a-87f6-45d4-a91d-3f86b9b5029b-plugins-dir\") pod 
\"csi-hostpathplugin-ptsz7\" (UID: \"fe343c2a-87f6-45d4-a91d-3f86b9b5029b\") " pod="hostpath-provisioner/csi-hostpathplugin-ptsz7" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.029214 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/fe343c2a-87f6-45d4-a91d-3f86b9b5029b-mountpoint-dir\") pod \"csi-hostpathplugin-ptsz7\" (UID: \"fe343c2a-87f6-45d4-a91d-3f86b9b5029b\") " pod="hostpath-provisioner/csi-hostpathplugin-ptsz7" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.029446 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a9a77bf3-5405-4051-abed-850371e2cfcc-config-volume\") pod \"dns-default-z6nb9\" (UID: \"a9a77bf3-5405-4051-abed-850371e2cfcc\") " pod="openshift-dns/dns-default-z6nb9" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.029560 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/fe343c2a-87f6-45d4-a91d-3f86b9b5029b-csi-data-dir\") pod \"csi-hostpathplugin-ptsz7\" (UID: \"fe343c2a-87f6-45d4-a91d-3f86b9b5029b\") " pod="hostpath-provisioner/csi-hostpathplugin-ptsz7" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.029603 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fe343c2a-87f6-45d4-a91d-3f86b9b5029b-registration-dir\") pod \"csi-hostpathplugin-ptsz7\" (UID: \"fe343c2a-87f6-45d4-a91d-3f86b9b5029b\") " pod="hostpath-provisioner/csi-hostpathplugin-ptsz7" Dec 05 20:14:00 crc kubenswrapper[4904]: E1205 20:14:00.029632 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:00.529611172 +0000 UTC m=+139.340827281 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.032773 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0b51cbb6-b3e4-454b-a58e-086eca25cb7d-node-bootstrap-token\") pod \"machine-config-server-brbpq\" (UID: \"0b51cbb6-b3e4-454b-a58e-086eca25cb7d\") " pod="openshift-machine-config-operator/machine-config-server-brbpq" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.035009 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/86e5e62d-13a5-42bf-91ad-759b8be371f8-cert\") pod \"ingress-canary-c67m2\" (UID: \"86e5e62d-13a5-42bf-91ad-759b8be371f8\") " pod="openshift-ingress-canary/ingress-canary-c67m2" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.036551 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/0b51cbb6-b3e4-454b-a58e-086eca25cb7d-certs\") pod \"machine-config-server-brbpq\" (UID: \"0b51cbb6-b3e4-454b-a58e-086eca25cb7d\") " pod="openshift-machine-config-operator/machine-config-server-brbpq" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.052185 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqgpz\" (UniqueName: \"kubernetes.io/projected/cd549d80-f654-4dd1-888c-a0b02d0e3afc-kube-api-access-zqgpz\") pod \"service-ca-operator-777779d784-jw92f\" (UID: \"cd549d80-f654-4dd1-888c-a0b02d0e3afc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jw92f" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.072119 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcnb6\" (UniqueName: \"kubernetes.io/projected/64993a4f-0592-4a1e-93e5-def54e2868ac-kube-api-access-zcnb6\") pod \"packageserver-d55dfcdfc-zdlcn\" (UID: \"64993a4f-0592-4a1e-93e5-def54e2868ac\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zdlcn" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.073265 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cjzcx" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.088614 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7wqc\" (UniqueName: \"kubernetes.io/projected/5b1664bf-b83a-4582-8018-ec55e02a4068-kube-api-access-s7wqc\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.099348 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-jw92f" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.112253 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9c314cca-9c67-4653-aa53-cdef478fabc2-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-sbdkl\" (UID: \"9c314cca-9c67-4653-aa53-cdef478fabc2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sbdkl" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.112510 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-pdnb5"] Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.113765 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-vw7kx"] Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.124973 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-bvmh2"] Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.126283 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-484jg"] Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.128936 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:14:00 crc kubenswrapper[4904]: E1205 20:14:00.129527 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:14:00.62949755 +0000 UTC m=+139.440713659 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.132279 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tq5n\" (UniqueName: \"kubernetes.io/projected/fa95c779-64a5-468d-99b1-5f2307031409-kube-api-access-8tq5n\") pod \"openshift-apiserver-operator-796bbdcf4f-92b57\" (UID: \"fa95c779-64a5-468d-99b1-5f2307031409\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-92b57" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.132967 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fmxl9"] Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.164856 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5b1664bf-b83a-4582-8018-ec55e02a4068-bound-sa-token\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.182846 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d61a48cd-8429-43a1-a7ef-50d57f4b397f-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-8x2kf\" (UID: \"d61a48cd-8429-43a1-a7ef-50d57f4b397f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8x2kf" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.188722 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-92b57" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.228733 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rwpq\" (UniqueName: \"kubernetes.io/projected/a70ee695-4cd0-4ad2-926f-4850e19e480f-kube-api-access-4rwpq\") pod \"downloads-7954f5f757-672dp\" (UID: \"a70ee695-4cd0-4ad2-926f-4850e19e480f\") " pod="openshift-console/downloads-7954f5f757-672dp" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.232384 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:00 crc kubenswrapper[4904]: E1205 20:14:00.232992 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:00.732972703 +0000 UTC m=+139.544188812 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.246217 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hltr7\" (UniqueName: \"kubernetes.io/projected/d8f5a146-90e5-47e2-a639-78a09eb00231-kube-api-access-hltr7\") pod \"collect-profiles-29416080-82bgg\" (UID: \"d8f5a146-90e5-47e2-a639-78a09eb00231\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-82bgg" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.265148 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2szvw\" (UniqueName: \"kubernetes.io/projected/9c314cca-9c67-4653-aa53-cdef478fabc2-kube-api-access-2szvw\") pod \"cluster-image-registry-operator-dc59b4c8b-sbdkl\" (UID: \"9c314cca-9c67-4653-aa53-cdef478fabc2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sbdkl" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.292396 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7dr4\" (UniqueName: \"kubernetes.io/projected/efb530a0-f68c-4664-81bf-32871d3b8259-kube-api-access-d7dr4\") pod \"control-plane-machine-set-operator-78cbb6b69f-888hj\" (UID: \"efb530a0-f68c-4664-81bf-32871d3b8259\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-888hj" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.292498 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9a77bf3-5405-4051-abed-850371e2cfcc-metrics-tls\") pod \"dns-default-z6nb9\" (UID: \"a9a77bf3-5405-4051-abed-850371e2cfcc\") " pod="openshift-dns/dns-default-z6nb9" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.292734 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4702bfcc-3b70-4600-8e4f-9137016423a6-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-hpnlq\" (UID: \"4702bfcc-3b70-4600-8e4f-9137016423a6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hpnlq" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.292735 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-s87dr\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " pod="openshift-authentication/oauth-openshift-558db77b4-s87dr" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.293820 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4702bfcc-3b70-4600-8e4f-9137016423a6-service-ca-bundle\") pod \"authentication-operator-69f744f599-hpnlq\" (UID: \"4702bfcc-3b70-4600-8e4f-9137016423a6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hpnlq" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.294093 4904 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/107b25d3-8a8d-4e82-9a93-d81251e8ff8d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wszp4\" (UID: \"107b25d3-8a8d-4e82-9a93-d81251e8ff8d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wszp4" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.294126 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4702bfcc-3b70-4600-8e4f-9137016423a6-config\") pod \"authentication-operator-69f744f599-hpnlq\" (UID: \"4702bfcc-3b70-4600-8e4f-9137016423a6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hpnlq" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.294398 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-s87dr\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " pod="openshift-authentication/oauth-openshift-558db77b4-s87dr" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.295446 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9rm4d"] Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.296233 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4702bfcc-3b70-4600-8e4f-9137016423a6-serving-cert\") pod \"authentication-operator-69f744f599-hpnlq\" (UID: \"4702bfcc-3b70-4600-8e4f-9137016423a6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hpnlq" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.298847 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-4bbwg"] Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.303442 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-479fl" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.304130 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcgcm\" (UniqueName: \"kubernetes.io/projected/6b6ea474-55bf-4540-a7b4-cdf769dd5d23-kube-api-access-lcgcm\") pod \"service-ca-9c57cc56f-zq5jf\" (UID: \"6b6ea474-55bf-4540-a7b4-cdf769dd5d23\") " pod="openshift-service-ca/service-ca-9c57cc56f-zq5jf" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.308086 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp6nq\" (UniqueName: \"kubernetes.io/projected/4702bfcc-3b70-4600-8e4f-9137016423a6-kube-api-access-bp6nq\") pod \"authentication-operator-69f744f599-hpnlq\" (UID: \"4702bfcc-3b70-4600-8e4f-9137016423a6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hpnlq" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.310051 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vrtsk"] Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.310848 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-sbrpp"] Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.310865 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-hpnlq" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.312430 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bppmw\" (UniqueName: \"kubernetes.io/projected/d7b6266d-9700-4622-9878-86ccda978b95-kube-api-access-bppmw\") pod \"machine-config-controller-84d6567774-d5xch\" (UID: \"d7b6266d-9700-4622-9878-86ccda978b95\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d5xch" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.312511 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svtxh\" (UniqueName: \"kubernetes.io/projected/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-kube-api-access-svtxh\") pod \"oauth-openshift-558db77b4-s87dr\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " pod="openshift-authentication/oauth-openshift-558db77b4-s87dr" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.329555 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh4xh\" (UniqueName: \"kubernetes.io/projected/f11bb19b-830b-4041-88ad-72df248ff8d1-kube-api-access-rh4xh\") pod \"catalog-operator-68c6474976-jg5fm\" (UID: \"f11bb19b-830b-4041-88ad-72df248ff8d1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jg5fm" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.333753 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:14:00 crc kubenswrapper[4904]: E1205 20:14:00.333906 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-05 20:14:00.833886062 +0000 UTC m=+139.645102171 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.334042 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.334305 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-zq5jf" Dec 05 20:14:00 crc kubenswrapper[4904]: E1205 20:14:00.334328 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:00.834315215 +0000 UTC m=+139.645531324 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.341933 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zdlcn" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.365708 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wszp4" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.367015 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpmt6\" (UniqueName: \"kubernetes.io/projected/86e5e62d-13a5-42bf-91ad-759b8be371f8-kube-api-access-rpmt6\") pod \"ingress-canary-c67m2\" (UID: \"86e5e62d-13a5-42bf-91ad-759b8be371f8\") " pod="openshift-ingress-canary/ingress-canary-c67m2" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.381887 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-82bgg" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.395350 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxnwc\" (UniqueName: \"kubernetes.io/projected/a9a77bf3-5405-4051-abed-850371e2cfcc-kube-api-access-fxnwc\") pod \"dns-default-z6nb9\" (UID: \"a9a77bf3-5405-4051-abed-850371e2cfcc\") " pod="openshift-dns/dns-default-z6nb9" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.400199 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-fzxff"] Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.401232 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cncqb"] Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.405356 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-s87dr" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.405795 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-l928g"] Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.411687 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-qmfqn"] Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.416985 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-655m2\" (UniqueName: \"kubernetes.io/projected/fe343c2a-87f6-45d4-a91d-3f86b9b5029b-kube-api-access-655m2\") pod \"csi-hostpathplugin-ptsz7\" (UID: \"fe343c2a-87f6-45d4-a91d-3f86b9b5029b\") " pod="hostpath-provisioner/csi-hostpathplugin-ptsz7" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.417265 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-888hj" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.418657 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-cpsd4"] Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.433123 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hpvk\" (UniqueName: \"kubernetes.io/projected/0b51cbb6-b3e4-454b-a58e-086eca25cb7d-kube-api-access-6hpvk\") pod \"machine-config-server-brbpq\" (UID: \"0b51cbb6-b3e4-454b-a58e-086eca25cb7d\") " pod="openshift-machine-config-operator/machine-config-server-brbpq" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.434488 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-c67m2" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.435043 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:14:00 crc kubenswrapper[4904]: E1205 20:14:00.435224 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-05 20:14:00.935189813 +0000 UTC m=+139.746405932 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.435277 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:00 crc kubenswrapper[4904]: E1205 20:14:00.435812 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:00.93580314 +0000 UTC m=+139.747019249 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.439876 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-brbpq" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.446315 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-z6nb9" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.447541 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8x2kf" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.466488 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-ptsz7" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.471815 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-pdnb5" event={"ID":"b02e39c5-31b4-4444-a500-cd7cbe327bec","Type":"ContainerStarted","Data":"5b9594bdee326e7568932fd1ba992bbc93a7156a19a69da11b663ef7ac69d3fc"} Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.472779 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-8p482" event={"ID":"6f0af025-acfb-4cfa-b413-6159067b8269","Type":"ContainerStarted","Data":"82b3bf294494d4fc4d42f20203f7c42f4b2c58445459876be2d694fbadb67c4a"} Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.472806 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-8p482" event={"ID":"6f0af025-acfb-4cfa-b413-6159067b8269","Type":"ContainerStarted","Data":"19a052d6d498c9fe025d69a0572439704e8bc03131a78bf07ad18dedb31ce7f3"} Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.473415 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-sbrpp" event={"ID":"954b7859-3b75-48f2-b1ed-d70932da35ba","Type":"ContainerStarted","Data":"d5152cafc0c5c48cb3411e669fde54490f0d57ff08cd80456459766293a8ba20"} Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.473980 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-484jg" event={"ID":"214b71b9-85d3-4be2-be71-311e9097f89b","Type":"ContainerStarted","Data":"ef25010da9a4b54e024b8958627eb59f60fa53001aaa66dbf90e10c314908ce9"} Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.474766 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bvmh2" event={"ID":"c6c9df4a-394c-4a6c-9132-5fefe0ed672d","Type":"ContainerStarted","Data":"abb4500558d7829f7d65896040e132909fe1cd8d32814b7aa3f79d45eca72885"} Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.480588 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jz8jl" event={"ID":"9b9f12dc-0021-4479-9b7e-60da4e3f27b0","Type":"ContainerStarted","Data":"f3a643e6a911484bd636eba1758e47326b10079f6c179d92337dbf1217d8e059"} Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.480625 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jz8jl" event={"ID":"9b9f12dc-0021-4479-9b7e-60da4e3f27b0","Type":"ContainerStarted","Data":"45172b54a5eefb358b99fb7c3d430e0e61924b555dfd27ba5810bdc81e36fa0a"} Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.482656 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vrtsk" event={"ID":"6303238b-f708-4f64-bacb-6a68e7a14425","Type":"ContainerStarted","Data":"ec527cd90b4b0dacb7e4859dee945ab2ab97fb4c458138d97a369ba734a1b0c8"} Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.490253 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jnrj8"] Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.490291 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fmxl9" 
event={"ID":"28306b3f-ee7a-48be-905b-3ba0e6d7ef74","Type":"ContainerStarted","Data":"a545ab0810c7825ac8866037c4f39f8b2d4cc34d2431140b2175fc85873f8175"} Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.492746 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j9gnc" event={"ID":"eb4b4c10-971e-4766-b632-9f710ec547a6","Type":"ContainerStarted","Data":"a8cd5364e3c549b9a1db98a9f1a16e750633bd970396714d9bd256a95a371d22"} Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.496926 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sbdkl" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.502138 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-672dp" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.518735 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-4bbwg" event={"ID":"dd1d7a59-ae42-455f-9617-9fa7eb152ce9","Type":"ContainerStarted","Data":"ed39bbc0b21bd20b0c5a6eb3dd4ad5e660a9dc9e959224618afa6c8c0b82fa2b"} Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.520711 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vw7kx" event={"ID":"1d4d3dc1-19e8-4648-907c-ede5dd5e107e","Type":"ContainerStarted","Data":"bf1770b0bde154bed1e41dfc229333d0156b189cdf8b797af868e7457ac49ed4"} Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.536022 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:14:00 crc kubenswrapper[4904]: E1205 20:14:00.536411 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:14:01.03639633 +0000 UTC m=+139.847612439 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.577022 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5srbq"] Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.592486 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d5xch" Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.602019 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t22nt"] Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.625676 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jg5fm" Dec 05 20:14:00 crc kubenswrapper[4904]: W1205 20:14:00.635098 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cb54914_08dc_4044_af38_9375669d5a2a.slice/crio-e3673452ee2a6edc3afaf7ad40304d59320f2544c7fecc9a72f58d7d48b0edb3 WatchSource:0}: Error finding container e3673452ee2a6edc3afaf7ad40304d59320f2544c7fecc9a72f58d7d48b0edb3: Status 404 returned error can't find the container with id e3673452ee2a6edc3afaf7ad40304d59320f2544c7fecc9a72f58d7d48b0edb3 Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.639842 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:00 crc kubenswrapper[4904]: E1205 20:14:00.640231 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:01.140215693 +0000 UTC m=+139.951431802 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.741013 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:14:00 crc kubenswrapper[4904]: E1205 20:14:00.741360 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:14:01.241331017 +0000 UTC m=+140.052547136 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.741406 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:00 crc kubenswrapper[4904]: E1205 20:14:00.741726 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:01.241714768 +0000 UTC m=+140.052930877 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.821214 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-92b57"] Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.841890 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:14:00 crc kubenswrapper[4904]: E1205 20:14:00.842036 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:14:01.34201615 +0000 UTC m=+140.153232259 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.842295 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:00 crc kubenswrapper[4904]: E1205 20:14:00.842655 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:01.342612617 +0000 UTC m=+140.153828726 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.908668 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cjzcx"] Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.911635 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-jw92f"] Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.943820 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:14:00 crc kubenswrapper[4904]: E1205 20:14:00.943991 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:14:01.443966478 +0000 UTC m=+140.255182587 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:00 crc kubenswrapper[4904]: I1205 20:14:00.944036 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:00 crc kubenswrapper[4904]: E1205 20:14:00.944318 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:01.444311468 +0000 UTC m=+140.255527577 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:01 crc kubenswrapper[4904]: W1205 20:14:01.037531 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4589bc8e_2864_4d7e_a049_a3bf264bb997.slice/crio-adc7214960aceb7be45b16c31538aa46477609b162379fb5d4799abb44d4957b WatchSource:0}: Error finding container adc7214960aceb7be45b16c31538aa46477609b162379fb5d4799abb44d4957b: Status 404 returned error can't find the container with id adc7214960aceb7be45b16c31538aa46477609b162379fb5d4799abb44d4957b Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.046439 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:14:01 crc kubenswrapper[4904]: E1205 20:14:01.047117 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:14:01.547090051 +0000 UTC m=+140.358306200 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.148470 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:01 crc kubenswrapper[4904]: E1205 20:14:01.148912 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:01.648893955 +0000 UTC m=+140.460110054 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.252697 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:14:01 crc kubenswrapper[4904]: E1205 20:14:01.253018 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:14:01.753005586 +0000 UTC m=+140.564221695 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.354192 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:01 crc kubenswrapper[4904]: E1205 20:14:01.354509 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:01.854498472 +0000 UTC m=+140.665714581 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.442213 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sbdkl"] Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.455335 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:14:01 crc kubenswrapper[4904]: E1205 20:14:01.455735 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:14:01.95572135 +0000 UTC m=+140.766937459 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.520974 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-479fl"] Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.556848 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-s87dr"] Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.558599 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:01 crc kubenswrapper[4904]: E1205 20:14:01.558937 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:02.058924005 +0000 UTC m=+140.870140114 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.563793 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-hpnlq"] Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.564934 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-zq5jf"] Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.581230 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zdlcn"] Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.584797 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-4bbwg" event={"ID":"dd1d7a59-ae42-455f-9617-9fa7eb152ce9","Type":"ContainerStarted","Data":"889798f55f896e040e343c8b848a5eef22f93d3e891a961f70f139ab98c18da7"} Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.584835 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-4bbwg" Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.588261 4904 patch_prober.go:28] interesting pod/console-operator-58897d9998-4bbwg container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= 
Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.588293 4904 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-4bbwg" podUID="dd1d7a59-ae42-455f-9617-9fa7eb152ce9" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.601682 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-sbrpp" event={"ID":"954b7859-3b75-48f2-b1ed-d70932da35ba","Type":"ContainerStarted","Data":"35da78207cc8bbba8779dda95115a399bafb70ecb93b5aa10fb43a39423e8798"} Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.603488 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cjzcx" event={"ID":"d5093d22-5bb5-4ec0-9b73-17d82af4afd7","Type":"ContainerStarted","Data":"a1b359d057acc1e54fce9184c3a59a24289cc9a9602d6590bcbf661a3417c237"} Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.608329 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416080-82bgg"] Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.621249 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-888hj"] Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.644398 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-qmfqn" event={"ID":"95df99ca-9a0a-465c-96b6-968c0a01235a","Type":"ContainerStarted","Data":"de15425573edeee172285dfc7ee033dee7eb395f3a1d4e76440f87c9f6ded8f9"} Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.658528 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vrtsk" event={"ID":"6303238b-f708-4f64-bacb-6a68e7a14425","Type":"ContainerStarted","Data":"c14d11c62635af7f0b3c822e7c74e51b03237d708d28b84e86661d1290b5e911"} Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.659401 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vrtsk" Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.659956 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.661274 4904 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-vrtsk container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.661312 4904 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vrtsk" podUID="6303238b-f708-4f64-bacb-6a68e7a14425" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" Dec 05 20:14:01 crc 
kubenswrapper[4904]: E1205 20:14:01.661419 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:14:02.161401808 +0000 UTC m=+140.972617977 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.669949 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-cncqb" event={"ID":"834a438d-9f7d-4707-b790-8ce136081f7c","Type":"ContainerStarted","Data":"52ab922c318333b9aa6ad379b5508622f6f1ed384f1f7737d590b5ae1c26fcd4"} Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.672478 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-tkgxl" event={"ID":"4589bc8e-2864-4d7e-a049-a3bf264bb997","Type":"ContainerStarted","Data":"adc7214960aceb7be45b16c31538aa46477609b162379fb5d4799abb44d4957b"} Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.677994 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-l928g" event={"ID":"11ca5edb-7664-4e63-a9e8-46f270623ad2","Type":"ContainerStarted","Data":"65946099b7f3466471850f34d1d3578f0cc22f6d2020ffcb2eb1834783126154"} Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.755483 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9rm4d" event={"ID":"9032a5cf-b023-4e91-bc39-52fdd93472ca","Type":"ContainerStarted","Data":"9816577b843673304dc1b8d6a686a9b945ac4308841edbd9e82b34c3158bd3a2"} Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.759316 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t22nt" event={"ID":"3cb54914-08dc-4044-af38-9375669d5a2a","Type":"ContainerStarted","Data":"e3673452ee2a6edc3afaf7ad40304d59320f2544c7fecc9a72f58d7d48b0edb3"} Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.761205 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:01 crc kubenswrapper[4904]: E1205 20:14:01.761929 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:02.261909735 +0000 UTC m=+141.073125904 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.768534 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cpsd4" event={"ID":"93be2498-3695-49d6-8bf4-ebea6010e925","Type":"ContainerStarted","Data":"30dc55fe0cc31c8bc8bb0f2a051ca208fbad18d434479b64e4940d307543cd4a"} Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.769362 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cpsd4" event={"ID":"93be2498-3695-49d6-8bf4-ebea6010e925","Type":"ContainerStarted","Data":"68866ca1a36fb836e0bdddd28a773184c02cae06eef2034aebbc0fb66982eb45"} Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.771632 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-brbpq" event={"ID":"0b51cbb6-b3e4-454b-a58e-086eca25cb7d","Type":"ContainerStarted","Data":"cf772274fddba182025764b236f60c173b6f04d9c3dae85137d65a1b48a8744f"} Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.843715 4904 generic.go:334] "Generic (PLEG): container finished" podID="c6c9df4a-394c-4a6c-9132-5fefe0ed672d" containerID="fe3122417dd0d388c3183774822f1d42128047fbdc06e5740cdd2e799e772b66" exitCode=0 Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.844073 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bvmh2" event={"ID":"c6c9df4a-394c-4a6c-9132-5fefe0ed672d","Type":"ContainerDied","Data":"fe3122417dd0d388c3183774822f1d42128047fbdc06e5740cdd2e799e772b66"} Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.863266 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-jw92f" event={"ID":"cd549d80-f654-4dd1-888c-a0b02d0e3afc","Type":"ContainerStarted","Data":"b018444953349466df7910e84e94dd8e62a8b1db9b542787843fe8fbb2c4a67e"} Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.866501 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:14:01 crc kubenswrapper[4904]: E1205 20:14:01.866720 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:14:02.366678336 +0000 UTC m=+141.177894445 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.866948 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:01 crc kubenswrapper[4904]: E1205 20:14:01.867587 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:02.367567821 +0000 UTC m=+141.178784020 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.874307 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fmxl9" event={"ID":"28306b3f-ee7a-48be-905b-3ba0e6d7ef74","Type":"ContainerStarted","Data":"a68d4e5d38567d4c4c1bc42f735be9d49c3f28f184a1f153a06de70bf5992bd2"} Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.879592 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j9gnc" event={"ID":"eb4b4c10-971e-4766-b632-9f710ec547a6","Type":"ContainerStarted","Data":"8904060565b040d35f64808edfb159781a589227443fd9d003561f9dd3d83f97"} Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.883279 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j9gnc" Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.883480 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fzxff" event={"ID":"67c5324d-bb68-4bbc-a0f6-e0452d6f4155","Type":"ContainerStarted","Data":"09b010fc8ed64fb88a92e328ed3ee546a9cec7072b56a11e201e8e8abf81d069"} Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.884111 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5srbq" event={"ID":"6aa0efa0-bd09-4388-b42c-11550e28712e","Type":"ContainerStarted","Data":"d0f777344c89dfe481cdd7016f8f2826e1366d7ea172b722ccbd9170fe01063d"} Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.910307 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-484jg" 
event={"ID":"214b71b9-85d3-4be2-be71-311e9097f89b","Type":"ContainerStarted","Data":"c72a976e50dc11bfa7909f0ad85ef3c9cae6918d49d8d0c74aa5babe49a386e7"} Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.913656 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jnrj8" event={"ID":"6bd0c7fa-69da-45b6-8008-8fc080ca4ba6","Type":"ContainerStarted","Data":"9ad80129e9d23bb215b4cd3224080397b3180ad3c80c1070b614f79bf76196fc"} Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.920015 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-92b57" event={"ID":"fa95c779-64a5-468d-99b1-5f2307031409","Type":"ContainerStarted","Data":"58c33cfa3d38207fb1e11f34301f1d99c15a4f9f48a88ac2aee9824bda98720e"} Dec 05 20:14:01 crc kubenswrapper[4904]: I1205 20:14:01.968561 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:14:01 crc kubenswrapper[4904]: E1205 20:14:01.970626 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:14:02.470604291 +0000 UTC m=+141.281820390 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:02 crc kubenswrapper[4904]: I1205 20:14:02.074364 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:02 crc kubenswrapper[4904]: E1205 20:14:02.074687 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:02.574673041 +0000 UTC m=+141.385889150 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:02 crc kubenswrapper[4904]: I1205 20:14:02.171670 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j9gnc" Dec 05 20:14:02 crc kubenswrapper[4904]: I1205 20:14:02.175781 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:14:02 crc kubenswrapper[4904]: E1205 20:14:02.176151 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:14:02.676138476 +0000 UTC m=+141.487354575 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:02 crc kubenswrapper[4904]: I1205 20:14:02.209400 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-4bbwg" podStartSLOduration=121.209378734 podStartE2EDuration="2m1.209378734s" podCreationTimestamp="2025-12-05 20:12:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:14:02.205415089 +0000 UTC m=+141.016631198" watchObservedRunningTime="2025-12-05 20:14:02.209378734 +0000 UTC m=+141.020594843" Dec 05 20:14:02 crc kubenswrapper[4904]: I1205 20:14:02.286597 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:02 crc kubenswrapper[4904]: E1205 20:14:02.287556 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:02.787545167 +0000 UTC m=+141.598761276 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:02 crc kubenswrapper[4904]: I1205 20:14:02.344653 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vrtsk" podStartSLOduration=120.344637263 podStartE2EDuration="2m0.344637263s" podCreationTimestamp="2025-12-05 20:12:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:14:02.343017256 +0000 UTC m=+141.154233375" watchObservedRunningTime="2025-12-05 20:14:02.344637263 +0000 UTC m=+141.155853372" Dec 05 20:14:02 crc kubenswrapper[4904]: I1205 20:14:02.387884 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:14:02 crc kubenswrapper[4904]: E1205 20:14:02.388280 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:14:02.88826047 +0000 UTC m=+141.699476569 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:02 crc kubenswrapper[4904]: I1205 20:14:02.407458 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8x2kf"] Dec 05 20:14:02 crc kubenswrapper[4904]: I1205 20:14:02.434395 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-672dp"] Dec 05 20:14:02 crc kubenswrapper[4904]: I1205 20:14:02.476132 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wszp4"] Dec 05 20:14:02 crc kubenswrapper[4904]: I1205 20:14:02.512566 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jg5fm"] Dec 05 20:14:02 crc kubenswrapper[4904]: I1205 20:14:02.517251 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:02 crc kubenswrapper[4904]: E1205 20:14:02.523341 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:03.023325304 +0000 UTC m=+141.834541413 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:02 crc kubenswrapper[4904]: I1205 20:14:02.553518 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fmxl9" podStartSLOduration=121.553501084 podStartE2EDuration="2m1.553501084s" podCreationTimestamp="2025-12-05 20:12:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:14:02.498818007 +0000 UTC m=+141.310034116" watchObservedRunningTime="2025-12-05 20:14:02.553501084 +0000 UTC m=+141.364717193" Dec 05 20:14:02 crc kubenswrapper[4904]: I1205 20:14:02.588970 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-8p482" podStartSLOduration=121.588954375 podStartE2EDuration="2m1.588954375s" podCreationTimestamp="2025-12-05 20:12:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:14:02.528012239 +0000 UTC m=+141.339228368" watchObservedRunningTime="2025-12-05 20:14:02.588954375 +0000 UTC m=+141.400170484" Dec 05 20:14:02 crc kubenswrapper[4904]: I1205 20:14:02.605871 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-d5xch"] Dec 05 20:14:02 crc kubenswrapper[4904]: I1205 20:14:02.619680 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j9gnc" podStartSLOduration=120.61966284 podStartE2EDuration="2m0.61966284s" podCreationTimestamp="2025-12-05 20:12:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:14:02.605845682 +0000 UTC m=+141.417061801" watchObservedRunningTime="2025-12-05 20:14:02.61966284 +0000 UTC m=+141.430878939" Dec 05 20:14:02 crc kubenswrapper[4904]: I1205 20:14:02.619929 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-z6nb9"] Dec 05 20:14:02 crc kubenswrapper[4904]: I1205 20:14:02.634951 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-c67m2"] Dec 05 20:14:02 crc kubenswrapper[4904]: I1205 20:14:02.669017 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:14:02 crc kubenswrapper[4904]: E1205 20:14:02.669360 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-05 20:14:03.169344072 +0000 UTC m=+141.980560181 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:02 crc kubenswrapper[4904]: I1205 20:14:02.679457 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-ptsz7"] Dec 05 20:14:02 crc kubenswrapper[4904]: W1205 20:14:02.717965 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf11bb19b_830b_4041_88ad_72df248ff8d1.slice/crio-b4cff67feb7aa7b16c358e41af3a8dba75f6f0727e8901d0841efdf9a991cd29 WatchSource:0}: Error finding container b4cff67feb7aa7b16c358e41af3a8dba75f6f0727e8901d0841efdf9a991cd29: Status 404 returned error can't find the container with id b4cff67feb7aa7b16c358e41af3a8dba75f6f0727e8901d0841efdf9a991cd29 Dec 05 20:14:02 crc kubenswrapper[4904]: W1205 20:14:02.740430 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe343c2a_87f6_45d4_a91d_3f86b9b5029b.slice/crio-14dc48bac2393ff2e01290b3390794e144c573bdb8333a562f1199f26a085955 WatchSource:0}: Error finding container 14dc48bac2393ff2e01290b3390794e144c573bdb8333a562f1199f26a085955: Status 404 returned error can't find the container with id 14dc48bac2393ff2e01290b3390794e144c573bdb8333a562f1199f26a085955 Dec 05 20:14:02 crc kubenswrapper[4904]: I1205 20:14:02.772196 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:02 crc kubenswrapper[4904]: E1205 20:14:02.772591 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:03.272580678 +0000 UTC m=+142.083796787 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:02 crc kubenswrapper[4904]: W1205 20:14:02.802019 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod107b25d3_8a8d_4e82_9a93_d81251e8ff8d.slice/crio-56f46881731b4368f480da412139e5d88c3a7a1051d4e4344846f3e1f40a1eba WatchSource:0}: Error finding container 56f46881731b4368f480da412139e5d88c3a7a1051d4e4344846f3e1f40a1eba: Status 404 returned error can't find the container with id 56f46881731b4368f480da412139e5d88c3a7a1051d4e4344846f3e1f40a1eba Dec 05 20:14:02 crc kubenswrapper[4904]: I1205 20:14:02.873232 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:14:02 crc kubenswrapper[4904]: E1205 20:14:02.873442 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:14:03.373401674 +0000 UTC m=+142.184617783 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:02 crc kubenswrapper[4904]: I1205 20:14:02.873730 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:02 crc kubenswrapper[4904]: E1205 20:14:02.874201 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:03.374175046 +0000 UTC m=+142.185391145 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:02 crc kubenswrapper[4904]: W1205 20:14:02.884857 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7b6266d_9700_4622_9878_86ccda978b95.slice/crio-ae89c53ca54c8372f068201a9139e5ed1348f32e342ba0c6ad3678bc3ef29816 WatchSource:0}: Error finding container ae89c53ca54c8372f068201a9139e5ed1348f32e342ba0c6ad3678bc3ef29816: Status 404 returned error can't find the container with id ae89c53ca54c8372f068201a9139e5ed1348f32e342ba0c6ad3678bc3ef29816 Dec 05 20:14:02 crc kubenswrapper[4904]: I1205 20:14:02.965924 4904 generic.go:334] "Generic (PLEG): container finished" podID="1d4d3dc1-19e8-4648-907c-ede5dd5e107e" containerID="643ee76c5b06b2b36006988fcf8d22a7fe2dcdb95059bceeb5cabcf90bd820ca" exitCode=0 Dec 05 20:14:02 crc kubenswrapper[4904]: I1205 20:14:02.966274 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vw7kx" event={"ID":"1d4d3dc1-19e8-4648-907c-ede5dd5e107e","Type":"ContainerDied","Data":"643ee76c5b06b2b36006988fcf8d22a7fe2dcdb95059bceeb5cabcf90bd820ca"} Dec 05 20:14:02 crc kubenswrapper[4904]: I1205 20:14:02.974846 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:14:02 crc kubenswrapper[4904]: E1205 20:14:02.975531 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:14:03.475515968 +0000 UTC m=+142.286732077 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:02 crc kubenswrapper[4904]: I1205 20:14:02.981554 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5srbq" event={"ID":"6aa0efa0-bd09-4388-b42c-11550e28712e","Type":"ContainerStarted","Data":"8a28c10c2ce3fae8fdf507eda181000a380eafffe5debb71086e40ddaeedfb0b"} Dec 05 20:14:02 crc kubenswrapper[4904]: I1205 20:14:02.982394 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-5srbq" Dec 05 20:14:02 crc kubenswrapper[4904]: I1205 20:14:02.984448 4904 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-5srbq container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/healthz\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Dec 05 20:14:02 crc kubenswrapper[4904]: I1205 20:14:02.984495 4904 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-5srbq" podUID="6aa0efa0-bd09-4388-b42c-11550e28712e" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.21:8080/healthz\": dial tcp 10.217.0.21:8080: connect: connection refused" Dec 05 20:14:02 crc kubenswrapper[4904]: I1205 20:14:02.986168 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d5xch" event={"ID":"d7b6266d-9700-4622-9878-86ccda978b95","Type":"ContainerStarted","Data":"ae89c53ca54c8372f068201a9139e5ed1348f32e342ba0c6ad3678bc3ef29816"} Dec 05 20:14:02 crc kubenswrapper[4904]: I1205 20:14:02.992699 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ptsz7" event={"ID":"fe343c2a-87f6-45d4-a91d-3f86b9b5029b","Type":"ContainerStarted","Data":"14dc48bac2393ff2e01290b3390794e144c573bdb8333a562f1199f26a085955"} Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:02.996561 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-zq5jf" event={"ID":"6b6ea474-55bf-4540-a7b4-cdf769dd5d23","Type":"ContainerStarted","Data":"4bc98bca68b125d99c3bc1e082a6df02e31d87e804cab84b3f6826fdb830ea1a"} Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:02.998522 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-z6nb9" event={"ID":"a9a77bf3-5405-4051-abed-850371e2cfcc","Type":"ContainerStarted","Data":"2e431845592f9c781d9113edd12fdccd62abe7871ab5f3f78eea9232cdcb2c2f"} Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.001568 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-tkgxl" event={"ID":"4589bc8e-2864-4d7e-a049-a3bf264bb997","Type":"ContainerStarted","Data":"b5def2e0e8ec2ae298c974edf469289fb76f97a9a5d74139986807cdb623deed"} Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.007257 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication-operator/authentication-operator-69f744f599-hpnlq" event={"ID":"4702bfcc-3b70-4600-8e4f-9137016423a6","Type":"ContainerStarted","Data":"241dc765bf0b3ade39dabec3c337929433b85cb8475b1b3615c6f73841457920"} Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.024037 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t22nt" event={"ID":"3cb54914-08dc-4044-af38-9375669d5a2a","Type":"ContainerStarted","Data":"e913410ad3fe6ac5abe37a709e5310000375095d13f73a8978807a528904511a"} Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.026180 4904 generic.go:334] "Generic (PLEG): container finished" podID="954b7859-3b75-48f2-b1ed-d70932da35ba" containerID="35da78207cc8bbba8779dda95115a399bafb70ecb93b5aa10fb43a39423e8798" exitCode=0 Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.028800 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-sbrpp" event={"ID":"954b7859-3b75-48f2-b1ed-d70932da35ba","Type":"ContainerDied","Data":"35da78207cc8bbba8779dda95115a399bafb70ecb93b5aa10fb43a39423e8798"} Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.028826 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-sbrpp" event={"ID":"954b7859-3b75-48f2-b1ed-d70932da35ba","Type":"ContainerStarted","Data":"bc1bee779ecf63cd94cd78e2f4302710b973d6bc121cbe58c5ab9ffd43954867"} Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.030393 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-sbrpp" Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.057591 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-tkgxl" podStartSLOduration=121.057566293 podStartE2EDuration="2m1.057566293s" podCreationTimestamp="2025-12-05 20:12:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:14:03.051474237 +0000 UTC m=+141.862690366" watchObservedRunningTime="2025-12-05 20:14:03.057566293 +0000 UTC m=+141.868782402" Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.061084 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-l928g" event={"ID":"11ca5edb-7664-4e63-a9e8-46f270623ad2","Type":"ContainerStarted","Data":"8333cb9b80ee15546cd4b7ce613ad2ff835b208960eca25a677230cfe722b59c"} Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.063849 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zdlcn" event={"ID":"64993a4f-0592-4a1e-93e5-def54e2868ac","Type":"ContainerStarted","Data":"6f2ed44a59bf22bce112a816d4d1069dbbd29b089280aed7d0fdcf1a8d000dda"} Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.076253 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-82bgg" event={"ID":"d8f5a146-90e5-47e2-a639-78a09eb00231","Type":"ContainerStarted","Data":"d6ab658002c7c5f6f9f2530c2b62200a491d72ff7650009541d35f71beb87663"} Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.076735 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:03 crc kubenswrapper[4904]: E1205 20:14:03.079124 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:03.579105934 +0000 UTC m=+142.390322103 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.087225 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wszp4" event={"ID":"107b25d3-8a8d-4e82-9a93-d81251e8ff8d","Type":"ContainerStarted","Data":"56f46881731b4368f480da412139e5d88c3a7a1051d4e4344846f3e1f40a1eba"} Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.111560 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-5srbq" podStartSLOduration=121.111541899 podStartE2EDuration="2m1.111541899s" podCreationTimestamp="2025-12-05 20:12:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:14:03.109606573 +0000 UTC m=+141.920822692" watchObservedRunningTime="2025-12-05 20:14:03.111541899 +0000 UTC m=+141.922758008" Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.136324 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-c67m2" event={"ID":"86e5e62d-13a5-42bf-91ad-759b8be371f8","Type":"ContainerStarted","Data":"c10005cfcc8aa127b0f00e20ca43d8598712d838221a0af93682fdac597655cf"} Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.151536 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t22nt" podStartSLOduration=121.151517931 podStartE2EDuration="2m1.151517931s" podCreationTimestamp="2025-12-05 20:12:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:14:03.149559974 +0000 UTC m=+141.960776083" watchObservedRunningTime="2025-12-05 20:14:03.151517931 +0000 UTC m=+141.962734060" Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.162205 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-s87dr" event={"ID":"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68","Type":"ContainerStarted","Data":"fc46a78b2952ed56578fc9e52b3638fcdff71d357c2f296460a6fb47d9ce9c9f"} Dec 05 20:14:03 crc kubenswrapper[4904]: E1205 20:14:03.178600 4904 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:14:03.678585302 +0000 UTC m=+142.489801411 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.178518 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.178903 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:03 crc kubenswrapper[4904]: E1205 20:14:03.180227 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:03.680211618 +0000 UTC m=+142.491427727 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.181504 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-888hj" event={"ID":"efb530a0-f68c-4664-81bf-32871d3b8259","Type":"ContainerStarted","Data":"bdf5161869a75eca8fa4a4e392fd47da2a2673d8721c3a5116364fd4fbf09daf"} Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.181535 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-888hj" event={"ID":"efb530a0-f68c-4664-81bf-32871d3b8259","Type":"ContainerStarted","Data":"00d598a78667883ed5fd00db70c5f05ce9070490aab97010c65a9e6917454e13"} Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.208116 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-sbrpp" podStartSLOduration=122.208101782 podStartE2EDuration="2m2.208101782s" podCreationTimestamp="2025-12-05 20:12:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:14:03.20489124 +0000 UTC m=+142.016107359" watchObservedRunningTime="2025-12-05 20:14:03.208101782 +0000 UTC m=+142.019317891" Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.244798 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cpsd4" event={"ID":"93be2498-3695-49d6-8bf4-ebea6010e925","Type":"ContainerStarted","Data":"157f98cd97a18257573a56f49ba9f43a20fecdb5021a6c74b0fd63a80b6f70f6"} Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.285777 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-92b57" event={"ID":"fa95c779-64a5-468d-99b1-5f2307031409","Type":"ContainerStarted","Data":"0699b0bd60cc6259bb4ef235173f9c16301cc3ba8eac017307e4ca49ae5fce88"} Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.291725 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-888hj" podStartSLOduration=121.291708202 podStartE2EDuration="2m1.291708202s" podCreationTimestamp="2025-12-05 20:12:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:14:03.244180772 +0000 UTC m=+142.055396911" watchObservedRunningTime="2025-12-05 20:14:03.291708202 +0000 UTC m=+142.102924311" Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.294598 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.295243 4904 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sbdkl" event={"ID":"9c314cca-9c67-4653-aa53-cdef478fabc2","Type":"ContainerStarted","Data":"63aa9271fff92997f8b5c06e1978b1bd20c23a130cc58104620a5282998e2039"} Dec 05 20:14:03 crc kubenswrapper[4904]: E1205 20:14:03.295536 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:14:03.795517692 +0000 UTC m=+142.606733801 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.369506 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cpsd4" podStartSLOduration=121.369455433 podStartE2EDuration="2m1.369455433s" podCreationTimestamp="2025-12-05 20:12:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:14:03.285514984 +0000 UTC m=+142.096731123" watchObservedRunningTime="2025-12-05 20:14:03.369455433 +0000 UTC m=+142.180671532" Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.388931 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-pdnb5" event={"ID":"b02e39c5-31b4-4444-a500-cd7cbe327bec","Type":"ContainerStarted","Data":"1fdc6e51f7378f59e5d7357d2f409353ec29b50be427af4f2b711dde4c9e1887"} Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.396455 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jg5fm" event={"ID":"f11bb19b-830b-4041-88ad-72df248ff8d1","Type":"ContainerStarted","Data":"b4cff67feb7aa7b16c358e41af3a8dba75f6f0727e8901d0841efdf9a991cd29"} Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.397968 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8x2kf" event={"ID":"d61a48cd-8429-43a1-a7ef-50d57f4b397f","Type":"ContainerStarted","Data":"8fee3898a8542ee7a39494bc770a2b68ad3d203317f08cf89b0a27b3b38cdd57"} Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.410031 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-92b57" podStartSLOduration=121.410015242 podStartE2EDuration="2m1.410015242s" podCreationTimestamp="2025-12-05 20:12:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:14:03.365357845 +0000 UTC m=+142.176573964" watchObservedRunningTime="2025-12-05 20:14:03.410015242 +0000 UTC m=+142.221231351" Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.410273 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:03 crc kubenswrapper[4904]: E1205 20:14:03.410595 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:03.910579939 +0000 UTC m=+142.721796048 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.418754 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9rm4d" event={"ID":"9032a5cf-b023-4e91-bc39-52fdd93472ca","Type":"ContainerStarted","Data":"152a07712918ec22cc5144c3ba24df864d36da6cbf5a824aa425a9dc091120dc"} Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.434756 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-pdnb5" podStartSLOduration=122.434668823 podStartE2EDuration="2m2.434668823s" podCreationTimestamp="2025-12-05 20:12:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:14:03.430960076 +0000 UTC m=+142.242176205" watchObservedRunningTime="2025-12-05 20:14:03.434668823 +0000 UTC m=+142.245884932" Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.435264 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sbdkl" podStartSLOduration=121.4352564 podStartE2EDuration="2m1.4352564s" podCreationTimestamp="2025-12-05 20:12:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:14:03.41129794 +0000 UTC m=+142.222514059" watchObservedRunningTime="2025-12-05 20:14:03.4352564 +0000 UTC m=+142.246472509" Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.473401 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-479fl" event={"ID":"c0a3c4fb-7c70-437e-8d05-7104879b59c9","Type":"ContainerStarted","Data":"cdcd359e1088be85afd84a4ec39dae9c9370fc43c589a1a97645a40fa8aa00bf"} Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.473455 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-479fl" event={"ID":"c0a3c4fb-7c70-437e-8d05-7104879b59c9","Type":"ContainerStarted","Data":"4f2a0b08c09f88a95cde6454521eef219a74ff20503cf142b333bab4b9a16a3e"} Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.483555 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-672dp" 
event={"ID":"a70ee695-4cd0-4ad2-926f-4850e19e480f","Type":"ContainerStarted","Data":"d5267fbf434bc37ba6753b5b8fbdc696586e0ca05124541d9a05315ebcc069e2"} Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.488894 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vrtsk" Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.500848 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-4bbwg" Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.515653 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:14:03 crc kubenswrapper[4904]: E1205 20:14:03.517432 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:14:04.017411729 +0000 UTC m=+142.828627838 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:03 crc kubenswrapper[4904]: E1205 20:14:03.665808 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:04.165796756 +0000 UTC m=+142.977012865 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.700154 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.802957 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:14:03 crc kubenswrapper[4904]: E1205 20:14:03.803089 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:14:04.303039392 +0000 UTC m=+143.114255501 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.803249 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:03 crc kubenswrapper[4904]: E1205 20:14:03.803670 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:04.303651129 +0000 UTC m=+143.114867238 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.904594 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:14:03 crc kubenswrapper[4904]: E1205 20:14:03.905011 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:14:04.40498993 +0000 UTC m=+143.216206039 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.926989 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-tkgxl" Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.944131 4904 patch_prober.go:28] interesting pod/router-default-5444994796-tkgxl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 20:14:03 crc kubenswrapper[4904]: [-]has-synced failed: reason withheld Dec 05 20:14:03 crc kubenswrapper[4904]: [+]process-running ok Dec 05 20:14:03 crc kubenswrapper[4904]: healthz check failed Dec 05 20:14:03 crc kubenswrapper[4904]: I1205 20:14:03.944173 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tkgxl" podUID="4589bc8e-2864-4d7e-a049-a3bf264bb997" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.008694 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:04 crc kubenswrapper[4904]: E1205 20:14:04.009100 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:04.509086701 +0000 UTC m=+143.320302810 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.123280 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:14:04 crc kubenswrapper[4904]: E1205 20:14:04.123617 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:14:04.623603212 +0000 UTC m=+143.434819321 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.249011 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:04 crc kubenswrapper[4904]: E1205 20:14:04.249988 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:04.749973414 +0000 UTC m=+143.561189523 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.351082 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:14:04 crc kubenswrapper[4904]: E1205 20:14:04.351419 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:14:04.851403578 +0000 UTC m=+143.662619687 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.454192 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:04 crc kubenswrapper[4904]: E1205 20:14:04.454532 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:04.95451888 +0000 UTC m=+143.765734999 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.496097 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-s87dr" event={"ID":"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68","Type":"ContainerStarted","Data":"b700e027309894d7611a6b1df2a865acb52645725c69a51680fc9c4970925830"} Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.500452 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-s87dr" Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.511900 4904 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-s87dr container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.39:6443/healthz\": dial tcp 10.217.0.39:6443: connect: connection refused" start-of-body= Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.511954 4904 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-s87dr" podUID="65a40f5b-e76f-4c3b-a50a-ef1641ea6a68" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.39:6443/healthz\": dial tcp 10.217.0.39:6443: connect: connection refused" Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.529288 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-qmfqn" event={"ID":"95df99ca-9a0a-465c-96b6-968c0a01235a","Type":"ContainerStarted","Data":"3e3154462a241ed95ecc9dbb4e04b7d7cd4b56c3cbdd19383a96cfe301ecb37b"} Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.532214 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jz8jl" event={"ID":"9b9f12dc-0021-4479-9b7e-60da4e3f27b0","Type":"ContainerStarted","Data":"6bed346745bb8ef006a43dcffcffd0aa670efb62542cb0cc150895d9f2e5d7fc"} Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.537347 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-82bgg" event={"ID":"d8f5a146-90e5-47e2-a639-78a09eb00231","Type":"ContainerStarted","Data":"7f8141994963d9b94345c55dbd045be2ef544d0928b51fb3dd7567ff4c83a4f6"} Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.561112 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:14:04 crc kubenswrapper[4904]: E1205 20:14:04.562598 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:14:05.062567935 +0000 UTC m=+143.873784044 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.565295 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-cncqb" event={"ID":"834a438d-9f7d-4707-b790-8ce136081f7c","Type":"ContainerStarted","Data":"acb3556fb62ff946585e1995e3c8f63725adda35d0a265de073d8873325f8203"} Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.565842 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-cncqb" Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.567836 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-672dp" event={"ID":"a70ee695-4cd0-4ad2-926f-4850e19e480f","Type":"ContainerStarted","Data":"ee642d168f4ac4ca4eb06ffebaffb7c49968e66c637c94a80e78ead6d5e43225"} Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.569150 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-672dp" Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.569209 4904 patch_prober.go:28] interesting pod/downloads-7954f5f757-672dp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.569236 4904 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-672dp" podUID="a70ee695-4cd0-4ad2-926f-4850e19e480f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.583187 4904 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-cncqb container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.583252 4904 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-cncqb" podUID="834a438d-9f7d-4707-b790-8ce136081f7c" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.584488 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-s87dr" podStartSLOduration=122.584473497 podStartE2EDuration="2m2.584473497s" podCreationTimestamp="2025-12-05 20:12:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:14:04.526532417 +0000 UTC m=+143.337748536" watchObservedRunningTime="2025-12-05 20:14:04.584473497 +0000 UTC m=+143.395689596" Dec 05 20:14:04 
crc kubenswrapper[4904]: I1205 20:14:04.585046 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jz8jl" podStartSLOduration=123.585041693 podStartE2EDuration="2m3.585041693s" podCreationTimestamp="2025-12-05 20:12:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:14:04.582400227 +0000 UTC m=+143.393616336" watchObservedRunningTime="2025-12-05 20:14:04.585041693 +0000 UTC m=+143.396257802" Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.618089 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-82bgg" podStartSLOduration=122.618072236 podStartE2EDuration="2m2.618072236s" podCreationTimestamp="2025-12-05 20:12:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:14:04.616765257 +0000 UTC m=+143.427981376" watchObservedRunningTime="2025-12-05 20:14:04.618072236 +0000 UTC m=+143.429288345" Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.621810 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bvmh2" event={"ID":"c6c9df4a-394c-4a6c-9132-5fefe0ed672d","Type":"ContainerStarted","Data":"14c17b4e32774ca53aa5ac05d12dc7c67d5624dd68ed0658c5c952ca3d59c0e1"} Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.628393 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-jw92f" event={"ID":"cd549d80-f654-4dd1-888c-a0b02d0e3afc","Type":"ContainerStarted","Data":"31fcb611a77c7d68d8323ad09d41ec4a812518017e653d06963d1344616b3f53"} Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.630471 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fzxff" event={"ID":"67c5324d-bb68-4bbc-a0f6-e0452d6f4155","Type":"ContainerStarted","Data":"31f677836485125a92f66d5ceeb3104a32ce4fd31e1cd929acad9c20ef84ecf9"} Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.630498 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fzxff" event={"ID":"67c5324d-bb68-4bbc-a0f6-e0452d6f4155","Type":"ContainerStarted","Data":"3884ca075e8b397da2de3b8dcdc7b8e95bdafef9e211ede4822b48f54650aa1d"} Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.632015 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-479fl" event={"ID":"c0a3c4fb-7c70-437e-8d05-7104879b59c9","Type":"ContainerStarted","Data":"a549cc05ed25b5a9665d36b0e0395434ce085cd1cbc0ae69f0d1f98f0b514752"} Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.653211 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-cncqb" podStartSLOduration=123.653194297 podStartE2EDuration="2m3.653194297s" podCreationTimestamp="2025-12-05 20:12:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:14:04.652364674 +0000 UTC m=+143.463580803" watchObservedRunningTime="2025-12-05 20:14:04.653194297 +0000 UTC m=+143.464410406" Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.658326 4904 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-zq5jf" event={"ID":"6b6ea474-55bf-4540-a7b4-cdf769dd5d23","Type":"ContainerStarted","Data":"b223dd6ccd788580fe099dc58dd2e4bd098d49b018550dd49851257b6d7b298a"} Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.662578 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:04 crc kubenswrapper[4904]: E1205 20:14:04.663778 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:05.163764022 +0000 UTC m=+143.974980121 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.700238 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cjzcx" event={"ID":"d5093d22-5bb5-4ec0-9b73-17d82af4afd7","Type":"ContainerStarted","Data":"105d7f8d397af78a351a462df4d10a5df48973bd75d80a6d36e5812c2612928c"} Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.701213 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cjzcx" Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.733782 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-484jg" event={"ID":"214b71b9-85d3-4be2-be71-311e9097f89b","Type":"ContainerStarted","Data":"1377154574b8dce0d9b32d6941bacc7251be020634c1b999b99a6c7c704bfac5"} Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.763412 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:14:04 crc kubenswrapper[4904]: E1205 20:14:04.763566 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:14:05.263542279 +0000 UTC m=+144.074758398 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.763965 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.764023 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jg5fm" event={"ID":"f11bb19b-830b-4041-88ad-72df248ff8d1","Type":"ContainerStarted","Data":"53241142ffc6221ee1fe8d218b6cf317b184d00375d26685aed74c34097ddc39"} Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.764881 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jg5fm" Dec 05 20:14:04 crc kubenswrapper[4904]: E1205 20:14:04.765110 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:05.265093833 +0000 UTC m=+144.076309992 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.767333 4904 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-jg5fm container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" start-of-body= Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.767386 4904 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jg5fm" podUID="f11bb19b-830b-4041-88ad-72df248ff8d1" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.779909 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-672dp" podStartSLOduration=123.779890379 podStartE2EDuration="2m3.779890379s" podCreationTimestamp="2025-12-05 20:12:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:14:04.732517795 +0000 UTC m=+143.543733914" watchObservedRunningTime="2025-12-05 20:14:04.779890379 +0000 UTC m=+143.591106488" Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.781344 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-zq5jf" podStartSLOduration=122.781337751 podStartE2EDuration="2m2.781337751s" podCreationTimestamp="2025-12-05 20:12:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:14:04.779298602 +0000 UTC m=+143.590514731" watchObservedRunningTime="2025-12-05 20:14:04.781337751 +0000 UTC m=+143.592553860" Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.797301 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-hpnlq" event={"ID":"4702bfcc-3b70-4600-8e4f-9137016423a6","Type":"ContainerStarted","Data":"5facefe981b2a0cf16e2fe07b345c761ae0f7a21a2cc4b8b01185658c0d5c818"} Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.810466 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sbdkl" event={"ID":"9c314cca-9c67-4653-aa53-cdef478fabc2","Type":"ContainerStarted","Data":"dd19065e18522b5ea6c2ac6af27f1ba798f0072291c5c6c412c20b1f6a1f81ae"} Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.820606 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wszp4" event={"ID":"107b25d3-8a8d-4e82-9a93-d81251e8ff8d","Type":"ContainerStarted","Data":"508023faaa38ef68f44a025024efcf303eb678660b79749288879208d2cde1c8"} Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.829646 4904 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jnrj8" event={"ID":"6bd0c7fa-69da-45b6-8008-8fc080ca4ba6","Type":"ContainerStarted","Data":"49cbae44e6e319a2756cf203dde68dde061c284c722188543103ca83baaabb38"} Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.845881 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d5xch" event={"ID":"d7b6266d-9700-4622-9878-86ccda978b95","Type":"ContainerStarted","Data":"ae25e9e84c8e7bc22a8419f259d4bea7c1d6aac2a1342e144cdf6159d36798f4"} Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.861506 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-479fl" podStartSLOduration=122.861488442 podStartE2EDuration="2m2.861488442s" podCreationTimestamp="2025-12-05 20:12:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:14:04.854992634 +0000 UTC m=+143.666208743" watchObservedRunningTime="2025-12-05 20:14:04.861488442 +0000 UTC m=+143.672704551" Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.867466 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:14:04 crc kubenswrapper[4904]: E1205 20:14:04.868443 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:14:05.368427922 +0000 UTC m=+144.179644041 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.894226 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9rm4d" event={"ID":"9032a5cf-b023-4e91-bc39-52fdd93472ca","Type":"ContainerStarted","Data":"1423f9ee0970af1e5556bc2776909d3df1c81ecd38b76bd78590dfad4270d1d6"} Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.898171 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-jw92f" podStartSLOduration=122.898149039 podStartE2EDuration="2m2.898149039s" podCreationTimestamp="2025-12-05 20:12:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:14:04.897366966 +0000 UTC m=+143.708583075" watchObservedRunningTime="2025-12-05 20:14:04.898149039 +0000 UTC m=+143.709365148" Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.899249 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-brbpq" event={"ID":"0b51cbb6-b3e4-454b-a58e-086eca25cb7d","Type":"ContainerStarted","Data":"762477d2035c609e61b5d52d3d3f19994271d5d27c3c504e6aec78563810d561"} Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.907486 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zdlcn" event={"ID":"64993a4f-0592-4a1e-93e5-def54e2868ac","Type":"ContainerStarted","Data":"f13aa9cf80f1e4915893e3b4877b1613b0f28c3398f944e08e1b79dae2e75a2a"} Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.911454 4904 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-5srbq container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/healthz\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.911508 4904 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-5srbq" podUID="6aa0efa0-bd09-4388-b42c-11550e28712e" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.21:8080/healthz\": dial tcp 10.217.0.21:8080: connect: connection refused" Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.933721 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cjzcx" podStartSLOduration=122.933703603 podStartE2EDuration="2m2.933703603s" podCreationTimestamp="2025-12-05 20:12:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:14:04.932464278 +0000 UTC m=+143.743680387" watchObservedRunningTime="2025-12-05 20:14:04.933703603 +0000 UTC m=+143.744919712" Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.940367 4904 patch_prober.go:28] interesting 
pod/router-default-5444994796-tkgxl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 20:14:04 crc kubenswrapper[4904]: [-]has-synced failed: reason withheld Dec 05 20:14:04 crc kubenswrapper[4904]: [+]process-running ok Dec 05 20:14:04 crc kubenswrapper[4904]: healthz check failed Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.940421 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tkgxl" podUID="4589bc8e-2864-4d7e-a049-a3bf264bb997" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 20:14:04 crc kubenswrapper[4904]: E1205 20:14:04.971650 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:05.471627767 +0000 UTC m=+144.282843876 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.983237 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fzxff" podStartSLOduration=122.98319843 podStartE2EDuration="2m2.98319843s" podCreationTimestamp="2025-12-05 20:12:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:14:04.98214272 +0000 UTC m=+143.793358839" watchObservedRunningTime="2025-12-05 20:14:04.98319843 +0000 UTC m=+143.794414539" Dec 05 20:14:04 crc kubenswrapper[4904]: I1205 20:14:04.987907 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:05 crc kubenswrapper[4904]: I1205 20:14:05.020564 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9rm4d" podStartSLOduration=124.020545926 podStartE2EDuration="2m4.020545926s" podCreationTimestamp="2025-12-05 20:12:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:14:05.015086379 +0000 UTC m=+143.826302498" watchObservedRunningTime="2025-12-05 20:14:05.020545926 +0000 UTC m=+143.831762035" Dec 05 20:14:05 crc kubenswrapper[4904]: I1205 20:14:05.073671 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jg5fm" podStartSLOduration=123.073645817 podStartE2EDuration="2m3.073645817s" podCreationTimestamp="2025-12-05 20:12:02 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:14:05.055513105 +0000 UTC m=+143.866729214" watchObservedRunningTime="2025-12-05 20:14:05.073645817 +0000 UTC m=+143.884861926" Dec 05 20:14:05 crc kubenswrapper[4904]: I1205 20:14:05.096718 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:14:05 crc kubenswrapper[4904]: E1205 20:14:05.097025 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:14:05.597009361 +0000 UTC m=+144.408225470 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:05 crc kubenswrapper[4904]: I1205 20:14:05.147820 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-484jg" podStartSLOduration=124.147804145 podStartE2EDuration="2m4.147804145s" podCreationTimestamp="2025-12-05 20:12:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:14:05.111407216 +0000 UTC m=+143.922623335" watchObservedRunningTime="2025-12-05 20:14:05.147804145 +0000 UTC m=+143.959020254" Dec 05 20:14:05 crc kubenswrapper[4904]: I1205 20:14:05.200452 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:05 crc kubenswrapper[4904]: E1205 20:14:05.200753 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:05.700741611 +0000 UTC m=+144.511957720 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:05 crc kubenswrapper[4904]: I1205 20:14:05.234613 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-hpnlq" podStartSLOduration=123.234597706 podStartE2EDuration="2m3.234597706s" podCreationTimestamp="2025-12-05 20:12:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:14:05.148300939 +0000 UTC m=+143.959517068" watchObservedRunningTime="2025-12-05 20:14:05.234597706 +0000 UTC m=+144.045813815" Dec 05 20:14:05 crc kubenswrapper[4904]: I1205 20:14:05.278268 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jnrj8" podStartSLOduration=123.278251575 podStartE2EDuration="2m3.278251575s" podCreationTimestamp="2025-12-05 20:12:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:14:05.234966887 +0000 UTC m=+144.046182996" watchObservedRunningTime="2025-12-05 20:14:05.278251575 +0000 UTC m=+144.089467684" Dec 05 20:14:05 crc kubenswrapper[4904]: I1205 20:14:05.309461 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:14:05 crc kubenswrapper[4904]: E1205 20:14:05.309611 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:14:05.809583178 +0000 UTC m=+144.620799287 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:05 crc kubenswrapper[4904]: I1205 20:14:05.309857 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:05 crc kubenswrapper[4904]: E1205 20:14:05.310124 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:05.810117584 +0000 UTC m=+144.621333693 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:05 crc kubenswrapper[4904]: I1205 20:14:05.341151 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wszp4" podStartSLOduration=123.341130808 podStartE2EDuration="2m3.341130808s" podCreationTimestamp="2025-12-05 20:12:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:14:05.277193104 +0000 UTC m=+144.088409223" watchObservedRunningTime="2025-12-05 20:14:05.341130808 +0000 UTC m=+144.152346927" Dec 05 20:14:05 crc kubenswrapper[4904]: I1205 20:14:05.341861 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d5xch" podStartSLOduration=123.341852658 podStartE2EDuration="2m3.341852658s" podCreationTimestamp="2025-12-05 20:12:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:14:05.33913532 +0000 UTC m=+144.150351449" watchObservedRunningTime="2025-12-05 20:14:05.341852658 +0000 UTC m=+144.153069007" Dec 05 20:14:05 crc kubenswrapper[4904]: I1205 20:14:05.383336 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zdlcn" podStartSLOduration=123.383315274 podStartE2EDuration="2m3.383315274s" podCreationTimestamp="2025-12-05 20:12:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:14:05.381655985 +0000 UTC m=+144.192872104" watchObservedRunningTime="2025-12-05 20:14:05.383315274 +0000 UTC m=+144.194531393" Dec 05 
20:14:05 crc kubenswrapper[4904]: I1205 20:14:05.411376 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:14:05 crc kubenswrapper[4904]: E1205 20:14:05.411856 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:14:05.911839656 +0000 UTC m=+144.723055755 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:05 crc kubenswrapper[4904]: I1205 20:14:05.453308 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-brbpq" podStartSLOduration=8.453287971 podStartE2EDuration="8.453287971s" podCreationTimestamp="2025-12-05 20:13:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:14:05.419536678 +0000 UTC m=+144.230752787" watchObservedRunningTime="2025-12-05 20:14:05.453287971 +0000 UTC m=+144.264504080" Dec 05 20:14:05 crc kubenswrapper[4904]: I1205 20:14:05.516528 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:05 crc kubenswrapper[4904]: E1205 20:14:05.516913 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:06.016897224 +0000 UTC m=+144.828113343 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:05 crc kubenswrapper[4904]: I1205 20:14:05.617744 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:14:05 crc kubenswrapper[4904]: E1205 20:14:05.618107 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:14:06.118093921 +0000 UTC m=+144.929310030 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:05 crc kubenswrapper[4904]: I1205 20:14:05.719634 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:05 crc kubenswrapper[4904]: E1205 20:14:05.720006 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:06.219987078 +0000 UTC m=+145.031203187 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:05 crc kubenswrapper[4904]: I1205 20:14:05.813637 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-sbrpp" Dec 05 20:14:05 crc kubenswrapper[4904]: I1205 20:14:05.820788 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:14:05 crc kubenswrapper[4904]: E1205 20:14:05.820990 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:14:06.320960879 +0000 UTC m=+145.132176998 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:05 crc kubenswrapper[4904]: I1205 20:14:05.821168 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:05 crc kubenswrapper[4904]: E1205 20:14:05.821529 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:06.321514884 +0000 UTC m=+145.132731053 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:05 crc kubenswrapper[4904]: I1205 20:14:05.922599 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:14:05 crc kubenswrapper[4904]: E1205 20:14:05.923050 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:14:06.42303423 +0000 UTC m=+145.234250339 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:05 crc kubenswrapper[4904]: I1205 20:14:05.928237 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-qmfqn" event={"ID":"95df99ca-9a0a-465c-96b6-968c0a01235a","Type":"ContainerStarted","Data":"fc0338da6548802278521b6beba1b7514b193ba53202303981562f4566db1946"} Dec 05 20:14:05 crc kubenswrapper[4904]: I1205 20:14:05.930172 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ptsz7" event={"ID":"fe343c2a-87f6-45d4-a91d-3f86b9b5029b","Type":"ContainerStarted","Data":"41877ba58647898a92bfdc018fe1c7b409b5d053ae9b81e8f9fcd596c97e539e"} Dec 05 20:14:05 crc kubenswrapper[4904]: I1205 20:14:05.931201 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-l928g" event={"ID":"11ca5edb-7664-4e63-a9e8-46f270623ad2","Type":"ContainerStarted","Data":"8b6b32cd140c5bdf462ba0c58086c0735e40839306c139c82169d71a5faa38d8"} Dec 05 20:14:05 crc kubenswrapper[4904]: I1205 20:14:05.932705 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cjzcx" event={"ID":"d5093d22-5bb5-4ec0-9b73-17d82af4afd7","Type":"ContainerStarted","Data":"a9b0fb62edc9fbf587de6bdfb10d8fa8cf987faf1fb0066092eb166b40bb713c"} Dec 05 20:14:05 crc kubenswrapper[4904]: I1205 20:14:05.938171 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-z6nb9" event={"ID":"a9a77bf3-5405-4051-abed-850371e2cfcc","Type":"ContainerStarted","Data":"a4e6b98b186322e602d616f44b557e6847ae4d2fed4a9810a7119afee84a27b8"} Dec 05 20:14:05 crc kubenswrapper[4904]: I1205 20:14:05.938233 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-z6nb9" 
event={"ID":"a9a77bf3-5405-4051-abed-850371e2cfcc","Type":"ContainerStarted","Data":"ea56476de3c3884aec55c1828741ab37f169588cdf6b7130e213266ae6d82bb9"} Dec 05 20:14:05 crc kubenswrapper[4904]: I1205 20:14:05.939409 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8x2kf" event={"ID":"d61a48cd-8429-43a1-a7ef-50d57f4b397f","Type":"ContainerStarted","Data":"4922846e6168e748d80fd95d5e4b88983fd68cb0b2362d97b6c9f7d6999e7dc8"} Dec 05 20:14:05 crc kubenswrapper[4904]: I1205 20:14:05.941271 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d5xch" event={"ID":"d7b6266d-9700-4622-9878-86ccda978b95","Type":"ContainerStarted","Data":"a0a016e6bd08b3822481dcf38e228bc06bb1ad2019e9d93fb0d506a2ac8e9675"} Dec 05 20:14:05 crc kubenswrapper[4904]: I1205 20:14:05.944022 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vw7kx" event={"ID":"1d4d3dc1-19e8-4648-907c-ede5dd5e107e","Type":"ContainerStarted","Data":"73321ade5b18281cc09c5d5d66f916c00a8c70af7241592fd8b147d93d5dd79e"} Dec 05 20:14:05 crc kubenswrapper[4904]: I1205 20:14:05.959430 4904 patch_prober.go:28] interesting pod/router-default-5444994796-tkgxl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 20:14:05 crc kubenswrapper[4904]: [-]has-synced failed: reason withheld Dec 05 20:14:05 crc kubenswrapper[4904]: [+]process-running ok Dec 05 20:14:05 crc kubenswrapper[4904]: healthz check failed Dec 05 20:14:05 crc kubenswrapper[4904]: I1205 20:14:05.959490 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tkgxl" podUID="4589bc8e-2864-4d7e-a049-a3bf264bb997" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 20:14:05 crc kubenswrapper[4904]: I1205 20:14:05.959822 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bvmh2" event={"ID":"c6c9df4a-394c-4a6c-9132-5fefe0ed672d","Type":"ContainerStarted","Data":"529070e2d270d0110d29b6381d767dd38b28f102c2d754135726d26315e00276"} Dec 05 20:14:05 crc kubenswrapper[4904]: I1205 20:14:05.961827 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-c67m2" event={"ID":"86e5e62d-13a5-42bf-91ad-759b8be371f8","Type":"ContainerStarted","Data":"653f3ee990f51ae0b3f96fb0e8780c35d0f48c6cf50df58e5a7ff588c05ac1ef"} Dec 05 20:14:05 crc kubenswrapper[4904]: I1205 20:14:05.963680 4904 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-5srbq container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/healthz\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Dec 05 20:14:05 crc kubenswrapper[4904]: I1205 20:14:05.963731 4904 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-5srbq" podUID="6aa0efa0-bd09-4388-b42c-11550e28712e" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.21:8080/healthz\": dial tcp 10.217.0.21:8080: connect: connection refused" Dec 05 20:14:05 crc kubenswrapper[4904]: I1205 20:14:05.963738 4904 patch_prober.go:28] interesting pod/downloads-7954f5f757-672dp 
container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Dec 05 20:14:05 crc kubenswrapper[4904]: I1205 20:14:05.963780 4904 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-672dp" podUID="a70ee695-4cd0-4ad2-926f-4850e19e480f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" Dec 05 20:14:05 crc kubenswrapper[4904]: I1205 20:14:05.964720 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zdlcn" Dec 05 20:14:05 crc kubenswrapper[4904]: I1205 20:14:05.994207 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-cncqb" Dec 05 20:14:06 crc kubenswrapper[4904]: I1205 20:14:06.034566 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:06 crc kubenswrapper[4904]: E1205 20:14:06.036120 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:06.53610697 +0000 UTC m=+145.347323079 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:06 crc kubenswrapper[4904]: I1205 20:14:06.089749 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-qmfqn" podStartSLOduration=124.089732766 podStartE2EDuration="2m4.089732766s" podCreationTimestamp="2025-12-05 20:12:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:14:06.088028417 +0000 UTC m=+144.899244536" watchObservedRunningTime="2025-12-05 20:14:06.089732766 +0000 UTC m=+144.900948875" Dec 05 20:14:06 crc kubenswrapper[4904]: I1205 20:14:06.097687 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jg5fm" Dec 05 20:14:06 crc kubenswrapper[4904]: I1205 20:14:06.139142 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:14:06 crc kubenswrapper[4904]: E1205 20:14:06.139256 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:14:06.639231003 +0000 UTC m=+145.450447122 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:06 crc kubenswrapper[4904]: I1205 20:14:06.139371 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:06 crc kubenswrapper[4904]: E1205 20:14:06.139908 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:06.639889352 +0000 UTC m=+145.451105461 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:06 crc kubenswrapper[4904]: I1205 20:14:06.242827 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:14:06 crc kubenswrapper[4904]: E1205 20:14:06.243719 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:14:06.743700464 +0000 UTC m=+145.554916573 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:06 crc kubenswrapper[4904]: I1205 20:14:06.318771 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vw7kx" podStartSLOduration=124.318756298 podStartE2EDuration="2m4.318756298s" podCreationTimestamp="2025-12-05 20:12:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:14:06.308542563 +0000 UTC m=+145.119758682" watchObservedRunningTime="2025-12-05 20:14:06.318756298 +0000 UTC m=+145.129972407" Dec 05 20:14:06 crc kubenswrapper[4904]: I1205 20:14:06.362940 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:06 crc kubenswrapper[4904]: E1205 20:14:06.363353 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:06.863340143 +0000 UTC m=+145.674556252 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:06 crc kubenswrapper[4904]: I1205 20:14:06.447228 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-z6nb9" Dec 05 20:14:06 crc kubenswrapper[4904]: I1205 20:14:06.464034 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:14:06 crc kubenswrapper[4904]: E1205 20:14:06.464236 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:14:06.964204351 +0000 UTC m=+145.775420460 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:06 crc kubenswrapper[4904]: I1205 20:14:06.464603 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:06 crc kubenswrapper[4904]: E1205 20:14:06.464979 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:06.964971133 +0000 UTC m=+145.776187242 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:06 crc kubenswrapper[4904]: I1205 20:14:06.566263 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:14:06 crc kubenswrapper[4904]: E1205 20:14:06.566418 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:14:07.066391006 +0000 UTC m=+145.877607115 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:06 crc kubenswrapper[4904]: I1205 20:14:06.566523 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:06 crc kubenswrapper[4904]: E1205 20:14:06.566934 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:07.066920531 +0000 UTC m=+145.878136640 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:06 crc kubenswrapper[4904]: I1205 20:14:06.667967 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:14:06 crc kubenswrapper[4904]: E1205 20:14:06.668350 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:14:07.168336535 +0000 UTC m=+145.979552644 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:06 crc kubenswrapper[4904]: I1205 20:14:06.769830 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:06 crc kubenswrapper[4904]: E1205 20:14:06.770329 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:07.270314194 +0000 UTC m=+146.081530303 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:06 crc kubenswrapper[4904]: I1205 20:14:06.786212 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-c67m2" podStartSLOduration=9.786191262 podStartE2EDuration="9.786191262s" podCreationTimestamp="2025-12-05 20:13:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:14:06.610309772 +0000 UTC m=+145.421525881" watchObservedRunningTime="2025-12-05 20:14:06.786191262 +0000 UTC m=+145.597407371" Dec 05 20:14:06 crc kubenswrapper[4904]: I1205 20:14:06.872621 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:14:06 crc kubenswrapper[4904]: E1205 20:14:06.872873 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:14:07.37284345 +0000 UTC m=+146.184059559 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:06 crc kubenswrapper[4904]: I1205 20:14:06.873162 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:06 crc kubenswrapper[4904]: E1205 20:14:06.873477 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:07.373465978 +0000 UTC m=+146.184682087 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:06 crc kubenswrapper[4904]: I1205 20:14:06.892559 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8x2kf" podStartSLOduration=124.892542458 podStartE2EDuration="2m4.892542458s" podCreationTimestamp="2025-12-05 20:12:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:14:06.788147328 +0000 UTC m=+145.599363437" watchObservedRunningTime="2025-12-05 20:14:06.892542458 +0000 UTC m=+145.703758567" Dec 05 20:14:06 crc kubenswrapper[4904]: I1205 20:14:06.949685 4904 patch_prober.go:28] interesting pod/router-default-5444994796-tkgxl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 20:14:06 crc kubenswrapper[4904]: [-]has-synced failed: reason withheld Dec 05 20:14:06 crc kubenswrapper[4904]: [+]process-running ok Dec 05 20:14:06 crc kubenswrapper[4904]: healthz check failed Dec 05 20:14:06 crc kubenswrapper[4904]: I1205 20:14:06.949772 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tkgxl" podUID="4589bc8e-2864-4d7e-a049-a3bf264bb997" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 20:14:06 crc kubenswrapper[4904]: I1205 20:14:06.960048 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-l928g" podStartSLOduration=124.960019783 podStartE2EDuration="2m4.960019783s" podCreationTimestamp="2025-12-05 20:12:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:14:06.924508599 +0000 UTC m=+145.735724708" watchObservedRunningTime="2025-12-05 20:14:06.960019783 +0000 UTC m=+145.771235882" Dec 05 20:14:06 crc kubenswrapper[4904]: I1205 20:14:06.963423 4904 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-zdlcn container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 05 20:14:06 crc kubenswrapper[4904]: I1205 20:14:06.963490 4904 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zdlcn" podUID="64993a4f-0592-4a1e-93e5-def54e2868ac" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 20:14:06 crc kubenswrapper[4904]: I1205 20:14:06.963580 4904 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-s87dr container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get 
\"https://10.217.0.39:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 05 20:14:06 crc kubenswrapper[4904]: I1205 20:14:06.963599 4904 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-s87dr" podUID="65a40f5b-e76f-4c3b-a50a-ef1641ea6a68" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.39:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 05 20:14:06 crc kubenswrapper[4904]: I1205 20:14:06.973900 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:14:06 crc kubenswrapper[4904]: E1205 20:14:06.974279 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:14:07.474261463 +0000 UTC m=+146.285477572 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:07 crc kubenswrapper[4904]: I1205 20:14:07.016820 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-bvmh2" podStartSLOduration=125.016800139 podStartE2EDuration="2m5.016800139s" podCreationTimestamp="2025-12-05 20:12:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:14:06.961553716 +0000 UTC m=+145.772769825" watchObservedRunningTime="2025-12-05 20:14:07.016800139 +0000 UTC m=+145.828016248" Dec 05 20:14:07 crc kubenswrapper[4904]: I1205 20:14:07.018209 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-z6nb9" podStartSLOduration=10.018204949 podStartE2EDuration="10.018204949s" podCreationTimestamp="2025-12-05 20:13:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:14:07.015728799 +0000 UTC m=+145.826944908" watchObservedRunningTime="2025-12-05 20:14:07.018204949 +0000 UTC m=+145.829421058" Dec 05 20:14:07 crc kubenswrapper[4904]: I1205 20:14:07.079653 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:07 crc kubenswrapper[4904]: E1205 20:14:07.080194 4904 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:07.580175396 +0000 UTC m=+146.391391505 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:07 crc kubenswrapper[4904]: I1205 20:14:07.115710 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ptsz7" event={"ID":"fe343c2a-87f6-45d4-a91d-3f86b9b5029b","Type":"ContainerStarted","Data":"dd2967f68b2f39191ad062c46f1f1cb2e214b8e2eac59995db73daedec8aab8b"} Dec 05 20:14:07 crc kubenswrapper[4904]: I1205 20:14:07.115756 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ptsz7" event={"ID":"fe343c2a-87f6-45d4-a91d-3f86b9b5029b","Type":"ContainerStarted","Data":"a566fdbf4e6f95f00e65d317db02fc9db1532a586b89df9966e201f719662018"} Dec 05 20:14:07 crc kubenswrapper[4904]: I1205 20:14:07.121295 4904 patch_prober.go:28] interesting pod/downloads-7954f5f757-672dp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Dec 05 20:14:07 crc kubenswrapper[4904]: I1205 20:14:07.121346 4904 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-672dp" podUID="a70ee695-4cd0-4ad2-926f-4850e19e480f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" Dec 05 20:14:07 crc kubenswrapper[4904]: I1205 20:14:07.130249 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zdlcn" Dec 05 20:14:07 crc kubenswrapper[4904]: I1205 20:14:07.181434 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:14:07 crc kubenswrapper[4904]: E1205 20:14:07.181992 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:14:07.68195104 +0000 UTC m=+146.493167149 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:07 crc kubenswrapper[4904]: I1205 20:14:07.287613 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:07 crc kubenswrapper[4904]: E1205 20:14:07.288887 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:07.788867461 +0000 UTC m=+146.600083660 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:07 crc kubenswrapper[4904]: I1205 20:14:07.297933 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-s87dr" Dec 05 20:14:07 crc kubenswrapper[4904]: I1205 20:14:07.388633 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:14:07 crc kubenswrapper[4904]: E1205 20:14:07.388930 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:14:07.888916226 +0000 UTC m=+146.700132335 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:07 crc kubenswrapper[4904]: I1205 20:14:07.490014 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:07 crc kubenswrapper[4904]: E1205 20:14:07.490503 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:07.990486533 +0000 UTC m=+146.801702642 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:07 crc kubenswrapper[4904]: I1205 20:14:07.592390 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:14:07 crc kubenswrapper[4904]: E1205 20:14:07.592587 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:14:08.092554825 +0000 UTC m=+146.903770944 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:07 crc kubenswrapper[4904]: I1205 20:14:07.592767 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:07 crc kubenswrapper[4904]: E1205 20:14:07.593108 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:08.093097981 +0000 UTC m=+146.904314100 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:07 crc kubenswrapper[4904]: I1205 20:14:07.692601 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gpt4n"] Dec 05 20:14:07 crc kubenswrapper[4904]: I1205 20:14:07.693596 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:14:07 crc kubenswrapper[4904]: E1205 20:14:07.693789 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:14:08.193756113 +0000 UTC m=+147.004972222 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:07 crc kubenswrapper[4904]: I1205 20:14:07.693949 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:07 crc kubenswrapper[4904]: I1205 20:14:07.694119 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gpt4n" Dec 05 20:14:07 crc kubenswrapper[4904]: E1205 20:14:07.694364 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:08.19435581 +0000 UTC m=+147.005571919 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:07 crc kubenswrapper[4904]: W1205 20:14:07.696806 4904 reflector.go:561] object-"openshift-marketplace"/"community-operators-dockercfg-dmngl": failed to list *v1.Secret: secrets "community-operators-dockercfg-dmngl" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-marketplace": no relationship found between node 'crc' and this object Dec 05 20:14:07 crc kubenswrapper[4904]: E1205 20:14:07.696844 4904 reflector.go:158] "Unhandled Error" err="object-\"openshift-marketplace\"/\"community-operators-dockercfg-dmngl\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"community-operators-dockercfg-dmngl\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-marketplace\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 05 20:14:07 crc kubenswrapper[4904]: I1205 20:14:07.717575 4904 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 05 20:14:07 crc kubenswrapper[4904]: I1205 20:14:07.742487 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gpt4n"] Dec 05 20:14:07 crc kubenswrapper[4904]: I1205 20:14:07.795348 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:14:07 crc kubenswrapper[4904]: E1205 20:14:07.795573 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:14:08.295541317 +0000 UTC m=+147.106757436 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:07 crc kubenswrapper[4904]: I1205 20:14:07.795647 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8zpj\" (UniqueName: \"kubernetes.io/projected/d5752678-2a85-49a8-b6d4-63b4adb96277-kube-api-access-g8zpj\") pod \"community-operators-gpt4n\" (UID: \"d5752678-2a85-49a8-b6d4-63b4adb96277\") " pod="openshift-marketplace/community-operators-gpt4n" Dec 05 20:14:07 crc kubenswrapper[4904]: I1205 20:14:07.795849 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:07 crc kubenswrapper[4904]: I1205 20:14:07.795959 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5752678-2a85-49a8-b6d4-63b4adb96277-utilities\") pod \"community-operators-gpt4n\" (UID: \"d5752678-2a85-49a8-b6d4-63b4adb96277\") " pod="openshift-marketplace/community-operators-gpt4n" Dec 05 20:14:07 crc kubenswrapper[4904]: I1205 20:14:07.796043 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5752678-2a85-49a8-b6d4-63b4adb96277-catalog-content\") pod \"community-operators-gpt4n\" (UID: \"d5752678-2a85-49a8-b6d4-63b4adb96277\") " pod="openshift-marketplace/community-operators-gpt4n" Dec 05 20:14:07 crc kubenswrapper[4904]: E1205 20:14:07.796281 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:14:08.296264817 +0000 UTC m=+147.107480926 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hdk86" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:14:07 crc kubenswrapper[4904]: I1205 20:14:07.834391 4904 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-05T20:14:07.71759585Z","Handler":null,"Name":""} Dec 05 20:14:07 crc kubenswrapper[4904]: I1205 20:14:07.865090 4904 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 05 20:14:07 crc kubenswrapper[4904]: I1205 20:14:07.865127 4904 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 05 20:14:07 crc kubenswrapper[4904]: I1205 20:14:07.900261 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:14:07 crc kubenswrapper[4904]: I1205 20:14:07.900571 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5752678-2a85-49a8-b6d4-63b4adb96277-utilities\") pod \"community-operators-gpt4n\" (UID: \"d5752678-2a85-49a8-b6d4-63b4adb96277\") " pod="openshift-marketplace/community-operators-gpt4n" Dec 05 20:14:07 crc kubenswrapper[4904]: I1205 20:14:07.900633 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5752678-2a85-49a8-b6d4-63b4adb96277-catalog-content\") pod \"community-operators-gpt4n\" (UID: \"d5752678-2a85-49a8-b6d4-63b4adb96277\") " pod="openshift-marketplace/community-operators-gpt4n" Dec 05 20:14:07 crc kubenswrapper[4904]: I1205 20:14:07.900664 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8zpj\" (UniqueName: \"kubernetes.io/projected/d5752678-2a85-49a8-b6d4-63b4adb96277-kube-api-access-g8zpj\") pod \"community-operators-gpt4n\" (UID: \"d5752678-2a85-49a8-b6d4-63b4adb96277\") " pod="openshift-marketplace/community-operators-gpt4n" Dec 05 20:14:07 crc kubenswrapper[4904]: I1205 20:14:07.901540 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5752678-2a85-49a8-b6d4-63b4adb96277-catalog-content\") pod \"community-operators-gpt4n\" (UID: \"d5752678-2a85-49a8-b6d4-63b4adb96277\") " pod="openshift-marketplace/community-operators-gpt4n" Dec 05 20:14:07 crc kubenswrapper[4904]: I1205 20:14:07.901542 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5752678-2a85-49a8-b6d4-63b4adb96277-utilities\") pod \"community-operators-gpt4n\" (UID: \"d5752678-2a85-49a8-b6d4-63b4adb96277\") " 
pod="openshift-marketplace/community-operators-gpt4n" Dec 05 20:14:07 crc kubenswrapper[4904]: I1205 20:14:07.910570 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vjj47"] Dec 05 20:14:07 crc kubenswrapper[4904]: I1205 20:14:07.911697 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vjj47" Dec 05 20:14:07 crc kubenswrapper[4904]: I1205 20:14:07.922900 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 05 20:14:07 crc kubenswrapper[4904]: I1205 20:14:07.929759 4904 patch_prober.go:28] interesting pod/router-default-5444994796-tkgxl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 20:14:07 crc kubenswrapper[4904]: [-]has-synced failed: reason withheld Dec 05 20:14:07 crc kubenswrapper[4904]: [+]process-running ok Dec 05 20:14:07 crc kubenswrapper[4904]: healthz check failed Dec 05 20:14:07 crc kubenswrapper[4904]: I1205 20:14:07.929823 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tkgxl" podUID="4589bc8e-2864-4d7e-a049-a3bf264bb997" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 20:14:07 crc kubenswrapper[4904]: I1205 20:14:07.958132 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 05 20:14:07 crc kubenswrapper[4904]: I1205 20:14:07.977594 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8zpj\" (UniqueName: \"kubernetes.io/projected/d5752678-2a85-49a8-b6d4-63b4adb96277-kube-api-access-g8zpj\") pod \"community-operators-gpt4n\" (UID: \"d5752678-2a85-49a8-b6d4-63b4adb96277\") " pod="openshift-marketplace/community-operators-gpt4n" Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.001677 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pwcg\" (UniqueName: \"kubernetes.io/projected/9cff387b-9124-46db-8d06-5cb5839e0a12-kube-api-access-2pwcg\") pod \"certified-operators-vjj47\" (UID: \"9cff387b-9124-46db-8d06-5cb5839e0a12\") " pod="openshift-marketplace/certified-operators-vjj47" Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.001743 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.001777 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cff387b-9124-46db-8d06-5cb5839e0a12-utilities\") pod \"certified-operators-vjj47\" (UID: \"9cff387b-9124-46db-8d06-5cb5839e0a12\") " pod="openshift-marketplace/certified-operators-vjj47" Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.001804 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cff387b-9124-46db-8d06-5cb5839e0a12-catalog-content\") pod \"certified-operators-vjj47\" (UID: \"9cff387b-9124-46db-8d06-5cb5839e0a12\") " pod="openshift-marketplace/certified-operators-vjj47" Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.026948 4904 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.026984 4904 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.077169 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-plmk6"] Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.078461 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-plmk6" Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.080341 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vjj47"] Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.104705 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pwcg\" (UniqueName: \"kubernetes.io/projected/9cff387b-9124-46db-8d06-5cb5839e0a12-kube-api-access-2pwcg\") pod \"certified-operators-vjj47\" (UID: \"9cff387b-9124-46db-8d06-5cb5839e0a12\") " pod="openshift-marketplace/certified-operators-vjj47" Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.104796 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cff387b-9124-46db-8d06-5cb5839e0a12-utilities\") pod \"certified-operators-vjj47\" (UID: \"9cff387b-9124-46db-8d06-5cb5839e0a12\") " pod="openshift-marketplace/certified-operators-vjj47" Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.104821 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cff387b-9124-46db-8d06-5cb5839e0a12-catalog-content\") pod \"certified-operators-vjj47\" (UID: \"9cff387b-9124-46db-8d06-5cb5839e0a12\") " pod="openshift-marketplace/certified-operators-vjj47" Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.105283 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cff387b-9124-46db-8d06-5cb5839e0a12-utilities\") pod \"certified-operators-vjj47\" (UID: \"9cff387b-9124-46db-8d06-5cb5839e0a12\") " pod="openshift-marketplace/certified-operators-vjj47" Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.105359 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cff387b-9124-46db-8d06-5cb5839e0a12-catalog-content\") pod \"certified-operators-vjj47\" (UID: \"9cff387b-9124-46db-8d06-5cb5839e0a12\") " pod="openshift-marketplace/certified-operators-vjj47" Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.139967 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ptsz7" event={"ID":"fe343c2a-87f6-45d4-a91d-3f86b9b5029b","Type":"ContainerStarted","Data":"067c80010197678e677a2f301f0d9e2c8bae980bc3d9988379ac5001fe2b4e54"} Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.184973 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pwcg\" (UniqueName: \"kubernetes.io/projected/9cff387b-9124-46db-8d06-5cb5839e0a12-kube-api-access-2pwcg\") pod \"certified-operators-vjj47\" (UID: \"9cff387b-9124-46db-8d06-5cb5839e0a12\") " pod="openshift-marketplace/certified-operators-vjj47" Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.206433 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ef986b1-43e3-4c24-87d0-f385d1ae80bc-catalog-content\") pod \"community-operators-plmk6\" (UID: \"9ef986b1-43e3-4c24-87d0-f385d1ae80bc\") " pod="openshift-marketplace/community-operators-plmk6" Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.206490 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl7gj\" (UniqueName: 
\"kubernetes.io/projected/9ef986b1-43e3-4c24-87d0-f385d1ae80bc-kube-api-access-vl7gj\") pod \"community-operators-plmk6\" (UID: \"9ef986b1-43e3-4c24-87d0-f385d1ae80bc\") " pod="openshift-marketplace/community-operators-plmk6" Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.206602 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ef986b1-43e3-4c24-87d0-f385d1ae80bc-utilities\") pod \"community-operators-plmk6\" (UID: \"9ef986b1-43e3-4c24-87d0-f385d1ae80bc\") " pod="openshift-marketplace/community-operators-plmk6" Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.219631 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hdk86\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.221417 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-ptsz7" podStartSLOduration=11.221403922 podStartE2EDuration="11.221403922s" podCreationTimestamp="2025-12-05 20:13:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:14:08.220351512 +0000 UTC m=+147.031567631" watchObservedRunningTime="2025-12-05 20:14:08.221403922 +0000 UTC m=+147.032620041" Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.226079 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vjj47" Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.249832 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-g4tfw"] Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.261731 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-plmk6"] Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.261877 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g4tfw" Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.285836 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g4tfw"] Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.313090 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ef986b1-43e3-4c24-87d0-f385d1ae80bc-catalog-content\") pod \"community-operators-plmk6\" (UID: \"9ef986b1-43e3-4c24-87d0-f385d1ae80bc\") " pod="openshift-marketplace/community-operators-plmk6" Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.313160 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vl7gj\" (UniqueName: \"kubernetes.io/projected/9ef986b1-43e3-4c24-87d0-f385d1ae80bc-kube-api-access-vl7gj\") pod \"community-operators-plmk6\" (UID: \"9ef986b1-43e3-4c24-87d0-f385d1ae80bc\") " pod="openshift-marketplace/community-operators-plmk6" Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.313517 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ef986b1-43e3-4c24-87d0-f385d1ae80bc-utilities\") pod \"community-operators-plmk6\" (UID: \"9ef986b1-43e3-4c24-87d0-f385d1ae80bc\") " pod="openshift-marketplace/community-operators-plmk6" Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.315882 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ef986b1-43e3-4c24-87d0-f385d1ae80bc-catalog-content\") pod \"community-operators-plmk6\" (UID: \"9ef986b1-43e3-4c24-87d0-f385d1ae80bc\") " pod="openshift-marketplace/community-operators-plmk6" Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.317467 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ef986b1-43e3-4c24-87d0-f385d1ae80bc-utilities\") pod \"community-operators-plmk6\" (UID: \"9ef986b1-43e3-4c24-87d0-f385d1ae80bc\") " pod="openshift-marketplace/community-operators-plmk6" Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.367269 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl7gj\" (UniqueName: \"kubernetes.io/projected/9ef986b1-43e3-4c24-87d0-f385d1ae80bc-kube-api-access-vl7gj\") pod \"community-operators-plmk6\" (UID: \"9ef986b1-43e3-4c24-87d0-f385d1ae80bc\") " pod="openshift-marketplace/community-operators-plmk6" Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.393514 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.434838 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aaaf574-a341-4efb-beae-169d69ca32d4-catalog-content\") pod \"certified-operators-g4tfw\" (UID: \"3aaaf574-a341-4efb-beae-169d69ca32d4\") " pod="openshift-marketplace/certified-operators-g4tfw" Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.435252 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpjhg\" (UniqueName: \"kubernetes.io/projected/3aaaf574-a341-4efb-beae-169d69ca32d4-kube-api-access-wpjhg\") pod \"certified-operators-g4tfw\" (UID: \"3aaaf574-a341-4efb-beae-169d69ca32d4\") " pod="openshift-marketplace/certified-operators-g4tfw" Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.435377 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aaaf574-a341-4efb-beae-169d69ca32d4-utilities\") pod \"certified-operators-g4tfw\" (UID: \"3aaaf574-a341-4efb-beae-169d69ca32d4\") " pod="openshift-marketplace/certified-operators-g4tfw" Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.537450 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aaaf574-a341-4efb-beae-169d69ca32d4-catalog-content\") pod \"certified-operators-g4tfw\" (UID: \"3aaaf574-a341-4efb-beae-169d69ca32d4\") " pod="openshift-marketplace/certified-operators-g4tfw" Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.537888 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpjhg\" (UniqueName: \"kubernetes.io/projected/3aaaf574-a341-4efb-beae-169d69ca32d4-kube-api-access-wpjhg\") pod \"certified-operators-g4tfw\" (UID: \"3aaaf574-a341-4efb-beae-169d69ca32d4\") " pod="openshift-marketplace/certified-operators-g4tfw" Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.537928 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aaaf574-a341-4efb-beae-169d69ca32d4-utilities\") pod \"certified-operators-g4tfw\" (UID: \"3aaaf574-a341-4efb-beae-169d69ca32d4\") " pod="openshift-marketplace/certified-operators-g4tfw" Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.538653 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aaaf574-a341-4efb-beae-169d69ca32d4-utilities\") pod \"certified-operators-g4tfw\" (UID: \"3aaaf574-a341-4efb-beae-169d69ca32d4\") " pod="openshift-marketplace/certified-operators-g4tfw" Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.538906 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aaaf574-a341-4efb-beae-169d69ca32d4-catalog-content\") pod \"certified-operators-g4tfw\" (UID: \"3aaaf574-a341-4efb-beae-169d69ca32d4\") " pod="openshift-marketplace/certified-operators-g4tfw" Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.568111 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpjhg\" (UniqueName: 
\"kubernetes.io/projected/3aaaf574-a341-4efb-beae-169d69ca32d4-kube-api-access-wpjhg\") pod \"certified-operators-g4tfw\" (UID: \"3aaaf574-a341-4efb-beae-169d69ca32d4\") " pod="openshift-marketplace/certified-operators-g4tfw" Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.588420 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g4tfw" Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.887240 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vjj47"] Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.930459 4904 patch_prober.go:28] interesting pod/router-default-5444994796-tkgxl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 20:14:08 crc kubenswrapper[4904]: [-]has-synced failed: reason withheld Dec 05 20:14:08 crc kubenswrapper[4904]: [+]process-running ok Dec 05 20:14:08 crc kubenswrapper[4904]: healthz check failed Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.930508 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tkgxl" podUID="4589bc8e-2864-4d7e-a049-a3bf264bb997" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.988639 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.990193 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gpt4n" Dec 05 20:14:08 crc kubenswrapper[4904]: I1205 20:14:08.997550 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-plmk6" Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.056852 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.056899 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.056935 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.056954 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.064374 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.065022 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.070776 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.076775 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 
20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.103149 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hdk86"] Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.174086 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vjj47" event={"ID":"9cff387b-9124-46db-8d06-5cb5839e0a12","Type":"ContainerStarted","Data":"761b868471a5292b238d3864d425c1886e65246338f19b02771ecfeee8ffb7a5"} Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.202398 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.212783 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.223686 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.248606 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g4tfw"] Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.427430 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.428133 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.433726 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.433948 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.452793 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.499709 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-pdnb5" Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.500230 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-pdnb5" Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.506139 4904 patch_prober.go:28] interesting pod/console-f9d7485db-pdnb5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.506193 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-pdnb5" podUID="b02e39c5-31b4-4444-a500-cd7cbe327bec" containerName="console" probeResult="failure" output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.508344 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gpt4n"] Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.524080 4904 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vw7kx" Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.524385 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vw7kx" Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.538466 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-bvmh2" Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.538502 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-bvmh2" Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.538568 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vw7kx" Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.557926 4904 patch_prober.go:28] interesting pod/apiserver-76f77b778f-bvmh2 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 05 20:14:09 crc kubenswrapper[4904]: [+]log ok Dec 05 20:14:09 crc kubenswrapper[4904]: [+]etcd ok Dec 05 20:14:09 crc kubenswrapper[4904]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 05 20:14:09 crc kubenswrapper[4904]: [+]poststarthook/generic-apiserver-start-informers ok Dec 05 20:14:09 crc kubenswrapper[4904]: [+]poststarthook/max-in-flight-filter ok Dec 05 20:14:09 crc kubenswrapper[4904]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 05 20:14:09 crc kubenswrapper[4904]: [+]poststarthook/image.openshift.io-apiserver-caches ok Dec 05 20:14:09 crc kubenswrapper[4904]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Dec 05 20:14:09 crc kubenswrapper[4904]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Dec 05 20:14:09 crc kubenswrapper[4904]: [+]poststarthook/project.openshift.io-projectcache ok Dec 05 20:14:09 crc kubenswrapper[4904]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Dec 05 20:14:09 crc kubenswrapper[4904]: [+]poststarthook/openshift.io-startinformers ok Dec 05 20:14:09 crc kubenswrapper[4904]: [+]poststarthook/openshift.io-restmapperupdater ok Dec 05 20:14:09 crc kubenswrapper[4904]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 05 20:14:09 crc kubenswrapper[4904]: livez check failed Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.557958 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-bvmh2" podUID="c6c9df4a-394c-4a6c-9132-5fefe0ed672d" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.566927 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/df234b70-dbe9-49e0-bc5c-ba8078c086cf-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"df234b70-dbe9-49e0-bc5c-ba8078c086cf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.566977 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/df234b70-dbe9-49e0-bc5c-ba8078c086cf-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"df234b70-dbe9-49e0-bc5c-ba8078c086cf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.597199 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-plmk6"] Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.635043 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vd9gt"] Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.636336 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vd9gt" Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.643518 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.643618 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vd9gt"] Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.668519 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/df234b70-dbe9-49e0-bc5c-ba8078c086cf-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"df234b70-dbe9-49e0-bc5c-ba8078c086cf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.668611 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/df234b70-dbe9-49e0-bc5c-ba8078c086cf-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"df234b70-dbe9-49e0-bc5c-ba8078c086cf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.674309 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/df234b70-dbe9-49e0-bc5c-ba8078c086cf-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"df234b70-dbe9-49e0-bc5c-ba8078c086cf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.696083 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.713845 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/df234b70-dbe9-49e0-bc5c-ba8078c086cf-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"df234b70-dbe9-49e0-bc5c-ba8078c086cf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.770780 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnmw9\" (UniqueName: \"kubernetes.io/projected/757462be-80d7-44c2-a193-48f78ac5b80e-kube-api-access-wnmw9\") pod \"redhat-marketplace-vd9gt\" (UID: \"757462be-80d7-44c2-a193-48f78ac5b80e\") " pod="openshift-marketplace/redhat-marketplace-vd9gt" Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.770846 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/757462be-80d7-44c2-a193-48f78ac5b80e-utilities\") pod \"redhat-marketplace-vd9gt\" (UID: \"757462be-80d7-44c2-a193-48f78ac5b80e\") " pod="openshift-marketplace/redhat-marketplace-vd9gt" Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.770892 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/757462be-80d7-44c2-a193-48f78ac5b80e-catalog-content\") pod \"redhat-marketplace-vd9gt\" (UID: \"757462be-80d7-44c2-a193-48f78ac5b80e\") " pod="openshift-marketplace/redhat-marketplace-vd9gt" Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.828396 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.873028 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnmw9\" (UniqueName: \"kubernetes.io/projected/757462be-80d7-44c2-a193-48f78ac5b80e-kube-api-access-wnmw9\") pod \"redhat-marketplace-vd9gt\" (UID: \"757462be-80d7-44c2-a193-48f78ac5b80e\") " pod="openshift-marketplace/redhat-marketplace-vd9gt" Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.873147 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/757462be-80d7-44c2-a193-48f78ac5b80e-utilities\") pod \"redhat-marketplace-vd9gt\" (UID: \"757462be-80d7-44c2-a193-48f78ac5b80e\") " pod="openshift-marketplace/redhat-marketplace-vd9gt" Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.873202 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/757462be-80d7-44c2-a193-48f78ac5b80e-catalog-content\") pod \"redhat-marketplace-vd9gt\" (UID: \"757462be-80d7-44c2-a193-48f78ac5b80e\") " pod="openshift-marketplace/redhat-marketplace-vd9gt" Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.873709 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/757462be-80d7-44c2-a193-48f78ac5b80e-catalog-content\") pod \"redhat-marketplace-vd9gt\" (UID: \"757462be-80d7-44c2-a193-48f78ac5b80e\") " pod="openshift-marketplace/redhat-marketplace-vd9gt" Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.874295 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/757462be-80d7-44c2-a193-48f78ac5b80e-utilities\") pod \"redhat-marketplace-vd9gt\" (UID: \"757462be-80d7-44c2-a193-48f78ac5b80e\") " pod="openshift-marketplace/redhat-marketplace-vd9gt" Dec 05 20:14:09 crc kubenswrapper[4904]: W1205 20:14:09.875257 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-1f1f62216481cdd419650b95159d866c6e464b834b92a887d6513026cc2ad58e WatchSource:0}: Error finding container 1f1f62216481cdd419650b95159d866c6e464b834b92a887d6513026cc2ad58e: Status 404 returned error can't find the container with id 1f1f62216481cdd419650b95159d866c6e464b834b92a887d6513026cc2ad58e Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.910878 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnmw9\" (UniqueName: 
\"kubernetes.io/projected/757462be-80d7-44c2-a193-48f78ac5b80e-kube-api-access-wnmw9\") pod \"redhat-marketplace-vd9gt\" (UID: \"757462be-80d7-44c2-a193-48f78ac5b80e\") " pod="openshift-marketplace/redhat-marketplace-vd9gt" Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.926570 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-tkgxl" Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.929981 4904 patch_prober.go:28] interesting pod/router-default-5444994796-tkgxl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 20:14:09 crc kubenswrapper[4904]: [-]has-synced failed: reason withheld Dec 05 20:14:09 crc kubenswrapper[4904]: [+]process-running ok Dec 05 20:14:09 crc kubenswrapper[4904]: healthz check failed Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.930077 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tkgxl" podUID="4589bc8e-2864-4d7e-a049-a3bf264bb997" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 20:14:09 crc kubenswrapper[4904]: I1205 20:14:09.999407 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vd9gt" Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.020787 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sr8r7"] Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.021848 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sr8r7" Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.023162 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-5srbq" Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.033226 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sr8r7"] Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.179512 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghsvd\" (UniqueName: \"kubernetes.io/projected/2ea52ad5-455a-4f4b-a576-fa76c35bcbea-kube-api-access-ghsvd\") pod \"redhat-marketplace-sr8r7\" (UID: \"2ea52ad5-455a-4f4b-a576-fa76c35bcbea\") " pod="openshift-marketplace/redhat-marketplace-sr8r7" Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.179869 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ea52ad5-455a-4f4b-a576-fa76c35bcbea-catalog-content\") pod \"redhat-marketplace-sr8r7\" (UID: \"2ea52ad5-455a-4f4b-a576-fa76c35bcbea\") " pod="openshift-marketplace/redhat-marketplace-sr8r7" Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.179942 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ea52ad5-455a-4f4b-a576-fa76c35bcbea-utilities\") pod \"redhat-marketplace-sr8r7\" (UID: \"2ea52ad5-455a-4f4b-a576-fa76c35bcbea\") " pod="openshift-marketplace/redhat-marketplace-sr8r7" Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.181458 4904 generic.go:334] "Generic (PLEG): container finished" 
podID="3aaaf574-a341-4efb-beae-169d69ca32d4" containerID="db89cfbfcabbe0200a4a253b56413e0a972accdb4b58d4ac264251899c28d6d5" exitCode=0 Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.181534 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g4tfw" event={"ID":"3aaaf574-a341-4efb-beae-169d69ca32d4","Type":"ContainerDied","Data":"db89cfbfcabbe0200a4a253b56413e0a972accdb4b58d4ac264251899c28d6d5"} Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.181566 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g4tfw" event={"ID":"3aaaf574-a341-4efb-beae-169d69ca32d4","Type":"ContainerStarted","Data":"9545307ffec1bda8606b993f7ddd633ccec45fb01e3dfc75013fd7b3fb805dfa"} Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.183635 4904 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.185877 4904 generic.go:334] "Generic (PLEG): container finished" podID="9cff387b-9124-46db-8d06-5cb5839e0a12" containerID="fed9a1323a8094d19e3e59a2e428566a305240b1eb24668ff20a54d0a02860ec" exitCode=0 Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.185933 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vjj47" event={"ID":"9cff387b-9124-46db-8d06-5cb5839e0a12","Type":"ContainerDied","Data":"fed9a1323a8094d19e3e59a2e428566a305240b1eb24668ff20a54d0a02860ec"} Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.189004 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"14c8315729b786ac2e4f0956ffbea7535a637dbdc766a55dcbdfa81b8e4979e0"} Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.190957 4904 generic.go:334] "Generic (PLEG): container finished" podID="d5752678-2a85-49a8-b6d4-63b4adb96277" containerID="67c70e8bb9909c5560c46b3094e0e51c6e886610ea01ecb42ec95e2f2f17a81f" exitCode=0 Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.191009 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gpt4n" event={"ID":"d5752678-2a85-49a8-b6d4-63b4adb96277","Type":"ContainerDied","Data":"67c70e8bb9909c5560c46b3094e0e51c6e886610ea01ecb42ec95e2f2f17a81f"} Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.191034 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gpt4n" event={"ID":"d5752678-2a85-49a8-b6d4-63b4adb96277","Type":"ContainerStarted","Data":"9eabb38cb62231225654c17f80fdc3d5f0d0ccfbbab74c2f46ceea3a1e95bee7"} Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.192120 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" event={"ID":"5b1664bf-b83a-4582-8018-ec55e02a4068","Type":"ContainerStarted","Data":"ef1306e7a14639af4066725bcb1ce4b29ac856b2f8eed4f6836c058ab1a31612"} Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.192139 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" event={"ID":"5b1664bf-b83a-4582-8018-ec55e02a4068","Type":"ContainerStarted","Data":"36bf9e89dcf50cea05451a5e50f6b0416e5b74037ca069b26237b7aa023a5e7b"} Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.192659 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.202734 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b6b548ca58b059a5c5e62eefc7a26aa3f1d78d8d58a3ee9f85a6f1bc504f5c92"} Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.228591 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" podStartSLOduration=128.228572659 podStartE2EDuration="2m8.228572659s" podCreationTimestamp="2025-12-05 20:12:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:14:10.223447882 +0000 UTC m=+149.034664011" watchObservedRunningTime="2025-12-05 20:14:10.228572659 +0000 UTC m=+149.039788768" Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.241437 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-plmk6" event={"ID":"9ef986b1-43e3-4c24-87d0-f385d1ae80bc","Type":"ContainerStarted","Data":"d40ba7389acd9ba88690544b173451d098a0f047d7b9d1e8ae942d479cff1bad"} Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.254638 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"1f1f62216481cdd419650b95159d866c6e464b834b92a887d6513026cc2ad58e"} Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.280908 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghsvd\" (UniqueName: \"kubernetes.io/projected/2ea52ad5-455a-4f4b-a576-fa76c35bcbea-kube-api-access-ghsvd\") pod \"redhat-marketplace-sr8r7\" (UID: \"2ea52ad5-455a-4f4b-a576-fa76c35bcbea\") " pod="openshift-marketplace/redhat-marketplace-sr8r7" Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.280968 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ea52ad5-455a-4f4b-a576-fa76c35bcbea-catalog-content\") pod \"redhat-marketplace-sr8r7\" (UID: \"2ea52ad5-455a-4f4b-a576-fa76c35bcbea\") " pod="openshift-marketplace/redhat-marketplace-sr8r7" Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.281042 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ea52ad5-455a-4f4b-a576-fa76c35bcbea-utilities\") pod \"redhat-marketplace-sr8r7\" (UID: \"2ea52ad5-455a-4f4b-a576-fa76c35bcbea\") " pod="openshift-marketplace/redhat-marketplace-sr8r7" Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.285762 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ea52ad5-455a-4f4b-a576-fa76c35bcbea-catalog-content\") pod \"redhat-marketplace-sr8r7\" (UID: \"2ea52ad5-455a-4f4b-a576-fa76c35bcbea\") " pod="openshift-marketplace/redhat-marketplace-sr8r7" Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.286783 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ea52ad5-455a-4f4b-a576-fa76c35bcbea-utilities\") pod \"redhat-marketplace-sr8r7\" (UID: 
\"2ea52ad5-455a-4f4b-a576-fa76c35bcbea\") " pod="openshift-marketplace/redhat-marketplace-sr8r7" Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.289993 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.320770 4904 generic.go:334] "Generic (PLEG): container finished" podID="d8f5a146-90e5-47e2-a639-78a09eb00231" containerID="7f8141994963d9b94345c55dbd045be2ef544d0928b51fb3dd7567ff4c83a4f6" exitCode=0 Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.320944 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-82bgg" event={"ID":"d8f5a146-90e5-47e2-a639-78a09eb00231","Type":"ContainerDied","Data":"7f8141994963d9b94345c55dbd045be2ef544d0928b51fb3dd7567ff4c83a4f6"} Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.324816 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghsvd\" (UniqueName: \"kubernetes.io/projected/2ea52ad5-455a-4f4b-a576-fa76c35bcbea-kube-api-access-ghsvd\") pod \"redhat-marketplace-sr8r7\" (UID: \"2ea52ad5-455a-4f4b-a576-fa76c35bcbea\") " pod="openshift-marketplace/redhat-marketplace-sr8r7" Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.329833 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vw7kx" Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.352599 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sr8r7" Dec 05 20:14:10 crc kubenswrapper[4904]: W1205 20:14:10.401908 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poddf234b70_dbe9_49e0_bc5c_ba8078c086cf.slice/crio-d2d0bfd7f9f79bd9d01c8c3ce035c3d703a24ab49c510ed702e0e69d902e475f WatchSource:0}: Error finding container d2d0bfd7f9f79bd9d01c8c3ce035c3d703a24ab49c510ed702e0e69d902e475f: Status 404 returned error can't find the container with id d2d0bfd7f9f79bd9d01c8c3ce035c3d703a24ab49c510ed702e0e69d902e475f Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.422186 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vd9gt"] Dec 05 20:14:10 crc kubenswrapper[4904]: W1205 20:14:10.436626 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod757462be_80d7_44c2_a193_48f78ac5b80e.slice/crio-bf05d4f7d0ccdc3c6a204bdf693d7526c24fd6929d81ee9728df586ffd030e1a WatchSource:0}: Error finding container bf05d4f7d0ccdc3c6a204bdf693d7526c24fd6929d81ee9728df586ffd030e1a: Status 404 returned error can't find the container with id bf05d4f7d0ccdc3c6a204bdf693d7526c24fd6929d81ee9728df586ffd030e1a Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.503724 4904 patch_prober.go:28] interesting pod/downloads-7954f5f757-672dp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.504110 4904 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-672dp" podUID="a70ee695-4cd0-4ad2-926f-4850e19e480f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: 
connect: connection refused" Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.503839 4904 patch_prober.go:28] interesting pod/downloads-7954f5f757-672dp container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.504495 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-672dp" podUID="a70ee695-4cd0-4ad2-926f-4850e19e480f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.825961 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jrg9c"] Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.827157 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jrg9c" Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.829331 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.837002 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jrg9c"] Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.890865 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9z6g\" (UniqueName: \"kubernetes.io/projected/59e498ac-1ef4-4984-956a-f1d7614d5dba-kube-api-access-p9z6g\") pod \"redhat-operators-jrg9c\" (UID: \"59e498ac-1ef4-4984-956a-f1d7614d5dba\") " pod="openshift-marketplace/redhat-operators-jrg9c" Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.890962 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59e498ac-1ef4-4984-956a-f1d7614d5dba-catalog-content\") pod \"redhat-operators-jrg9c\" (UID: \"59e498ac-1ef4-4984-956a-f1d7614d5dba\") " pod="openshift-marketplace/redhat-operators-jrg9c" Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.891003 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59e498ac-1ef4-4984-956a-f1d7614d5dba-utilities\") pod \"redhat-operators-jrg9c\" (UID: \"59e498ac-1ef4-4984-956a-f1d7614d5dba\") " pod="openshift-marketplace/redhat-operators-jrg9c" Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.923502 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sr8r7"] Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.931295 4904 patch_prober.go:28] interesting pod/router-default-5444994796-tkgxl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 20:14:10 crc kubenswrapper[4904]: [-]has-synced failed: reason withheld Dec 05 20:14:10 crc kubenswrapper[4904]: [+]process-running ok Dec 05 20:14:10 crc kubenswrapper[4904]: healthz check failed Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.931336 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tkgxl" 
podUID="4589bc8e-2864-4d7e-a049-a3bf264bb997" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 20:14:10 crc kubenswrapper[4904]: W1205 20:14:10.932798 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ea52ad5_455a_4f4b_a576_fa76c35bcbea.slice/crio-0f0a396484963c3ad377880a609ba618648358d34310c7a5ad7b34ccf37082c5 WatchSource:0}: Error finding container 0f0a396484963c3ad377880a609ba618648358d34310c7a5ad7b34ccf37082c5: Status 404 returned error can't find the container with id 0f0a396484963c3ad377880a609ba618648358d34310c7a5ad7b34ccf37082c5 Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.992130 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59e498ac-1ef4-4984-956a-f1d7614d5dba-catalog-content\") pod \"redhat-operators-jrg9c\" (UID: \"59e498ac-1ef4-4984-956a-f1d7614d5dba\") " pod="openshift-marketplace/redhat-operators-jrg9c" Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.992181 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59e498ac-1ef4-4984-956a-f1d7614d5dba-utilities\") pod \"redhat-operators-jrg9c\" (UID: \"59e498ac-1ef4-4984-956a-f1d7614d5dba\") " pod="openshift-marketplace/redhat-operators-jrg9c" Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.992218 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9z6g\" (UniqueName: \"kubernetes.io/projected/59e498ac-1ef4-4984-956a-f1d7614d5dba-kube-api-access-p9z6g\") pod \"redhat-operators-jrg9c\" (UID: \"59e498ac-1ef4-4984-956a-f1d7614d5dba\") " pod="openshift-marketplace/redhat-operators-jrg9c" Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.992821 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59e498ac-1ef4-4984-956a-f1d7614d5dba-catalog-content\") pod \"redhat-operators-jrg9c\" (UID: \"59e498ac-1ef4-4984-956a-f1d7614d5dba\") " pod="openshift-marketplace/redhat-operators-jrg9c" Dec 05 20:14:10 crc kubenswrapper[4904]: I1205 20:14:10.993103 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59e498ac-1ef4-4984-956a-f1d7614d5dba-utilities\") pod \"redhat-operators-jrg9c\" (UID: \"59e498ac-1ef4-4984-956a-f1d7614d5dba\") " pod="openshift-marketplace/redhat-operators-jrg9c" Dec 05 20:14:11 crc kubenswrapper[4904]: I1205 20:14:11.031708 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9z6g\" (UniqueName: \"kubernetes.io/projected/59e498ac-1ef4-4984-956a-f1d7614d5dba-kube-api-access-p9z6g\") pod \"redhat-operators-jrg9c\" (UID: \"59e498ac-1ef4-4984-956a-f1d7614d5dba\") " pod="openshift-marketplace/redhat-operators-jrg9c" Dec 05 20:14:11 crc kubenswrapper[4904]: I1205 20:14:11.183029 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jrg9c" Dec 05 20:14:11 crc kubenswrapper[4904]: I1205 20:14:11.216584 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-87qk8"] Dec 05 20:14:11 crc kubenswrapper[4904]: I1205 20:14:11.217898 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-87qk8" Dec 05 20:14:11 crc kubenswrapper[4904]: I1205 20:14:11.231249 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-87qk8"] Dec 05 20:14:11 crc kubenswrapper[4904]: I1205 20:14:11.302003 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed3e3607-e411-47d4-8a20-cbd6623ec89e-catalog-content\") pod \"redhat-operators-87qk8\" (UID: \"ed3e3607-e411-47d4-8a20-cbd6623ec89e\") " pod="openshift-marketplace/redhat-operators-87qk8" Dec 05 20:14:11 crc kubenswrapper[4904]: I1205 20:14:11.302072 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d24f9\" (UniqueName: \"kubernetes.io/projected/ed3e3607-e411-47d4-8a20-cbd6623ec89e-kube-api-access-d24f9\") pod \"redhat-operators-87qk8\" (UID: \"ed3e3607-e411-47d4-8a20-cbd6623ec89e\") " pod="openshift-marketplace/redhat-operators-87qk8" Dec 05 20:14:11 crc kubenswrapper[4904]: I1205 20:14:11.302093 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed3e3607-e411-47d4-8a20-cbd6623ec89e-utilities\") pod \"redhat-operators-87qk8\" (UID: \"ed3e3607-e411-47d4-8a20-cbd6623ec89e\") " pod="openshift-marketplace/redhat-operators-87qk8" Dec 05 20:14:11 crc kubenswrapper[4904]: I1205 20:14:11.345310 4904 generic.go:334] "Generic (PLEG): container finished" podID="9ef986b1-43e3-4c24-87d0-f385d1ae80bc" containerID="5bfd10e848c36742e6a05d12f89ad1b0b1a399b6abaefa93658d5a28398d8f64" exitCode=0 Dec 05 20:14:11 crc kubenswrapper[4904]: I1205 20:14:11.345444 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-plmk6" event={"ID":"9ef986b1-43e3-4c24-87d0-f385d1ae80bc","Type":"ContainerDied","Data":"5bfd10e848c36742e6a05d12f89ad1b0b1a399b6abaefa93658d5a28398d8f64"} Dec 05 20:14:11 crc kubenswrapper[4904]: I1205 20:14:11.357038 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"4e652c3ae30bbf3afa0f38ac80c2717f790e934174b2178ed852c06d7065c3a8"} Dec 05 20:14:11 crc kubenswrapper[4904]: I1205 20:14:11.357797 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:14:11 crc kubenswrapper[4904]: I1205 20:14:11.368243 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"df234b70-dbe9-49e0-bc5c-ba8078c086cf","Type":"ContainerStarted","Data":"fde0f8c2a12f543c74a99b05f803a472ca3c26f1ca3da44a6ee26532d11e329b"} Dec 05 20:14:11 crc kubenswrapper[4904]: I1205 20:14:11.368280 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"df234b70-dbe9-49e0-bc5c-ba8078c086cf","Type":"ContainerStarted","Data":"d2d0bfd7f9f79bd9d01c8c3ce035c3d703a24ab49c510ed702e0e69d902e475f"} Dec 05 20:14:11 crc kubenswrapper[4904]: I1205 20:14:11.373083 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"e54f846a74105384bccb1988380613a6adc958580737b5a973485fc1993aae5d"} Dec 05 20:14:11 crc kubenswrapper[4904]: I1205 20:14:11.381665 4904 generic.go:334] "Generic (PLEG): container finished" podID="757462be-80d7-44c2-a193-48f78ac5b80e" containerID="532faad960c4d43db743bace3a04e015fb7889426644934d7c28ae1e46064d97" exitCode=0 Dec 05 20:14:11 crc kubenswrapper[4904]: I1205 20:14:11.381807 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vd9gt" event={"ID":"757462be-80d7-44c2-a193-48f78ac5b80e","Type":"ContainerDied","Data":"532faad960c4d43db743bace3a04e015fb7889426644934d7c28ae1e46064d97"} Dec 05 20:14:11 crc kubenswrapper[4904]: I1205 20:14:11.381871 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vd9gt" event={"ID":"757462be-80d7-44c2-a193-48f78ac5b80e","Type":"ContainerStarted","Data":"bf05d4f7d0ccdc3c6a204bdf693d7526c24fd6929d81ee9728df586ffd030e1a"} Dec 05 20:14:11 crc kubenswrapper[4904]: I1205 20:14:11.390656 4904 generic.go:334] "Generic (PLEG): container finished" podID="2ea52ad5-455a-4f4b-a576-fa76c35bcbea" containerID="ccbd208577e829a7a598ede70f204b8b3897c115da383d0f00919ddb82f87bd7" exitCode=0 Dec 05 20:14:11 crc kubenswrapper[4904]: I1205 20:14:11.390736 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sr8r7" event={"ID":"2ea52ad5-455a-4f4b-a576-fa76c35bcbea","Type":"ContainerDied","Data":"ccbd208577e829a7a598ede70f204b8b3897c115da383d0f00919ddb82f87bd7"} Dec 05 20:14:11 crc kubenswrapper[4904]: I1205 20:14:11.390762 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sr8r7" event={"ID":"2ea52ad5-455a-4f4b-a576-fa76c35bcbea","Type":"ContainerStarted","Data":"0f0a396484963c3ad377880a609ba618648358d34310c7a5ad7b34ccf37082c5"} Dec 05 20:14:11 crc kubenswrapper[4904]: I1205 20:14:11.397909 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"0fd37648284340560511e9d27fd5708f9a8ca6e1813577c1d9883d4a76856185"} Dec 05 20:14:11 crc kubenswrapper[4904]: I1205 20:14:11.399568 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.3995560129999998 podStartE2EDuration="2.399556013s" podCreationTimestamp="2025-12-05 20:14:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:14:11.392943932 +0000 UTC m=+150.204160061" watchObservedRunningTime="2025-12-05 20:14:11.399556013 +0000 UTC m=+150.210772112" Dec 05 20:14:11 crc kubenswrapper[4904]: I1205 20:14:11.403278 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed3e3607-e411-47d4-8a20-cbd6623ec89e-catalog-content\") pod \"redhat-operators-87qk8\" (UID: \"ed3e3607-e411-47d4-8a20-cbd6623ec89e\") " pod="openshift-marketplace/redhat-operators-87qk8" Dec 05 20:14:11 crc kubenswrapper[4904]: I1205 20:14:11.403319 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d24f9\" (UniqueName: \"kubernetes.io/projected/ed3e3607-e411-47d4-8a20-cbd6623ec89e-kube-api-access-d24f9\") pod 
\"redhat-operators-87qk8\" (UID: \"ed3e3607-e411-47d4-8a20-cbd6623ec89e\") " pod="openshift-marketplace/redhat-operators-87qk8" Dec 05 20:14:11 crc kubenswrapper[4904]: I1205 20:14:11.403348 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed3e3607-e411-47d4-8a20-cbd6623ec89e-utilities\") pod \"redhat-operators-87qk8\" (UID: \"ed3e3607-e411-47d4-8a20-cbd6623ec89e\") " pod="openshift-marketplace/redhat-operators-87qk8" Dec 05 20:14:11 crc kubenswrapper[4904]: I1205 20:14:11.403733 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed3e3607-e411-47d4-8a20-cbd6623ec89e-utilities\") pod \"redhat-operators-87qk8\" (UID: \"ed3e3607-e411-47d4-8a20-cbd6623ec89e\") " pod="openshift-marketplace/redhat-operators-87qk8" Dec 05 20:14:11 crc kubenswrapper[4904]: I1205 20:14:11.405181 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed3e3607-e411-47d4-8a20-cbd6623ec89e-catalog-content\") pod \"redhat-operators-87qk8\" (UID: \"ed3e3607-e411-47d4-8a20-cbd6623ec89e\") " pod="openshift-marketplace/redhat-operators-87qk8" Dec 05 20:14:11 crc kubenswrapper[4904]: I1205 20:14:11.449257 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d24f9\" (UniqueName: \"kubernetes.io/projected/ed3e3607-e411-47d4-8a20-cbd6623ec89e-kube-api-access-d24f9\") pod \"redhat-operators-87qk8\" (UID: \"ed3e3607-e411-47d4-8a20-cbd6623ec89e\") " pod="openshift-marketplace/redhat-operators-87qk8" Dec 05 20:14:11 crc kubenswrapper[4904]: I1205 20:14:11.535813 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jrg9c"] Dec 05 20:14:11 crc kubenswrapper[4904]: I1205 20:14:11.659267 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-87qk8" Dec 05 20:14:11 crc kubenswrapper[4904]: I1205 20:14:11.809762 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-82bgg" Dec 05 20:14:11 crc kubenswrapper[4904]: I1205 20:14:11.909424 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hltr7\" (UniqueName: \"kubernetes.io/projected/d8f5a146-90e5-47e2-a639-78a09eb00231-kube-api-access-hltr7\") pod \"d8f5a146-90e5-47e2-a639-78a09eb00231\" (UID: \"d8f5a146-90e5-47e2-a639-78a09eb00231\") " Dec 05 20:14:11 crc kubenswrapper[4904]: I1205 20:14:11.909531 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d8f5a146-90e5-47e2-a639-78a09eb00231-config-volume\") pod \"d8f5a146-90e5-47e2-a639-78a09eb00231\" (UID: \"d8f5a146-90e5-47e2-a639-78a09eb00231\") " Dec 05 20:14:11 crc kubenswrapper[4904]: I1205 20:14:11.909619 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d8f5a146-90e5-47e2-a639-78a09eb00231-secret-volume\") pod \"d8f5a146-90e5-47e2-a639-78a09eb00231\" (UID: \"d8f5a146-90e5-47e2-a639-78a09eb00231\") " Dec 05 20:14:11 crc kubenswrapper[4904]: I1205 20:14:11.911215 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8f5a146-90e5-47e2-a639-78a09eb00231-config-volume" (OuterVolumeSpecName: "config-volume") pod "d8f5a146-90e5-47e2-a639-78a09eb00231" (UID: "d8f5a146-90e5-47e2-a639-78a09eb00231"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:14:11 crc kubenswrapper[4904]: I1205 20:14:11.915493 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8f5a146-90e5-47e2-a639-78a09eb00231-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d8f5a146-90e5-47e2-a639-78a09eb00231" (UID: "d8f5a146-90e5-47e2-a639-78a09eb00231"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:14:11 crc kubenswrapper[4904]: I1205 20:14:11.915786 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8f5a146-90e5-47e2-a639-78a09eb00231-kube-api-access-hltr7" (OuterVolumeSpecName: "kube-api-access-hltr7") pod "d8f5a146-90e5-47e2-a639-78a09eb00231" (UID: "d8f5a146-90e5-47e2-a639-78a09eb00231"). InnerVolumeSpecName "kube-api-access-hltr7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:14:11 crc kubenswrapper[4904]: I1205 20:14:11.942916 4904 patch_prober.go:28] interesting pod/router-default-5444994796-tkgxl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 20:14:11 crc kubenswrapper[4904]: [-]has-synced failed: reason withheld Dec 05 20:14:11 crc kubenswrapper[4904]: [+]process-running ok Dec 05 20:14:11 crc kubenswrapper[4904]: healthz check failed Dec 05 20:14:11 crc kubenswrapper[4904]: I1205 20:14:11.942982 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tkgxl" podUID="4589bc8e-2864-4d7e-a049-a3bf264bb997" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 20:14:12 crc kubenswrapper[4904]: I1205 20:14:12.010908 4904 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d8f5a146-90e5-47e2-a639-78a09eb00231-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 20:14:12 crc kubenswrapper[4904]: I1205 20:14:12.010947 4904 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d8f5a146-90e5-47e2-a639-78a09eb00231-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 20:14:12 crc kubenswrapper[4904]: I1205 20:14:12.010960 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hltr7\" (UniqueName: \"kubernetes.io/projected/d8f5a146-90e5-47e2-a639-78a09eb00231-kube-api-access-hltr7\") on node \"crc\" DevicePath \"\"" Dec 05 20:14:12 crc kubenswrapper[4904]: I1205 20:14:12.087436 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-87qk8"] Dec 05 20:14:12 crc kubenswrapper[4904]: W1205 20:14:12.106929 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded3e3607_e411_47d4_8a20_cbd6623ec89e.slice/crio-e076caa621902641fc9067f112bc170eb600d6c903052cd16f79c7f825301b38 WatchSource:0}: Error finding container e076caa621902641fc9067f112bc170eb600d6c903052cd16f79c7f825301b38: Status 404 returned error can't find the container with id e076caa621902641fc9067f112bc170eb600d6c903052cd16f79c7f825301b38 Dec 05 20:14:12 crc kubenswrapper[4904]: I1205 20:14:12.405669 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-82bgg" event={"ID":"d8f5a146-90e5-47e2-a639-78a09eb00231","Type":"ContainerDied","Data":"d6ab658002c7c5f6f9f2530c2b62200a491d72ff7650009541d35f71beb87663"} Dec 05 20:14:12 crc kubenswrapper[4904]: I1205 20:14:12.405973 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6ab658002c7c5f6f9f2530c2b62200a491d72ff7650009541d35f71beb87663" Dec 05 20:14:12 crc kubenswrapper[4904]: I1205 20:14:12.405906 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-82bgg" Dec 05 20:14:12 crc kubenswrapper[4904]: I1205 20:14:12.424786 4904 generic.go:334] "Generic (PLEG): container finished" podID="59e498ac-1ef4-4984-956a-f1d7614d5dba" containerID="b3f9a3f22636dcf8aee32418755d108a94d8ae2730e32beabfe1a484168679bd" exitCode=0 Dec 05 20:14:12 crc kubenswrapper[4904]: I1205 20:14:12.424823 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jrg9c" event={"ID":"59e498ac-1ef4-4984-956a-f1d7614d5dba","Type":"ContainerDied","Data":"b3f9a3f22636dcf8aee32418755d108a94d8ae2730e32beabfe1a484168679bd"} Dec 05 20:14:12 crc kubenswrapper[4904]: I1205 20:14:12.424877 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jrg9c" event={"ID":"59e498ac-1ef4-4984-956a-f1d7614d5dba","Type":"ContainerStarted","Data":"c74ed644ca8a064934a2bd6b76abb866a8121827effc2f7f2d9c1a087d500723"} Dec 05 20:14:12 crc kubenswrapper[4904]: I1205 20:14:12.427110 4904 generic.go:334] "Generic (PLEG): container finished" podID="df234b70-dbe9-49e0-bc5c-ba8078c086cf" containerID="fde0f8c2a12f543c74a99b05f803a472ca3c26f1ca3da44a6ee26532d11e329b" exitCode=0 Dec 05 20:14:12 crc kubenswrapper[4904]: I1205 20:14:12.427209 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"df234b70-dbe9-49e0-bc5c-ba8078c086cf","Type":"ContainerDied","Data":"fde0f8c2a12f543c74a99b05f803a472ca3c26f1ca3da44a6ee26532d11e329b"} Dec 05 20:14:12 crc kubenswrapper[4904]: I1205 20:14:12.443441 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-87qk8" event={"ID":"ed3e3607-e411-47d4-8a20-cbd6623ec89e","Type":"ContainerStarted","Data":"245ee7f061f7e59aefe537b88c4f00508c01b4c0a49a7a61731bbcf9c1f24bac"} Dec 05 20:14:12 crc kubenswrapper[4904]: I1205 20:14:12.443504 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-87qk8" event={"ID":"ed3e3607-e411-47d4-8a20-cbd6623ec89e","Type":"ContainerStarted","Data":"e076caa621902641fc9067f112bc170eb600d6c903052cd16f79c7f825301b38"} Dec 05 20:14:12 crc kubenswrapper[4904]: I1205 20:14:12.929376 4904 patch_prober.go:28] interesting pod/router-default-5444994796-tkgxl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 20:14:12 crc kubenswrapper[4904]: [-]has-synced failed: reason withheld Dec 05 20:14:12 crc kubenswrapper[4904]: [+]process-running ok Dec 05 20:14:12 crc kubenswrapper[4904]: healthz check failed Dec 05 20:14:12 crc kubenswrapper[4904]: I1205 20:14:12.929427 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tkgxl" podUID="4589bc8e-2864-4d7e-a049-a3bf264bb997" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 20:14:13 crc kubenswrapper[4904]: I1205 20:14:13.446950 4904 generic.go:334] "Generic (PLEG): container finished" podID="ed3e3607-e411-47d4-8a20-cbd6623ec89e" containerID="245ee7f061f7e59aefe537b88c4f00508c01b4c0a49a7a61731bbcf9c1f24bac" exitCode=0 Dec 05 20:14:13 crc kubenswrapper[4904]: I1205 20:14:13.446987 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-87qk8" 
event={"ID":"ed3e3607-e411-47d4-8a20-cbd6623ec89e","Type":"ContainerDied","Data":"245ee7f061f7e59aefe537b88c4f00508c01b4c0a49a7a61731bbcf9c1f24bac"} Dec 05 20:14:13 crc kubenswrapper[4904]: I1205 20:14:13.785878 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 20:14:13 crc kubenswrapper[4904]: I1205 20:14:13.837266 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/df234b70-dbe9-49e0-bc5c-ba8078c086cf-kubelet-dir\") pod \"df234b70-dbe9-49e0-bc5c-ba8078c086cf\" (UID: \"df234b70-dbe9-49e0-bc5c-ba8078c086cf\") " Dec 05 20:14:13 crc kubenswrapper[4904]: I1205 20:14:13.837352 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/df234b70-dbe9-49e0-bc5c-ba8078c086cf-kube-api-access\") pod \"df234b70-dbe9-49e0-bc5c-ba8078c086cf\" (UID: \"df234b70-dbe9-49e0-bc5c-ba8078c086cf\") " Dec 05 20:14:13 crc kubenswrapper[4904]: I1205 20:14:13.837537 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df234b70-dbe9-49e0-bc5c-ba8078c086cf-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "df234b70-dbe9-49e0-bc5c-ba8078c086cf" (UID: "df234b70-dbe9-49e0-bc5c-ba8078c086cf"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:14:13 crc kubenswrapper[4904]: I1205 20:14:13.837797 4904 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/df234b70-dbe9-49e0-bc5c-ba8078c086cf-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 05 20:14:13 crc kubenswrapper[4904]: I1205 20:14:13.859029 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df234b70-dbe9-49e0-bc5c-ba8078c086cf-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "df234b70-dbe9-49e0-bc5c-ba8078c086cf" (UID: "df234b70-dbe9-49e0-bc5c-ba8078c086cf"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:14:13 crc kubenswrapper[4904]: I1205 20:14:13.929643 4904 patch_prober.go:28] interesting pod/router-default-5444994796-tkgxl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 20:14:13 crc kubenswrapper[4904]: [-]has-synced failed: reason withheld Dec 05 20:14:13 crc kubenswrapper[4904]: [+]process-running ok Dec 05 20:14:13 crc kubenswrapper[4904]: healthz check failed Dec 05 20:14:13 crc kubenswrapper[4904]: I1205 20:14:13.929709 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tkgxl" podUID="4589bc8e-2864-4d7e-a049-a3bf264bb997" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 20:14:13 crc kubenswrapper[4904]: I1205 20:14:13.939216 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/df234b70-dbe9-49e0-bc5c-ba8078c086cf-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 20:14:14 crc kubenswrapper[4904]: I1205 20:14:14.465202 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"df234b70-dbe9-49e0-bc5c-ba8078c086cf","Type":"ContainerDied","Data":"d2d0bfd7f9f79bd9d01c8c3ce035c3d703a24ab49c510ed702e0e69d902e475f"} Dec 05 20:14:14 crc kubenswrapper[4904]: I1205 20:14:14.465240 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2d0bfd7f9f79bd9d01c8c3ce035c3d703a24ab49c510ed702e0e69d902e475f" Dec 05 20:14:14 crc kubenswrapper[4904]: I1205 20:14:14.465263 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 20:14:14 crc kubenswrapper[4904]: I1205 20:14:14.542679 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-bvmh2" Dec 05 20:14:14 crc kubenswrapper[4904]: I1205 20:14:14.552276 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-bvmh2" Dec 05 20:14:14 crc kubenswrapper[4904]: I1205 20:14:14.887511 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 05 20:14:14 crc kubenswrapper[4904]: E1205 20:14:14.887765 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8f5a146-90e5-47e2-a639-78a09eb00231" containerName="collect-profiles" Dec 05 20:14:14 crc kubenswrapper[4904]: I1205 20:14:14.887779 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8f5a146-90e5-47e2-a639-78a09eb00231" containerName="collect-profiles" Dec 05 20:14:14 crc kubenswrapper[4904]: E1205 20:14:14.887798 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df234b70-dbe9-49e0-bc5c-ba8078c086cf" containerName="pruner" Dec 05 20:14:14 crc kubenswrapper[4904]: I1205 20:14:14.887806 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="df234b70-dbe9-49e0-bc5c-ba8078c086cf" containerName="pruner" Dec 05 20:14:14 crc kubenswrapper[4904]: I1205 20:14:14.887931 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8f5a146-90e5-47e2-a639-78a09eb00231" containerName="collect-profiles" Dec 05 20:14:14 crc kubenswrapper[4904]: I1205 20:14:14.887953 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="df234b70-dbe9-49e0-bc5c-ba8078c086cf" containerName="pruner" Dec 05 20:14:14 crc kubenswrapper[4904]: I1205 20:14:14.888415 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 20:14:14 crc kubenswrapper[4904]: I1205 20:14:14.894533 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 05 20:14:14 crc kubenswrapper[4904]: I1205 20:14:14.898700 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 05 20:14:14 crc kubenswrapper[4904]: I1205 20:14:14.898911 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 05 20:14:14 crc kubenswrapper[4904]: I1205 20:14:14.933706 4904 patch_prober.go:28] interesting pod/router-default-5444994796-tkgxl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 20:14:14 crc kubenswrapper[4904]: [-]has-synced failed: reason withheld Dec 05 20:14:14 crc kubenswrapper[4904]: [+]process-running ok Dec 05 20:14:14 crc kubenswrapper[4904]: healthz check failed Dec 05 20:14:14 crc kubenswrapper[4904]: I1205 20:14:14.933771 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tkgxl" podUID="4589bc8e-2864-4d7e-a049-a3bf264bb997" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 20:14:14 crc kubenswrapper[4904]: I1205 20:14:14.953793 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/30fe1bd9-3d47-4269-afae-7d9c7f5ab356-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"30fe1bd9-3d47-4269-afae-7d9c7f5ab356\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 20:14:14 crc kubenswrapper[4904]: I1205 20:14:14.953977 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/30fe1bd9-3d47-4269-afae-7d9c7f5ab356-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"30fe1bd9-3d47-4269-afae-7d9c7f5ab356\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 20:14:15 crc kubenswrapper[4904]: I1205 20:14:15.055932 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/30fe1bd9-3d47-4269-afae-7d9c7f5ab356-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"30fe1bd9-3d47-4269-afae-7d9c7f5ab356\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 20:14:15 crc kubenswrapper[4904]: I1205 20:14:15.056015 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/30fe1bd9-3d47-4269-afae-7d9c7f5ab356-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"30fe1bd9-3d47-4269-afae-7d9c7f5ab356\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 20:14:15 crc kubenswrapper[4904]: I1205 20:14:15.056179 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/30fe1bd9-3d47-4269-afae-7d9c7f5ab356-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"30fe1bd9-3d47-4269-afae-7d9c7f5ab356\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 20:14:15 crc kubenswrapper[4904]: I1205 20:14:15.073385 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/30fe1bd9-3d47-4269-afae-7d9c7f5ab356-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"30fe1bd9-3d47-4269-afae-7d9c7f5ab356\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 20:14:15 crc kubenswrapper[4904]: I1205 20:14:15.227179 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 20:14:15 crc kubenswrapper[4904]: I1205 20:14:15.452859 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-z6nb9" Dec 05 20:14:15 crc kubenswrapper[4904]: I1205 20:14:15.743590 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 05 20:14:15 crc kubenswrapper[4904]: I1205 20:14:15.930182 4904 patch_prober.go:28] interesting pod/router-default-5444994796-tkgxl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 20:14:15 crc kubenswrapper[4904]: [-]has-synced failed: reason withheld Dec 05 20:14:15 crc kubenswrapper[4904]: [+]process-running ok Dec 05 20:14:15 crc kubenswrapper[4904]: healthz check failed Dec 05 20:14:15 crc kubenswrapper[4904]: I1205 20:14:15.930235 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tkgxl" podUID="4589bc8e-2864-4d7e-a049-a3bf264bb997" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 20:14:16 crc kubenswrapper[4904]: I1205 20:14:16.489064 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"30fe1bd9-3d47-4269-afae-7d9c7f5ab356","Type":"ContainerStarted","Data":"43427b4dfc3d5be7e63b9b09b1551f8d941d354e44325311b131682885b862ae"} Dec 05 20:14:16 crc kubenswrapper[4904]: I1205 20:14:16.930371 4904 patch_prober.go:28] interesting pod/router-default-5444994796-tkgxl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 20:14:16 crc kubenswrapper[4904]: [-]has-synced failed: reason withheld Dec 05 20:14:16 crc kubenswrapper[4904]: [+]process-running ok Dec 05 20:14:16 crc kubenswrapper[4904]: healthz check failed Dec 05 20:14:16 crc kubenswrapper[4904]: I1205 20:14:16.930441 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tkgxl" podUID="4589bc8e-2864-4d7e-a049-a3bf264bb997" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 20:14:17 crc kubenswrapper[4904]: I1205 20:14:17.499364 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"30fe1bd9-3d47-4269-afae-7d9c7f5ab356","Type":"ContainerStarted","Data":"bc9ba08678ecd1ce578075ea92072cad1764ed2a6c849b4bc886c85663999755"} Dec 05 20:14:17 crc kubenswrapper[4904]: I1205 20:14:17.516479 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.516459664 podStartE2EDuration="3.516459664s" podCreationTimestamp="2025-12-05 20:14:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 
20:14:17.511371778 +0000 UTC m=+156.322587907" watchObservedRunningTime="2025-12-05 20:14:17.516459664 +0000 UTC m=+156.327675763" Dec 05 20:14:17 crc kubenswrapper[4904]: I1205 20:14:17.930187 4904 patch_prober.go:28] interesting pod/router-default-5444994796-tkgxl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 20:14:17 crc kubenswrapper[4904]: [-]has-synced failed: reason withheld Dec 05 20:14:17 crc kubenswrapper[4904]: [+]process-running ok Dec 05 20:14:17 crc kubenswrapper[4904]: healthz check failed Dec 05 20:14:17 crc kubenswrapper[4904]: I1205 20:14:17.930238 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tkgxl" podUID="4589bc8e-2864-4d7e-a049-a3bf264bb997" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 20:14:18 crc kubenswrapper[4904]: I1205 20:14:18.542607 4904 generic.go:334] "Generic (PLEG): container finished" podID="30fe1bd9-3d47-4269-afae-7d9c7f5ab356" containerID="bc9ba08678ecd1ce578075ea92072cad1764ed2a6c849b4bc886c85663999755" exitCode=0 Dec 05 20:14:18 crc kubenswrapper[4904]: I1205 20:14:18.542682 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"30fe1bd9-3d47-4269-afae-7d9c7f5ab356","Type":"ContainerDied","Data":"bc9ba08678ecd1ce578075ea92072cad1764ed2a6c849b4bc886c85663999755"} Dec 05 20:14:18 crc kubenswrapper[4904]: I1205 20:14:18.758513 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:14:18 crc kubenswrapper[4904]: I1205 20:14:18.929461 4904 patch_prober.go:28] interesting pod/router-default-5444994796-tkgxl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 20:14:18 crc kubenswrapper[4904]: [-]has-synced failed: reason withheld Dec 05 20:14:18 crc kubenswrapper[4904]: [+]process-running ok Dec 05 20:14:18 crc kubenswrapper[4904]: healthz check failed Dec 05 20:14:18 crc kubenswrapper[4904]: I1205 20:14:18.929527 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tkgxl" podUID="4589bc8e-2864-4d7e-a049-a3bf264bb997" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 20:14:19 crc kubenswrapper[4904]: I1205 20:14:19.493248 4904 patch_prober.go:28] interesting pod/console-f9d7485db-pdnb5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Dec 05 20:14:19 crc kubenswrapper[4904]: I1205 20:14:19.493325 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-pdnb5" podUID="b02e39c5-31b4-4444-a500-cd7cbe327bec" containerName="console" probeResult="failure" output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" Dec 05 20:14:19 crc kubenswrapper[4904]: I1205 20:14:19.929554 4904 patch_prober.go:28] interesting pod/router-default-5444994796-tkgxl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld 
Dec 05 20:14:19 crc kubenswrapper[4904]: [-]has-synced failed: reason withheld Dec 05 20:14:19 crc kubenswrapper[4904]: [+]process-running ok Dec 05 20:14:19 crc kubenswrapper[4904]: healthz check failed Dec 05 20:14:19 crc kubenswrapper[4904]: I1205 20:14:19.929839 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tkgxl" podUID="4589bc8e-2864-4d7e-a049-a3bf264bb997" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 20:14:20 crc kubenswrapper[4904]: I1205 20:14:20.508896 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-672dp" Dec 05 20:14:20 crc kubenswrapper[4904]: I1205 20:14:20.928788 4904 patch_prober.go:28] interesting pod/router-default-5444994796-tkgxl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 20:14:20 crc kubenswrapper[4904]: [-]has-synced failed: reason withheld Dec 05 20:14:20 crc kubenswrapper[4904]: [+]process-running ok Dec 05 20:14:20 crc kubenswrapper[4904]: healthz check failed Dec 05 20:14:20 crc kubenswrapper[4904]: I1205 20:14:20.928876 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tkgxl" podUID="4589bc8e-2864-4d7e-a049-a3bf264bb997" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 20:14:21 crc kubenswrapper[4904]: I1205 20:14:21.934657 4904 patch_prober.go:28] interesting pod/router-default-5444994796-tkgxl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 20:14:21 crc kubenswrapper[4904]: [+]has-synced ok Dec 05 20:14:21 crc kubenswrapper[4904]: [+]process-running ok Dec 05 20:14:21 crc kubenswrapper[4904]: healthz check failed Dec 05 20:14:21 crc kubenswrapper[4904]: I1205 20:14:21.935168 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tkgxl" podUID="4589bc8e-2864-4d7e-a049-a3bf264bb997" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 20:14:22 crc kubenswrapper[4904]: I1205 20:14:22.929720 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-tkgxl" Dec 05 20:14:22 crc kubenswrapper[4904]: I1205 20:14:22.932652 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-tkgxl" Dec 05 20:14:23 crc kubenswrapper[4904]: I1205 20:14:23.548134 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91-metrics-certs\") pod \"network-metrics-daemon-d8xkk\" (UID: \"fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91\") " pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:14:23 crc kubenswrapper[4904]: I1205 20:14:23.553186 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91-metrics-certs\") pod \"network-metrics-daemon-d8xkk\" (UID: \"fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91\") " pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:14:23 crc kubenswrapper[4904]: I1205 
20:14:23.634159 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d8xkk" Dec 05 20:14:24 crc kubenswrapper[4904]: I1205 20:14:24.548603 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 20:14:24 crc kubenswrapper[4904]: I1205 20:14:24.636313 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"30fe1bd9-3d47-4269-afae-7d9c7f5ab356","Type":"ContainerDied","Data":"43427b4dfc3d5be7e63b9b09b1551f8d941d354e44325311b131682885b862ae"} Dec 05 20:14:24 crc kubenswrapper[4904]: I1205 20:14:24.636380 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43427b4dfc3d5be7e63b9b09b1551f8d941d354e44325311b131682885b862ae" Dec 05 20:14:24 crc kubenswrapper[4904]: I1205 20:14:24.636466 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 20:14:24 crc kubenswrapper[4904]: I1205 20:14:24.668305 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/30fe1bd9-3d47-4269-afae-7d9c7f5ab356-kube-api-access\") pod \"30fe1bd9-3d47-4269-afae-7d9c7f5ab356\" (UID: \"30fe1bd9-3d47-4269-afae-7d9c7f5ab356\") " Dec 05 20:14:24 crc kubenswrapper[4904]: I1205 20:14:24.668475 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/30fe1bd9-3d47-4269-afae-7d9c7f5ab356-kubelet-dir\") pod \"30fe1bd9-3d47-4269-afae-7d9c7f5ab356\" (UID: \"30fe1bd9-3d47-4269-afae-7d9c7f5ab356\") " Dec 05 20:14:24 crc kubenswrapper[4904]: I1205 20:14:24.668561 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/30fe1bd9-3d47-4269-afae-7d9c7f5ab356-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "30fe1bd9-3d47-4269-afae-7d9c7f5ab356" (UID: "30fe1bd9-3d47-4269-afae-7d9c7f5ab356"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:14:24 crc kubenswrapper[4904]: I1205 20:14:24.668903 4904 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/30fe1bd9-3d47-4269-afae-7d9c7f5ab356-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 05 20:14:24 crc kubenswrapper[4904]: I1205 20:14:24.674350 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30fe1bd9-3d47-4269-afae-7d9c7f5ab356-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "30fe1bd9-3d47-4269-afae-7d9c7f5ab356" (UID: "30fe1bd9-3d47-4269-afae-7d9c7f5ab356"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:14:24 crc kubenswrapper[4904]: I1205 20:14:24.770615 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/30fe1bd9-3d47-4269-afae-7d9c7f5ab356-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 20:14:28 crc kubenswrapper[4904]: I1205 20:14:28.400471 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:14:29 crc kubenswrapper[4904]: I1205 20:14:29.495900 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-pdnb5" Dec 05 20:14:29 crc kubenswrapper[4904]: I1205 20:14:29.499218 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-pdnb5" Dec 05 20:14:29 crc kubenswrapper[4904]: I1205 20:14:29.955816 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:14:29 crc kubenswrapper[4904]: I1205 20:14:29.955866 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:14:40 crc kubenswrapper[4904]: I1205 20:14:40.080766 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cjzcx" Dec 05 20:14:44 crc kubenswrapper[4904]: E1205 20:14:44.975711 4904 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 05 20:14:44 crc kubenswrapper[4904]: E1205 20:14:44.976592 4904 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
Dec 05 20:14:44 crc kubenswrapper[4904]: E1205 20:14:44.976592 4904 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wpjhg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-g4tfw_openshift-marketplace(3aaaf574-a341-4efb-beae-169d69ca32d4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 05 20:14:44 crc kubenswrapper[4904]: E1205 20:14:44.977781 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-g4tfw" podUID="3aaaf574-a341-4efb-beae-169d69ca32d4"
Dec 05 20:14:45 crc kubenswrapper[4904]: E1205 20:14:45.835576 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-g4tfw" podUID="3aaaf574-a341-4efb-beae-169d69ca32d4"
Dec 05 20:14:45 crc kubenswrapper[4904]: E1205 20:14:45.895042 4904 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Dec 05 20:14:45 crc kubenswrapper[4904]: E1205 20:14:45.895228 4904 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wnmw9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-vd9gt_openshift-marketplace(757462be-80d7-44c2-a193-48f78ac5b80e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 20:14:45 crc kubenswrapper[4904]: E1205 20:14:45.896471 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-vd9gt" podUID="757462be-80d7-44c2-a193-48f78ac5b80e" Dec 05 20:14:45 crc kubenswrapper[4904]: E1205 20:14:45.924709 4904 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 05 20:14:45 crc kubenswrapper[4904]: E1205 20:14:45.924864 4904 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2pwcg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-vjj47_openshift-marketplace(9cff387b-9124-46db-8d06-5cb5839e0a12): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 20:14:45 crc kubenswrapper[4904]: E1205 20:14:45.926107 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-vjj47" podUID="9cff387b-9124-46db-8d06-5cb5839e0a12" Dec 05 20:14:47 crc kubenswrapper[4904]: E1205 20:14:47.070416 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-vjj47" podUID="9cff387b-9124-46db-8d06-5cb5839e0a12" Dec 05 20:14:47 crc kubenswrapper[4904]: E1205 20:14:47.070818 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-vd9gt" podUID="757462be-80d7-44c2-a193-48f78ac5b80e" Dec 05 20:14:47 crc kubenswrapper[4904]: E1205 20:14:47.155586 4904 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 05 20:14:47 crc kubenswrapper[4904]: E1205 20:14:47.155752 4904 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vl7gj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-plmk6_openshift-marketplace(9ef986b1-43e3-4c24-87d0-f385d1ae80bc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 20:14:47 crc kubenswrapper[4904]: E1205 20:14:47.157340 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-plmk6" podUID="9ef986b1-43e3-4c24-87d0-f385d1ae80bc" Dec 05 20:14:49 crc kubenswrapper[4904]: I1205 20:14:49.228630 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:14:49 crc kubenswrapper[4904]: E1205 20:14:49.898383 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-plmk6" podUID="9ef986b1-43e3-4c24-87d0-f385d1ae80bc" Dec 05 20:14:49 crc kubenswrapper[4904]: E1205 20:14:49.999782 4904 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 05 20:14:50 crc kubenswrapper[4904]: E1205 20:14:49.999930 4904 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g8zpj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-gpt4n_openshift-marketplace(d5752678-2a85-49a8-b6d4-63b4adb96277): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 20:14:50 crc kubenswrapper[4904]: E1205 20:14:50.001089 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-gpt4n" podUID="d5752678-2a85-49a8-b6d4-63b4adb96277" Dec 05 20:14:50 crc kubenswrapper[4904]: E1205 20:14:50.009045 4904 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 05 20:14:50 crc kubenswrapper[4904]: E1205 20:14:50.009199 4904 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ghsvd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-sr8r7_openshift-marketplace(2ea52ad5-455a-4f4b-a576-fa76c35bcbea): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 20:14:50 crc kubenswrapper[4904]: E1205 20:14:50.010350 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-sr8r7" podUID="2ea52ad5-455a-4f4b-a576-fa76c35bcbea" Dec 05 20:14:50 crc kubenswrapper[4904]: E1205 20:14:50.012521 4904 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 05 20:14:50 crc kubenswrapper[4904]: E1205 20:14:50.012619 4904 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p9z6g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-jrg9c_openshift-marketplace(59e498ac-1ef4-4984-956a-f1d7614d5dba): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 20:14:50 crc kubenswrapper[4904]: E1205 20:14:50.014539 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-jrg9c" podUID="59e498ac-1ef4-4984-956a-f1d7614d5dba" Dec 05 20:14:50 crc kubenswrapper[4904]: E1205 20:14:50.022930 4904 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 05 20:14:50 crc kubenswrapper[4904]: E1205 20:14:50.023130 4904 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d24f9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-87qk8_openshift-marketplace(ed3e3607-e411-47d4-8a20-cbd6623ec89e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 20:14:50 crc kubenswrapper[4904]: E1205 20:14:50.025269 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-87qk8" podUID="ed3e3607-e411-47d4-8a20-cbd6623ec89e" Dec 05 20:14:50 crc kubenswrapper[4904]: I1205 20:14:50.283120 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 05 20:14:50 crc kubenswrapper[4904]: E1205 20:14:50.283346 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30fe1bd9-3d47-4269-afae-7d9c7f5ab356" containerName="pruner" Dec 05 20:14:50 crc kubenswrapper[4904]: I1205 20:14:50.283357 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="30fe1bd9-3d47-4269-afae-7d9c7f5ab356" containerName="pruner" Dec 05 20:14:50 crc kubenswrapper[4904]: I1205 20:14:50.283456 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="30fe1bd9-3d47-4269-afae-7d9c7f5ab356" containerName="pruner" Dec 05 20:14:50 crc kubenswrapper[4904]: I1205 20:14:50.284730 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 20:14:50 crc kubenswrapper[4904]: I1205 20:14:50.286757 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 05 20:14:50 crc kubenswrapper[4904]: I1205 20:14:50.287987 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 05 20:14:50 crc kubenswrapper[4904]: I1205 20:14:50.295368 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 05 20:14:50 crc kubenswrapper[4904]: I1205 20:14:50.325316 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-d8xkk"] Dec 05 20:14:50 crc kubenswrapper[4904]: I1205 20:14:50.410841 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5d4c4648-2bd2-4b3f-831a-36d13fe03a72-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5d4c4648-2bd2-4b3f-831a-36d13fe03a72\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 20:14:50 crc kubenswrapper[4904]: I1205 20:14:50.410921 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5d4c4648-2bd2-4b3f-831a-36d13fe03a72-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5d4c4648-2bd2-4b3f-831a-36d13fe03a72\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 20:14:50 crc kubenswrapper[4904]: I1205 20:14:50.511977 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5d4c4648-2bd2-4b3f-831a-36d13fe03a72-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5d4c4648-2bd2-4b3f-831a-36d13fe03a72\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 20:14:50 crc kubenswrapper[4904]: I1205 20:14:50.512086 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5d4c4648-2bd2-4b3f-831a-36d13fe03a72-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5d4c4648-2bd2-4b3f-831a-36d13fe03a72\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 20:14:50 crc kubenswrapper[4904]: I1205 20:14:50.512184 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5d4c4648-2bd2-4b3f-831a-36d13fe03a72-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5d4c4648-2bd2-4b3f-831a-36d13fe03a72\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 20:14:50 crc kubenswrapper[4904]: I1205 20:14:50.528841 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5d4c4648-2bd2-4b3f-831a-36d13fe03a72-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5d4c4648-2bd2-4b3f-831a-36d13fe03a72\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 20:14:50 crc kubenswrapper[4904]: I1205 20:14:50.611453 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 20:14:50 crc kubenswrapper[4904]: I1205 20:14:50.792903 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-d8xkk" event={"ID":"fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91","Type":"ContainerStarted","Data":"2488e1a0e2ecaad7072f643928dacaa552711e45686a5c6a1f0e0514f43ace64"} Dec 05 20:14:50 crc kubenswrapper[4904]: E1205 20:14:50.794762 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sr8r7" podUID="2ea52ad5-455a-4f4b-a576-fa76c35bcbea" Dec 05 20:14:50 crc kubenswrapper[4904]: E1205 20:14:50.795149 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-87qk8" podUID="ed3e3607-e411-47d4-8a20-cbd6623ec89e" Dec 05 20:14:50 crc kubenswrapper[4904]: E1205 20:14:50.795178 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-jrg9c" podUID="59e498ac-1ef4-4984-956a-f1d7614d5dba" Dec 05 20:14:51 crc kubenswrapper[4904]: I1205 20:14:51.014127 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 05 20:14:51 crc kubenswrapper[4904]: W1205 20:14:51.023985 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5d4c4648_2bd2_4b3f_831a_36d13fe03a72.slice/crio-1d62b4a47dafe2972e93be9da0b8d16158965ef16ab4a193bb5c0b8b908bcf06 WatchSource:0}: Error finding container 1d62b4a47dafe2972e93be9da0b8d16158965ef16ab4a193bb5c0b8b908bcf06: Status 404 returned error can't find the container with id 1d62b4a47dafe2972e93be9da0b8d16158965ef16ab4a193bb5c0b8b908bcf06 Dec 05 20:14:51 crc kubenswrapper[4904]: I1205 20:14:51.801894 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-d8xkk" event={"ID":"fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91","Type":"ContainerStarted","Data":"9da0d87c65881891e687e20013ac8cc6ff013416c2050f36d03c1bb23698b087"} Dec 05 20:14:51 crc kubenswrapper[4904]: I1205 20:14:51.802311 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-d8xkk" event={"ID":"fb149bcb-a978-4b2b-8dea-8f8ce0fd3d91","Type":"ContainerStarted","Data":"9abf4e1da2f53a0ff8a7987c7de8d048f396c99d6ec8aa8d63acfeb03d541935"} Dec 05 20:14:51 crc kubenswrapper[4904]: I1205 20:14:51.805181 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5d4c4648-2bd2-4b3f-831a-36d13fe03a72","Type":"ContainerStarted","Data":"8233c81fed8e64ed369eec3c09d2953ff30166efd35f6715bdb69bb70ee2090f"} Dec 05 20:14:51 crc kubenswrapper[4904]: I1205 20:14:51.805228 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5d4c4648-2bd2-4b3f-831a-36d13fe03a72","Type":"ContainerStarted","Data":"1d62b4a47dafe2972e93be9da0b8d16158965ef16ab4a193bb5c0b8b908bcf06"} Dec 05 20:14:51 crc 
kubenswrapper[4904]: I1205 20:14:51.840577 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-d8xkk" podStartSLOduration=170.840553682 podStartE2EDuration="2m50.840553682s" podCreationTimestamp="2025-12-05 20:12:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:14:51.826780025 +0000 UTC m=+190.637996214" watchObservedRunningTime="2025-12-05 20:14:51.840553682 +0000 UTC m=+190.651769831"
Dec 05 20:14:51 crc kubenswrapper[4904]: I1205 20:14:51.851275 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=1.8512588399999999 podStartE2EDuration="1.85125884s" podCreationTimestamp="2025-12-05 20:14:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:14:51.84463883 +0000 UTC m=+190.655854949" watchObservedRunningTime="2025-12-05 20:14:51.85125884 +0000 UTC m=+190.662474979"
Dec 05 20:14:52 crc kubenswrapper[4904]: I1205 20:14:52.811585 4904 generic.go:334] "Generic (PLEG): container finished" podID="5d4c4648-2bd2-4b3f-831a-36d13fe03a72" containerID="8233c81fed8e64ed369eec3c09d2953ff30166efd35f6715bdb69bb70ee2090f" exitCode=0
Dec 05 20:14:52 crc kubenswrapper[4904]: I1205 20:14:52.811654 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5d4c4648-2bd2-4b3f-831a-36d13fe03a72","Type":"ContainerDied","Data":"8233c81fed8e64ed369eec3c09d2953ff30166efd35f6715bdb69bb70ee2090f"}
Dec 05 20:14:54 crc kubenswrapper[4904]: I1205 20:14:54.057311 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 05 20:14:54 crc kubenswrapper[4904]: I1205 20:14:54.155006 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5d4c4648-2bd2-4b3f-831a-36d13fe03a72-kubelet-dir\") pod \"5d4c4648-2bd2-4b3f-831a-36d13fe03a72\" (UID: \"5d4c4648-2bd2-4b3f-831a-36d13fe03a72\") "
Dec 05 20:14:54 crc kubenswrapper[4904]: I1205 20:14:54.155087 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5d4c4648-2bd2-4b3f-831a-36d13fe03a72-kube-api-access\") pod \"5d4c4648-2bd2-4b3f-831a-36d13fe03a72\" (UID: \"5d4c4648-2bd2-4b3f-831a-36d13fe03a72\") "
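Annotation: in the "Observed pod startup duration" entries above, podStartSLOduration is watchObservedRunningTime minus podCreationTimestamp (20:14:51.840553682 - 20:12:01 = 170.840553682s, i.e. the reported "2m50.840553682s"), and the zeroed 0001-01-01 pull timestamps mean no image pull was recorded for the pod. Recomputing the delta from the logged values (Python datetimes carry microseconds, so the nanosecond digits are truncated):

    from datetime import datetime

    created = datetime.fromisoformat("2025-12-05 20:12:01+00:00")
    running = datetime.fromisoformat("2025-12-05 20:14:51.840553+00:00")
    print((running - created).total_seconds())  # 170.840553, matching podStartSLOduration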
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:14:54 crc kubenswrapper[4904]: I1205 20:14:54.155397 4904 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5d4c4648-2bd2-4b3f-831a-36d13fe03a72-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 05 20:14:54 crc kubenswrapper[4904]: I1205 20:14:54.160717 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d4c4648-2bd2-4b3f-831a-36d13fe03a72-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5d4c4648-2bd2-4b3f-831a-36d13fe03a72" (UID: "5d4c4648-2bd2-4b3f-831a-36d13fe03a72"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:14:54 crc kubenswrapper[4904]: I1205 20:14:54.256965 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5d4c4648-2bd2-4b3f-831a-36d13fe03a72-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 20:14:54 crc kubenswrapper[4904]: I1205 20:14:54.821983 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5d4c4648-2bd2-4b3f-831a-36d13fe03a72","Type":"ContainerDied","Data":"1d62b4a47dafe2972e93be9da0b8d16158965ef16ab4a193bb5c0b8b908bcf06"} Dec 05 20:14:54 crc kubenswrapper[4904]: I1205 20:14:54.822027 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 20:14:54 crc kubenswrapper[4904]: I1205 20:14:54.822022 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d62b4a47dafe2972e93be9da0b8d16158965ef16ab4a193bb5c0b8b908bcf06" Dec 05 20:14:57 crc kubenswrapper[4904]: I1205 20:14:57.076533 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 05 20:14:57 crc kubenswrapper[4904]: E1205 20:14:57.076963 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d4c4648-2bd2-4b3f-831a-36d13fe03a72" containerName="pruner" Dec 05 20:14:57 crc kubenswrapper[4904]: I1205 20:14:57.076975 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d4c4648-2bd2-4b3f-831a-36d13fe03a72" containerName="pruner" Dec 05 20:14:57 crc kubenswrapper[4904]: I1205 20:14:57.077082 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d4c4648-2bd2-4b3f-831a-36d13fe03a72" containerName="pruner" Dec 05 20:14:57 crc kubenswrapper[4904]: I1205 20:14:57.077454 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:14:57 crc kubenswrapper[4904]: I1205 20:14:57.083764 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 05 20:14:57 crc kubenswrapper[4904]: I1205 20:14:57.083785 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 05 20:14:57 crc kubenswrapper[4904]: I1205 20:14:57.093606 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 05 20:14:57 crc kubenswrapper[4904]: I1205 20:14:57.190924 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/01612d5b-cc85-4376-a8a4-c1fbe2ccabd9-kubelet-dir\") pod \"installer-9-crc\" (UID: \"01612d5b-cc85-4376-a8a4-c1fbe2ccabd9\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:14:57 crc kubenswrapper[4904]: I1205 20:14:57.190961 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/01612d5b-cc85-4376-a8a4-c1fbe2ccabd9-var-lock\") pod \"installer-9-crc\" (UID: \"01612d5b-cc85-4376-a8a4-c1fbe2ccabd9\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:14:57 crc kubenswrapper[4904]: I1205 20:14:57.190987 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/01612d5b-cc85-4376-a8a4-c1fbe2ccabd9-kube-api-access\") pod \"installer-9-crc\" (UID: \"01612d5b-cc85-4376-a8a4-c1fbe2ccabd9\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:14:57 crc kubenswrapper[4904]: I1205 20:14:57.292350 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/01612d5b-cc85-4376-a8a4-c1fbe2ccabd9-kubelet-dir\") pod \"installer-9-crc\" (UID: \"01612d5b-cc85-4376-a8a4-c1fbe2ccabd9\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:14:57 crc kubenswrapper[4904]: I1205 20:14:57.292397 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/01612d5b-cc85-4376-a8a4-c1fbe2ccabd9-var-lock\") pod \"installer-9-crc\" (UID: \"01612d5b-cc85-4376-a8a4-c1fbe2ccabd9\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:14:57 crc kubenswrapper[4904]: I1205 20:14:57.292425 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/01612d5b-cc85-4376-a8a4-c1fbe2ccabd9-kube-api-access\") pod \"installer-9-crc\" (UID: \"01612d5b-cc85-4376-a8a4-c1fbe2ccabd9\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:14:57 crc kubenswrapper[4904]: I1205 20:14:57.292539 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/01612d5b-cc85-4376-a8a4-c1fbe2ccabd9-kubelet-dir\") pod \"installer-9-crc\" (UID: \"01612d5b-cc85-4376-a8a4-c1fbe2ccabd9\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:14:57 crc kubenswrapper[4904]: I1205 20:14:57.292608 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/01612d5b-cc85-4376-a8a4-c1fbe2ccabd9-var-lock\") pod \"installer-9-crc\" (UID: 
\"01612d5b-cc85-4376-a8a4-c1fbe2ccabd9\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:14:57 crc kubenswrapper[4904]: I1205 20:14:57.309867 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/01612d5b-cc85-4376-a8a4-c1fbe2ccabd9-kube-api-access\") pod \"installer-9-crc\" (UID: \"01612d5b-cc85-4376-a8a4-c1fbe2ccabd9\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:14:57 crc kubenswrapper[4904]: I1205 20:14:57.398872 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:14:57 crc kubenswrapper[4904]: I1205 20:14:57.826790 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 05 20:14:58 crc kubenswrapper[4904]: I1205 20:14:58.844748 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"01612d5b-cc85-4376-a8a4-c1fbe2ccabd9","Type":"ContainerStarted","Data":"11c3fb8aab8fb03ee3058cc38f7911521e7aea209a63e9b7ddff1afd7a72e300"} Dec 05 20:14:58 crc kubenswrapper[4904]: I1205 20:14:58.845071 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"01612d5b-cc85-4376-a8a4-c1fbe2ccabd9","Type":"ContainerStarted","Data":"12a3952f587f09f806145f7b7303b190ebf667b5f3b8b9800ccf915c39c7b7dd"} Dec 05 20:14:58 crc kubenswrapper[4904]: I1205 20:14:58.863492 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=1.863475998 podStartE2EDuration="1.863475998s" podCreationTimestamp="2025-12-05 20:14:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:14:58.861633946 +0000 UTC m=+197.672850055" watchObservedRunningTime="2025-12-05 20:14:58.863475998 +0000 UTC m=+197.674692117" Dec 05 20:14:58 crc kubenswrapper[4904]: I1205 20:14:58.998413 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-s87dr"] Dec 05 20:14:59 crc kubenswrapper[4904]: I1205 20:14:59.956106 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:14:59 crc kubenswrapper[4904]: I1205 20:14:59.956412 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:15:00 crc kubenswrapper[4904]: I1205 20:15:00.128613 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416095-g7dzx"] Dec 05 20:15:00 crc kubenswrapper[4904]: I1205 20:15:00.129287 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-g7dzx" Dec 05 20:15:00 crc kubenswrapper[4904]: I1205 20:15:00.131230 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 20:15:00 crc kubenswrapper[4904]: I1205 20:15:00.131293 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 20:15:00 crc kubenswrapper[4904]: I1205 20:15:00.142400 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416095-g7dzx"] Dec 05 20:15:00 crc kubenswrapper[4904]: I1205 20:15:00.227881 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b269\" (UniqueName: \"kubernetes.io/projected/28256b97-09a9-4ddf-a730-d2fc4d926310-kube-api-access-5b269\") pod \"collect-profiles-29416095-g7dzx\" (UID: \"28256b97-09a9-4ddf-a730-d2fc4d926310\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-g7dzx" Dec 05 20:15:00 crc kubenswrapper[4904]: I1205 20:15:00.228012 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/28256b97-09a9-4ddf-a730-d2fc4d926310-secret-volume\") pod \"collect-profiles-29416095-g7dzx\" (UID: \"28256b97-09a9-4ddf-a730-d2fc4d926310\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-g7dzx" Dec 05 20:15:00 crc kubenswrapper[4904]: I1205 20:15:00.228075 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28256b97-09a9-4ddf-a730-d2fc4d926310-config-volume\") pod \"collect-profiles-29416095-g7dzx\" (UID: \"28256b97-09a9-4ddf-a730-d2fc4d926310\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-g7dzx" Dec 05 20:15:00 crc kubenswrapper[4904]: I1205 20:15:00.330007 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5b269\" (UniqueName: \"kubernetes.io/projected/28256b97-09a9-4ddf-a730-d2fc4d926310-kube-api-access-5b269\") pod \"collect-profiles-29416095-g7dzx\" (UID: \"28256b97-09a9-4ddf-a730-d2fc4d926310\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-g7dzx" Dec 05 20:15:00 crc kubenswrapper[4904]: I1205 20:15:00.330096 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/28256b97-09a9-4ddf-a730-d2fc4d926310-secret-volume\") pod \"collect-profiles-29416095-g7dzx\" (UID: \"28256b97-09a9-4ddf-a730-d2fc4d926310\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-g7dzx" Dec 05 20:15:00 crc kubenswrapper[4904]: I1205 20:15:00.331152 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28256b97-09a9-4ddf-a730-d2fc4d926310-config-volume\") pod \"collect-profiles-29416095-g7dzx\" (UID: \"28256b97-09a9-4ddf-a730-d2fc4d926310\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-g7dzx" Dec 05 20:15:00 crc kubenswrapper[4904]: I1205 20:15:00.337430 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/28256b97-09a9-4ddf-a730-d2fc4d926310-secret-volume\") pod 
\"collect-profiles-29416095-g7dzx\" (UID: \"28256b97-09a9-4ddf-a730-d2fc4d926310\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-g7dzx" Dec 05 20:15:00 crc kubenswrapper[4904]: I1205 20:15:00.339585 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28256b97-09a9-4ddf-a730-d2fc4d926310-config-volume\") pod \"collect-profiles-29416095-g7dzx\" (UID: \"28256b97-09a9-4ddf-a730-d2fc4d926310\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-g7dzx" Dec 05 20:15:00 crc kubenswrapper[4904]: I1205 20:15:00.346641 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5b269\" (UniqueName: \"kubernetes.io/projected/28256b97-09a9-4ddf-a730-d2fc4d926310-kube-api-access-5b269\") pod \"collect-profiles-29416095-g7dzx\" (UID: \"28256b97-09a9-4ddf-a730-d2fc4d926310\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-g7dzx" Dec 05 20:15:00 crc kubenswrapper[4904]: I1205 20:15:00.490790 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-g7dzx" Dec 05 20:15:00 crc kubenswrapper[4904]: I1205 20:15:00.857220 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vjj47" event={"ID":"9cff387b-9124-46db-8d06-5cb5839e0a12","Type":"ContainerStarted","Data":"fa7b902a3f95d490bd72cc5c5a624342cd0c12293084bb3758bdb0fe8f222b37"} Dec 05 20:15:00 crc kubenswrapper[4904]: I1205 20:15:00.865543 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416095-g7dzx"] Dec 05 20:15:00 crc kubenswrapper[4904]: W1205 20:15:00.874158 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28256b97_09a9_4ddf_a730_d2fc4d926310.slice/crio-5e771b333c0b1db5b482d29c2094884a3a9cd15dcd01f23ba1167021ce08688a WatchSource:0}: Error finding container 5e771b333c0b1db5b482d29c2094884a3a9cd15dcd01f23ba1167021ce08688a: Status 404 returned error can't find the container with id 5e771b333c0b1db5b482d29c2094884a3a9cd15dcd01f23ba1167021ce08688a Dec 05 20:15:01 crc kubenswrapper[4904]: I1205 20:15:01.863111 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-g7dzx" event={"ID":"28256b97-09a9-4ddf-a730-d2fc4d926310","Type":"ContainerStarted","Data":"343df926b25a4e8e58ab4ee6a22a996cc3bc6c6988a24dc4b0ef6591839eaebe"} Dec 05 20:15:01 crc kubenswrapper[4904]: I1205 20:15:01.863437 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-g7dzx" event={"ID":"28256b97-09a9-4ddf-a730-d2fc4d926310","Type":"ContainerStarted","Data":"5e771b333c0b1db5b482d29c2094884a3a9cd15dcd01f23ba1167021ce08688a"} Dec 05 20:15:01 crc kubenswrapper[4904]: I1205 20:15:01.864841 4904 generic.go:334] "Generic (PLEG): container finished" podID="9cff387b-9124-46db-8d06-5cb5839e0a12" containerID="fa7b902a3f95d490bd72cc5c5a624342cd0c12293084bb3758bdb0fe8f222b37" exitCode=0 Dec 05 20:15:01 crc kubenswrapper[4904]: I1205 20:15:01.864876 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vjj47" event={"ID":"9cff387b-9124-46db-8d06-5cb5839e0a12","Type":"ContainerDied","Data":"fa7b902a3f95d490bd72cc5c5a624342cd0c12293084bb3758bdb0fe8f222b37"} Dec 05 20:15:02 
crc kubenswrapper[4904]: I1205 20:15:02.799562 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cncqb"] Dec 05 20:15:02 crc kubenswrapper[4904]: I1205 20:15:02.800093 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-cncqb" podUID="834a438d-9f7d-4707-b790-8ce136081f7c" containerName="controller-manager" containerID="cri-o://acb3556fb62ff946585e1995e3c8f63725adda35d0a265de073d8873325f8203" gracePeriod=30 Dec 05 20:15:02 crc kubenswrapper[4904]: I1205 20:15:02.873657 4904 generic.go:334] "Generic (PLEG): container finished" podID="757462be-80d7-44c2-a193-48f78ac5b80e" containerID="909e6d29ffcf7dea47082b5e36524c01b3a9b2228469f62cff1c9bbdc08c289e" exitCode=0 Dec 05 20:15:02 crc kubenswrapper[4904]: I1205 20:15:02.873733 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vd9gt" event={"ID":"757462be-80d7-44c2-a193-48f78ac5b80e","Type":"ContainerDied","Data":"909e6d29ffcf7dea47082b5e36524c01b3a9b2228469f62cff1c9bbdc08c289e"} Dec 05 20:15:02 crc kubenswrapper[4904]: I1205 20:15:02.877982 4904 generic.go:334] "Generic (PLEG): container finished" podID="28256b97-09a9-4ddf-a730-d2fc4d926310" containerID="343df926b25a4e8e58ab4ee6a22a996cc3bc6c6988a24dc4b0ef6591839eaebe" exitCode=0 Dec 05 20:15:02 crc kubenswrapper[4904]: I1205 20:15:02.878161 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-g7dzx" event={"ID":"28256b97-09a9-4ddf-a730-d2fc4d926310","Type":"ContainerDied","Data":"343df926b25a4e8e58ab4ee6a22a996cc3bc6c6988a24dc4b0ef6591839eaebe"} Dec 05 20:15:02 crc kubenswrapper[4904]: I1205 20:15:02.891631 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vjj47" event={"ID":"9cff387b-9124-46db-8d06-5cb5839e0a12","Type":"ContainerStarted","Data":"3857454a603debc7ed441d64d6ea69c0a82233e06ca0962436b9b55840218031"} Dec 05 20:15:02 crc kubenswrapper[4904]: I1205 20:15:02.898202 4904 generic.go:334] "Generic (PLEG): container finished" podID="3aaaf574-a341-4efb-beae-169d69ca32d4" containerID="beb8ef10802bde44a616822095c3c115322d6837446db6f5b0bdeaf4b0ac345d" exitCode=0 Dec 05 20:15:02 crc kubenswrapper[4904]: I1205 20:15:02.898245 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g4tfw" event={"ID":"3aaaf574-a341-4efb-beae-169d69ca32d4","Type":"ContainerDied","Data":"beb8ef10802bde44a616822095c3c115322d6837446db6f5b0bdeaf4b0ac345d"} Dec 05 20:15:02 crc kubenswrapper[4904]: I1205 20:15:02.920107 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-j9gnc"] Dec 05 20:15:02 crc kubenswrapper[4904]: I1205 20:15:02.920386 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j9gnc" podUID="eb4b4c10-971e-4766-b632-9f710ec547a6" containerName="route-controller-manager" containerID="cri-o://8904060565b040d35f64808edfb159781a589227443fd9d003561f9dd3d83f97" gracePeriod=30 Dec 05 20:15:02 crc kubenswrapper[4904]: I1205 20:15:02.948896 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vjj47" podStartSLOduration=3.607595359 podStartE2EDuration="55.948880542s" podCreationTimestamp="2025-12-05 20:14:07 
+0000 UTC" firstStartedPulling="2025-12-05 20:14:10.189753471 +0000 UTC m=+149.000969580" lastFinishedPulling="2025-12-05 20:15:02.531038654 +0000 UTC m=+201.342254763" observedRunningTime="2025-12-05 20:15:02.948655067 +0000 UTC m=+201.759871196" watchObservedRunningTime="2025-12-05 20:15:02.948880542 +0000 UTC m=+201.760096651" Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.211563 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-cncqb" Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.273538 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/834a438d-9f7d-4707-b790-8ce136081f7c-client-ca\") pod \"834a438d-9f7d-4707-b790-8ce136081f7c\" (UID: \"834a438d-9f7d-4707-b790-8ce136081f7c\") " Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.273769 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/834a438d-9f7d-4707-b790-8ce136081f7c-serving-cert\") pod \"834a438d-9f7d-4707-b790-8ce136081f7c\" (UID: \"834a438d-9f7d-4707-b790-8ce136081f7c\") " Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.273797 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/834a438d-9f7d-4707-b790-8ce136081f7c-config\") pod \"834a438d-9f7d-4707-b790-8ce136081f7c\" (UID: \"834a438d-9f7d-4707-b790-8ce136081f7c\") " Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.273862 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcw4c\" (UniqueName: \"kubernetes.io/projected/834a438d-9f7d-4707-b790-8ce136081f7c-kube-api-access-gcw4c\") pod \"834a438d-9f7d-4707-b790-8ce136081f7c\" (UID: \"834a438d-9f7d-4707-b790-8ce136081f7c\") " Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.274530 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/834a438d-9f7d-4707-b790-8ce136081f7c-client-ca" (OuterVolumeSpecName: "client-ca") pod "834a438d-9f7d-4707-b790-8ce136081f7c" (UID: "834a438d-9f7d-4707-b790-8ce136081f7c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.274638 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/834a438d-9f7d-4707-b790-8ce136081f7c-proxy-ca-bundles\") pod \"834a438d-9f7d-4707-b790-8ce136081f7c\" (UID: \"834a438d-9f7d-4707-b790-8ce136081f7c\") " Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.281142 4904 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/834a438d-9f7d-4707-b790-8ce136081f7c-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.284041 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/834a438d-9f7d-4707-b790-8ce136081f7c-config" (OuterVolumeSpecName: "config") pod "834a438d-9f7d-4707-b790-8ce136081f7c" (UID: "834a438d-9f7d-4707-b790-8ce136081f7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.285976 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/834a438d-9f7d-4707-b790-8ce136081f7c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "834a438d-9f7d-4707-b790-8ce136081f7c" (UID: "834a438d-9f7d-4707-b790-8ce136081f7c"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.290621 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/834a438d-9f7d-4707-b790-8ce136081f7c-kube-api-access-gcw4c" (OuterVolumeSpecName: "kube-api-access-gcw4c") pod "834a438d-9f7d-4707-b790-8ce136081f7c" (UID: "834a438d-9f7d-4707-b790-8ce136081f7c"). InnerVolumeSpecName "kube-api-access-gcw4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.292635 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/834a438d-9f7d-4707-b790-8ce136081f7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "834a438d-9f7d-4707-b790-8ce136081f7c" (UID: "834a438d-9f7d-4707-b790-8ce136081f7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.382954 4904 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/834a438d-9f7d-4707-b790-8ce136081f7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.383014 4904 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/834a438d-9f7d-4707-b790-8ce136081f7c-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.383030 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcw4c\" (UniqueName: \"kubernetes.io/projected/834a438d-9f7d-4707-b790-8ce136081f7c-kube-api-access-gcw4c\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.383083 4904 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/834a438d-9f7d-4707-b790-8ce136081f7c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.416379 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j9gnc" Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.483570 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eb4b4c10-971e-4766-b632-9f710ec547a6-client-ca\") pod \"eb4b4c10-971e-4766-b632-9f710ec547a6\" (UID: \"eb4b4c10-971e-4766-b632-9f710ec547a6\") " Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.483650 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhnpg\" (UniqueName: \"kubernetes.io/projected/eb4b4c10-971e-4766-b632-9f710ec547a6-kube-api-access-fhnpg\") pod \"eb4b4c10-971e-4766-b632-9f710ec547a6\" (UID: \"eb4b4c10-971e-4766-b632-9f710ec547a6\") " Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.483692 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb4b4c10-971e-4766-b632-9f710ec547a6-config\") pod \"eb4b4c10-971e-4766-b632-9f710ec547a6\" (UID: \"eb4b4c10-971e-4766-b632-9f710ec547a6\") " Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.483718 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb4b4c10-971e-4766-b632-9f710ec547a6-serving-cert\") pod \"eb4b4c10-971e-4766-b632-9f710ec547a6\" (UID: \"eb4b4c10-971e-4766-b632-9f710ec547a6\") " Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.503591 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb4b4c10-971e-4766-b632-9f710ec547a6-client-ca" (OuterVolumeSpecName: "client-ca") pod "eb4b4c10-971e-4766-b632-9f710ec547a6" (UID: "eb4b4c10-971e-4766-b632-9f710ec547a6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.502366 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb4b4c10-971e-4766-b632-9f710ec547a6-config" (OuterVolumeSpecName: "config") pod "eb4b4c10-971e-4766-b632-9f710ec547a6" (UID: "eb4b4c10-971e-4766-b632-9f710ec547a6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.505538 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb4b4c10-971e-4766-b632-9f710ec547a6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "eb4b4c10-971e-4766-b632-9f710ec547a6" (UID: "eb4b4c10-971e-4766-b632-9f710ec547a6"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.521296 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb4b4c10-971e-4766-b632-9f710ec547a6-kube-api-access-fhnpg" (OuterVolumeSpecName: "kube-api-access-fhnpg") pod "eb4b4c10-971e-4766-b632-9f710ec547a6" (UID: "eb4b4c10-971e-4766-b632-9f710ec547a6"). InnerVolumeSpecName "kube-api-access-fhnpg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.585375 4904 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eb4b4c10-971e-4766-b632-9f710ec547a6-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.585412 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhnpg\" (UniqueName: \"kubernetes.io/projected/eb4b4c10-971e-4766-b632-9f710ec547a6-kube-api-access-fhnpg\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.585425 4904 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb4b4c10-971e-4766-b632-9f710ec547a6-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.585433 4904 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb4b4c10-971e-4766-b632-9f710ec547a6-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.904549 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-87qk8" event={"ID":"ed3e3607-e411-47d4-8a20-cbd6623ec89e","Type":"ContainerStarted","Data":"9b71ed19dcdc77a65753fb99d5e5e18b9b1751fe81374aab28e6d61b7727514f"} Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.906851 4904 generic.go:334] "Generic (PLEG): container finished" podID="eb4b4c10-971e-4766-b632-9f710ec547a6" containerID="8904060565b040d35f64808edfb159781a589227443fd9d003561f9dd3d83f97" exitCode=0 Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.907029 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j9gnc" event={"ID":"eb4b4c10-971e-4766-b632-9f710ec547a6","Type":"ContainerDied","Data":"8904060565b040d35f64808edfb159781a589227443fd9d003561f9dd3d83f97"} Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.907272 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j9gnc" event={"ID":"eb4b4c10-971e-4766-b632-9f710ec547a6","Type":"ContainerDied","Data":"a8cd5364e3c549b9a1db98a9f1a16e750633bd970396714d9bd256a95a371d22"} Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.907292 4904 scope.go:117] "RemoveContainer" containerID="8904060565b040d35f64808edfb159781a589227443fd9d003561f9dd3d83f97" Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.907103 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j9gnc" Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.909921 4904 generic.go:334] "Generic (PLEG): container finished" podID="834a438d-9f7d-4707-b790-8ce136081f7c" containerID="acb3556fb62ff946585e1995e3c8f63725adda35d0a265de073d8873325f8203" exitCode=0 Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.909971 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-cncqb" event={"ID":"834a438d-9f7d-4707-b790-8ce136081f7c","Type":"ContainerDied","Data":"acb3556fb62ff946585e1995e3c8f63725adda35d0a265de073d8873325f8203"} Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.909997 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-cncqb" event={"ID":"834a438d-9f7d-4707-b790-8ce136081f7c","Type":"ContainerDied","Data":"52ab922c318333b9aa6ad379b5508622f6f1ed384f1f7737d590b5ae1c26fcd4"} Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.910091 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-cncqb" Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.912771 4904 generic.go:334] "Generic (PLEG): container finished" podID="2ea52ad5-455a-4f4b-a576-fa76c35bcbea" containerID="b1b2fe9492212318d9bda408230a8dbdfdc493c061b9e5275f9bd08381363cf3" exitCode=0 Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.912809 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sr8r7" event={"ID":"2ea52ad5-455a-4f4b-a576-fa76c35bcbea","Type":"ContainerDied","Data":"b1b2fe9492212318d9bda408230a8dbdfdc493c061b9e5275f9bd08381363cf3"} Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.915004 4904 generic.go:334] "Generic (PLEG): container finished" podID="d5752678-2a85-49a8-b6d4-63b4adb96277" containerID="8698adf4ac48650baf91fc4a63c4a667581bde2696ad6717185d30d7252cf2e8" exitCode=0 Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.915070 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gpt4n" event={"ID":"d5752678-2a85-49a8-b6d4-63b4adb96277","Type":"ContainerDied","Data":"8698adf4ac48650baf91fc4a63c4a667581bde2696ad6717185d30d7252cf2e8"} Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.922729 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jrg9c" event={"ID":"59e498ac-1ef4-4984-956a-f1d7614d5dba","Type":"ContainerStarted","Data":"8332572732ecea8e6f6d80aaf0125377b310c24871765e16a690097631a1586f"} Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.930625 4904 scope.go:117] "RemoveContainer" containerID="8904060565b040d35f64808edfb159781a589227443fd9d003561f9dd3d83f97" Dec 05 20:15:03 crc kubenswrapper[4904]: E1205 20:15:03.931043 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8904060565b040d35f64808edfb159781a589227443fd9d003561f9dd3d83f97\": container with ID starting with 8904060565b040d35f64808edfb159781a589227443fd9d003561f9dd3d83f97 not found: ID does not exist" containerID="8904060565b040d35f64808edfb159781a589227443fd9d003561f9dd3d83f97" Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.931134 4904 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8904060565b040d35f64808edfb159781a589227443fd9d003561f9dd3d83f97"} err="failed to get container status \"8904060565b040d35f64808edfb159781a589227443fd9d003561f9dd3d83f97\": rpc error: code = NotFound desc = could not find container \"8904060565b040d35f64808edfb159781a589227443fd9d003561f9dd3d83f97\": container with ID starting with 8904060565b040d35f64808edfb159781a589227443fd9d003561f9dd3d83f97 not found: ID does not exist" Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.931177 4904 scope.go:117] "RemoveContainer" containerID="acb3556fb62ff946585e1995e3c8f63725adda35d0a265de073d8873325f8203" Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.951137 4904 scope.go:117] "RemoveContainer" containerID="acb3556fb62ff946585e1995e3c8f63725adda35d0a265de073d8873325f8203" Dec 05 20:15:03 crc kubenswrapper[4904]: E1205 20:15:03.951606 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acb3556fb62ff946585e1995e3c8f63725adda35d0a265de073d8873325f8203\": container with ID starting with acb3556fb62ff946585e1995e3c8f63725adda35d0a265de073d8873325f8203 not found: ID does not exist" containerID="acb3556fb62ff946585e1995e3c8f63725adda35d0a265de073d8873325f8203" Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.951654 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acb3556fb62ff946585e1995e3c8f63725adda35d0a265de073d8873325f8203"} err="failed to get container status \"acb3556fb62ff946585e1995e3c8f63725adda35d0a265de073d8873325f8203\": rpc error: code = NotFound desc = could not find container \"acb3556fb62ff946585e1995e3c8f63725adda35d0a265de073d8873325f8203\": container with ID starting with acb3556fb62ff946585e1995e3c8f63725adda35d0a265de073d8873325f8203 not found: ID does not exist" Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.992104 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cncqb"] Dec 05 20:15:03 crc kubenswrapper[4904]: I1205 20:15:03.994747 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cncqb"] Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.032859 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-j9gnc"] Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.035782 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-j9gnc"] Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.157981 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67485fb985-q9s95"] Dec 05 20:15:04 crc kubenswrapper[4904]: E1205 20:15:04.158528 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb4b4c10-971e-4766-b632-9f710ec547a6" containerName="route-controller-manager" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.158573 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb4b4c10-971e-4766-b632-9f710ec547a6" containerName="route-controller-manager" Dec 05 20:15:04 crc kubenswrapper[4904]: E1205 20:15:04.158587 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="834a438d-9f7d-4707-b790-8ce136081f7c" containerName="controller-manager" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.158596 4904 
state_mem.go:107] "Deleted CPUSet assignment" podUID="834a438d-9f7d-4707-b790-8ce136081f7c" containerName="controller-manager" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.158741 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb4b4c10-971e-4766-b632-9f710ec547a6" containerName="route-controller-manager" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.158756 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="834a438d-9f7d-4707-b790-8ce136081f7c" containerName="controller-manager" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.159274 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67485fb985-q9s95" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.162518 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.162921 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.163095 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.163222 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.163349 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.163529 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.167251 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-fbc996c8d-z4fml"] Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.168316 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-fbc996c8d-z4fml" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.170803 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.170955 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.171807 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.171957 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.172191 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.173835 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.176276 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67485fb985-q9s95"] Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.180877 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.185349 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-fbc996c8d-z4fml"] Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.228138 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-g7dzx" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.294020 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/28256b97-09a9-4ddf-a730-d2fc4d926310-secret-volume\") pod \"28256b97-09a9-4ddf-a730-d2fc4d926310\" (UID: \"28256b97-09a9-4ddf-a730-d2fc4d926310\") " Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.294254 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5b269\" (UniqueName: \"kubernetes.io/projected/28256b97-09a9-4ddf-a730-d2fc4d926310-kube-api-access-5b269\") pod \"28256b97-09a9-4ddf-a730-d2fc4d926310\" (UID: \"28256b97-09a9-4ddf-a730-d2fc4d926310\") " Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.294297 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28256b97-09a9-4ddf-a730-d2fc4d926310-config-volume\") pod \"28256b97-09a9-4ddf-a730-d2fc4d926310\" (UID: \"28256b97-09a9-4ddf-a730-d2fc4d926310\") " Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.294539 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1b936de0-3def-401f-9706-2d10eba86b7b-proxy-ca-bundles\") pod \"controller-manager-fbc996c8d-z4fml\" (UID: \"1b936de0-3def-401f-9706-2d10eba86b7b\") " pod="openshift-controller-manager/controller-manager-fbc996c8d-z4fml" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.294582 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9-client-ca\") pod \"route-controller-manager-67485fb985-q9s95\" (UID: \"afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9\") " pod="openshift-route-controller-manager/route-controller-manager-67485fb985-q9s95" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.294610 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9-serving-cert\") pod \"route-controller-manager-67485fb985-q9s95\" (UID: \"afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9\") " pod="openshift-route-controller-manager/route-controller-manager-67485fb985-q9s95" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.294646 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9-config\") pod \"route-controller-manager-67485fb985-q9s95\" (UID: \"afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9\") " pod="openshift-route-controller-manager/route-controller-manager-67485fb985-q9s95" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.294673 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7nmn\" (UniqueName: \"kubernetes.io/projected/1b936de0-3def-401f-9706-2d10eba86b7b-kube-api-access-m7nmn\") pod \"controller-manager-fbc996c8d-z4fml\" (UID: \"1b936de0-3def-401f-9706-2d10eba86b7b\") " pod="openshift-controller-manager/controller-manager-fbc996c8d-z4fml" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.294706 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-jptjx\" (UniqueName: \"kubernetes.io/projected/afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9-kube-api-access-jptjx\") pod \"route-controller-manager-67485fb985-q9s95\" (UID: \"afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9\") " pod="openshift-route-controller-manager/route-controller-manager-67485fb985-q9s95" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.294734 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b936de0-3def-401f-9706-2d10eba86b7b-serving-cert\") pod \"controller-manager-fbc996c8d-z4fml\" (UID: \"1b936de0-3def-401f-9706-2d10eba86b7b\") " pod="openshift-controller-manager/controller-manager-fbc996c8d-z4fml" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.294757 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b936de0-3def-401f-9706-2d10eba86b7b-config\") pod \"controller-manager-fbc996c8d-z4fml\" (UID: \"1b936de0-3def-401f-9706-2d10eba86b7b\") " pod="openshift-controller-manager/controller-manager-fbc996c8d-z4fml" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.294788 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b936de0-3def-401f-9706-2d10eba86b7b-client-ca\") pod \"controller-manager-fbc996c8d-z4fml\" (UID: \"1b936de0-3def-401f-9706-2d10eba86b7b\") " pod="openshift-controller-manager/controller-manager-fbc996c8d-z4fml" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.296027 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28256b97-09a9-4ddf-a730-d2fc4d926310-config-volume" (OuterVolumeSpecName: "config-volume") pod "28256b97-09a9-4ddf-a730-d2fc4d926310" (UID: "28256b97-09a9-4ddf-a730-d2fc4d926310"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.301422 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28256b97-09a9-4ddf-a730-d2fc4d926310-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "28256b97-09a9-4ddf-a730-d2fc4d926310" (UID: "28256b97-09a9-4ddf-a730-d2fc4d926310"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.301739 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28256b97-09a9-4ddf-a730-d2fc4d926310-kube-api-access-5b269" (OuterVolumeSpecName: "kube-api-access-5b269") pod "28256b97-09a9-4ddf-a730-d2fc4d926310" (UID: "28256b97-09a9-4ddf-a730-d2fc4d926310"). InnerVolumeSpecName "kube-api-access-5b269". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.396612 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b936de0-3def-401f-9706-2d10eba86b7b-client-ca\") pod \"controller-manager-fbc996c8d-z4fml\" (UID: \"1b936de0-3def-401f-9706-2d10eba86b7b\") " pod="openshift-controller-manager/controller-manager-fbc996c8d-z4fml" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.396711 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1b936de0-3def-401f-9706-2d10eba86b7b-proxy-ca-bundles\") pod \"controller-manager-fbc996c8d-z4fml\" (UID: \"1b936de0-3def-401f-9706-2d10eba86b7b\") " pod="openshift-controller-manager/controller-manager-fbc996c8d-z4fml" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.396742 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9-client-ca\") pod \"route-controller-manager-67485fb985-q9s95\" (UID: \"afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9\") " pod="openshift-route-controller-manager/route-controller-manager-67485fb985-q9s95" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.396772 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9-serving-cert\") pod \"route-controller-manager-67485fb985-q9s95\" (UID: \"afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9\") " pod="openshift-route-controller-manager/route-controller-manager-67485fb985-q9s95" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.396804 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9-config\") pod \"route-controller-manager-67485fb985-q9s95\" (UID: \"afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9\") " pod="openshift-route-controller-manager/route-controller-manager-67485fb985-q9s95" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.396834 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7nmn\" (UniqueName: \"kubernetes.io/projected/1b936de0-3def-401f-9706-2d10eba86b7b-kube-api-access-m7nmn\") pod \"controller-manager-fbc996c8d-z4fml\" (UID: \"1b936de0-3def-401f-9706-2d10eba86b7b\") " pod="openshift-controller-manager/controller-manager-fbc996c8d-z4fml" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.396864 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jptjx\" (UniqueName: \"kubernetes.io/projected/afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9-kube-api-access-jptjx\") pod \"route-controller-manager-67485fb985-q9s95\" (UID: \"afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9\") " pod="openshift-route-controller-manager/route-controller-manager-67485fb985-q9s95" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.396892 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b936de0-3def-401f-9706-2d10eba86b7b-serving-cert\") pod \"controller-manager-fbc996c8d-z4fml\" (UID: \"1b936de0-3def-401f-9706-2d10eba86b7b\") " pod="openshift-controller-manager/controller-manager-fbc996c8d-z4fml" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.396926 4904 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b936de0-3def-401f-9706-2d10eba86b7b-config\") pod \"controller-manager-fbc996c8d-z4fml\" (UID: \"1b936de0-3def-401f-9706-2d10eba86b7b\") " pod="openshift-controller-manager/controller-manager-fbc996c8d-z4fml" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.396997 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5b269\" (UniqueName: \"kubernetes.io/projected/28256b97-09a9-4ddf-a730-d2fc4d926310-kube-api-access-5b269\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.397010 4904 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28256b97-09a9-4ddf-a730-d2fc4d926310-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.397022 4904 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/28256b97-09a9-4ddf-a730-d2fc4d926310-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.399067 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b936de0-3def-401f-9706-2d10eba86b7b-client-ca\") pod \"controller-manager-fbc996c8d-z4fml\" (UID: \"1b936de0-3def-401f-9706-2d10eba86b7b\") " pod="openshift-controller-manager/controller-manager-fbc996c8d-z4fml" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.399837 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1b936de0-3def-401f-9706-2d10eba86b7b-proxy-ca-bundles\") pod \"controller-manager-fbc996c8d-z4fml\" (UID: \"1b936de0-3def-401f-9706-2d10eba86b7b\") " pod="openshift-controller-manager/controller-manager-fbc996c8d-z4fml" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.399841 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9-client-ca\") pod \"route-controller-manager-67485fb985-q9s95\" (UID: \"afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9\") " pod="openshift-route-controller-manager/route-controller-manager-67485fb985-q9s95" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.400934 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9-config\") pod \"route-controller-manager-67485fb985-q9s95\" (UID: \"afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9\") " pod="openshift-route-controller-manager/route-controller-manager-67485fb985-q9s95" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.401188 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b936de0-3def-401f-9706-2d10eba86b7b-config\") pod \"controller-manager-fbc996c8d-z4fml\" (UID: \"1b936de0-3def-401f-9706-2d10eba86b7b\") " pod="openshift-controller-manager/controller-manager-fbc996c8d-z4fml" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.407835 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b936de0-3def-401f-9706-2d10eba86b7b-serving-cert\") pod \"controller-manager-fbc996c8d-z4fml\" (UID: \"1b936de0-3def-401f-9706-2d10eba86b7b\") " 
pod="openshift-controller-manager/controller-manager-fbc996c8d-z4fml" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.412430 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9-serving-cert\") pod \"route-controller-manager-67485fb985-q9s95\" (UID: \"afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9\") " pod="openshift-route-controller-manager/route-controller-manager-67485fb985-q9s95" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.417483 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jptjx\" (UniqueName: \"kubernetes.io/projected/afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9-kube-api-access-jptjx\") pod \"route-controller-manager-67485fb985-q9s95\" (UID: \"afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9\") " pod="openshift-route-controller-manager/route-controller-manager-67485fb985-q9s95" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.418677 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7nmn\" (UniqueName: \"kubernetes.io/projected/1b936de0-3def-401f-9706-2d10eba86b7b-kube-api-access-m7nmn\") pod \"controller-manager-fbc996c8d-z4fml\" (UID: \"1b936de0-3def-401f-9706-2d10eba86b7b\") " pod="openshift-controller-manager/controller-manager-fbc996c8d-z4fml" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.510990 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67485fb985-q9s95" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.521588 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-fbc996c8d-z4fml" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.818104 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-fbc996c8d-z4fml"] Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.888031 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67485fb985-q9s95"] Dec 05 20:15:04 crc kubenswrapper[4904]: W1205 20:15:04.896592 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafe37ec9_fff6_4d9b_a7ae_37114bfcd9c9.slice/crio-650993f34820ae08acc7c3c35d88a9fc038b529ed5b962f47af6fc7425a9b466 WatchSource:0}: Error finding container 650993f34820ae08acc7c3c35d88a9fc038b529ed5b962f47af6fc7425a9b466: Status 404 returned error can't find the container with id 650993f34820ae08acc7c3c35d88a9fc038b529ed5b962f47af6fc7425a9b466 Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.931672 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-fbc996c8d-z4fml" event={"ID":"1b936de0-3def-401f-9706-2d10eba86b7b","Type":"ContainerStarted","Data":"56695ce41a4b95e0d104f073cc3a93b69ad046a7b23c7cb263ecbc70d77a827e"} Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.934322 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vd9gt" event={"ID":"757462be-80d7-44c2-a193-48f78ac5b80e","Type":"ContainerStarted","Data":"187ddb95b4aed8be8728e5b1505ceccd045b34a03add4a4b80305497694410e9"} Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.951023 4904 generic.go:334] "Generic (PLEG): container finished" 
podID="9ef986b1-43e3-4c24-87d0-f385d1ae80bc" containerID="f547825116d98eb4340141392d9e288a77375530b1d6f9182462333f4f615c40" exitCode=0 Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.951110 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-plmk6" event={"ID":"9ef986b1-43e3-4c24-87d0-f385d1ae80bc","Type":"ContainerDied","Data":"f547825116d98eb4340141392d9e288a77375530b1d6f9182462333f4f615c40"} Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.963149 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g4tfw" event={"ID":"3aaaf574-a341-4efb-beae-169d69ca32d4","Type":"ContainerStarted","Data":"a96e3d7eb8d401bf2049b822cda62f714f9d0e9aea743f8220923a51efa42215"} Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.967983 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67485fb985-q9s95" event={"ID":"afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9","Type":"ContainerStarted","Data":"650993f34820ae08acc7c3c35d88a9fc038b529ed5b962f47af6fc7425a9b466"} Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.973300 4904 generic.go:334] "Generic (PLEG): container finished" podID="59e498ac-1ef4-4984-956a-f1d7614d5dba" containerID="8332572732ecea8e6f6d80aaf0125377b310c24871765e16a690097631a1586f" exitCode=0 Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.973387 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jrg9c" event={"ID":"59e498ac-1ef4-4984-956a-f1d7614d5dba","Type":"ContainerDied","Data":"8332572732ecea8e6f6d80aaf0125377b310c24871765e16a690097631a1586f"} Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.980253 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vd9gt" podStartSLOduration=3.717999138 podStartE2EDuration="55.980231245s" podCreationTimestamp="2025-12-05 20:14:09 +0000 UTC" firstStartedPulling="2025-12-05 20:14:11.383679276 +0000 UTC m=+150.194895385" lastFinishedPulling="2025-12-05 20:15:03.645911383 +0000 UTC m=+202.457127492" observedRunningTime="2025-12-05 20:15:04.959413451 +0000 UTC m=+203.770629560" watchObservedRunningTime="2025-12-05 20:15:04.980231245 +0000 UTC m=+203.791447354" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.980609 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-g7dzx" event={"ID":"28256b97-09a9-4ddf-a730-d2fc4d926310","Type":"ContainerDied","Data":"5e771b333c0b1db5b482d29c2094884a3a9cd15dcd01f23ba1167021ce08688a"} Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.980663 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e771b333c0b1db5b482d29c2094884a3a9cd15dcd01f23ba1167021ce08688a" Dec 05 20:15:04 crc kubenswrapper[4904]: I1205 20:15:04.980737 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-g7dzx" Dec 05 20:15:05 crc kubenswrapper[4904]: I1205 20:15:05.022434 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-g4tfw" podStartSLOduration=3.535876652 podStartE2EDuration="57.022415296s" podCreationTimestamp="2025-12-05 20:14:08 +0000 UTC" firstStartedPulling="2025-12-05 20:14:10.183405737 +0000 UTC m=+148.994621846" lastFinishedPulling="2025-12-05 20:15:03.669944381 +0000 UTC m=+202.481160490" observedRunningTime="2025-12-05 20:15:05.018884693 +0000 UTC m=+203.830100822" watchObservedRunningTime="2025-12-05 20:15:05.022415296 +0000 UTC m=+203.833631415" Dec 05 20:15:05 crc kubenswrapper[4904]: I1205 20:15:05.687466 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="834a438d-9f7d-4707-b790-8ce136081f7c" path="/var/lib/kubelet/pods/834a438d-9f7d-4707-b790-8ce136081f7c/volumes" Dec 05 20:15:05 crc kubenswrapper[4904]: I1205 20:15:05.688398 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb4b4c10-971e-4766-b632-9f710ec547a6" path="/var/lib/kubelet/pods/eb4b4c10-971e-4766-b632-9f710ec547a6/volumes" Dec 05 20:15:05 crc kubenswrapper[4904]: I1205 20:15:05.988465 4904 generic.go:334] "Generic (PLEG): container finished" podID="ed3e3607-e411-47d4-8a20-cbd6623ec89e" containerID="9b71ed19dcdc77a65753fb99d5e5e18b9b1751fe81374aab28e6d61b7727514f" exitCode=0 Dec 05 20:15:05 crc kubenswrapper[4904]: I1205 20:15:05.988508 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-87qk8" event={"ID":"ed3e3607-e411-47d4-8a20-cbd6623ec89e","Type":"ContainerDied","Data":"9b71ed19dcdc77a65753fb99d5e5e18b9b1751fe81374aab28e6d61b7727514f"} Dec 05 20:15:06 crc kubenswrapper[4904]: I1205 20:15:06.994533 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67485fb985-q9s95" event={"ID":"afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9","Type":"ContainerStarted","Data":"e845aeee616123ba7f1219a35910039f481168339ef05a1c1a770d71f5b2d461"} Dec 05 20:15:06 crc kubenswrapper[4904]: I1205 20:15:06.994856 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-67485fb985-q9s95" Dec 05 20:15:06 crc kubenswrapper[4904]: I1205 20:15:06.996474 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-fbc996c8d-z4fml" event={"ID":"1b936de0-3def-401f-9706-2d10eba86b7b","Type":"ContainerStarted","Data":"993c306b0a53db6e3f095d6cb493ab627a455ffb534af763631364edb6d9c407"} Dec 05 20:15:06 crc kubenswrapper[4904]: I1205 20:15:06.996711 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-fbc996c8d-z4fml" Dec 05 20:15:07 crc kubenswrapper[4904]: I1205 20:15:07.001001 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-67485fb985-q9s95" Dec 05 20:15:07 crc kubenswrapper[4904]: I1205 20:15:07.003467 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-fbc996c8d-z4fml" Dec 05 20:15:07 crc kubenswrapper[4904]: I1205 20:15:07.015454 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-route-controller-manager/route-controller-manager-67485fb985-q9s95" podStartSLOduration=5.01543595 podStartE2EDuration="5.01543595s" podCreationTimestamp="2025-12-05 20:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:15:07.012856661 +0000 UTC m=+205.824072770" watchObservedRunningTime="2025-12-05 20:15:07.01543595 +0000 UTC m=+205.826652059" Dec 05 20:15:07 crc kubenswrapper[4904]: I1205 20:15:07.031362 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-fbc996c8d-z4fml" podStartSLOduration=5.031344933 podStartE2EDuration="5.031344933s" podCreationTimestamp="2025-12-05 20:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:15:07.029915315 +0000 UTC m=+205.841131494" watchObservedRunningTime="2025-12-05 20:15:07.031344933 +0000 UTC m=+205.842561042" Dec 05 20:15:08 crc kubenswrapper[4904]: I1205 20:15:08.226993 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vjj47" Dec 05 20:15:08 crc kubenswrapper[4904]: I1205 20:15:08.227474 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vjj47" Dec 05 20:15:08 crc kubenswrapper[4904]: I1205 20:15:08.462504 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vjj47" Dec 05 20:15:08 crc kubenswrapper[4904]: I1205 20:15:08.590126 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-g4tfw" Dec 05 20:15:08 crc kubenswrapper[4904]: I1205 20:15:08.590221 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-g4tfw" Dec 05 20:15:08 crc kubenswrapper[4904]: I1205 20:15:08.650499 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-g4tfw" Dec 05 20:15:09 crc kubenswrapper[4904]: I1205 20:15:09.065921 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vjj47" Dec 05 20:15:09 crc kubenswrapper[4904]: I1205 20:15:09.070165 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-g4tfw" Dec 05 20:15:10 crc kubenswrapper[4904]: I1205 20:15:10.000288 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vd9gt" Dec 05 20:15:10 crc kubenswrapper[4904]: I1205 20:15:10.000601 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vd9gt" Dec 05 20:15:10 crc kubenswrapper[4904]: I1205 20:15:10.041142 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vd9gt" Dec 05 20:15:10 crc kubenswrapper[4904]: I1205 20:15:10.108434 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vd9gt" Dec 05 20:15:12 crc kubenswrapper[4904]: I1205 20:15:12.025494 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sr8r7" 
event={"ID":"2ea52ad5-455a-4f4b-a576-fa76c35bcbea","Type":"ContainerStarted","Data":"5f4c80db8f3cd3cda5f0383f2b0955799e80a6e381295eb277d59e5136603413"} Dec 05 20:15:12 crc kubenswrapper[4904]: I1205 20:15:12.053696 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sr8r7" podStartSLOduration=3.878200531 podStartE2EDuration="1m2.05367924s" podCreationTimestamp="2025-12-05 20:14:10 +0000 UTC" firstStartedPulling="2025-12-05 20:14:11.396936847 +0000 UTC m=+150.208152956" lastFinishedPulling="2025-12-05 20:15:09.572415556 +0000 UTC m=+208.383631665" observedRunningTime="2025-12-05 20:15:12.050472665 +0000 UTC m=+210.861688794" watchObservedRunningTime="2025-12-05 20:15:12.05367924 +0000 UTC m=+210.864895359" Dec 05 20:15:12 crc kubenswrapper[4904]: I1205 20:15:12.331091 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g4tfw"] Dec 05 20:15:12 crc kubenswrapper[4904]: I1205 20:15:12.331405 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-g4tfw" podUID="3aaaf574-a341-4efb-beae-169d69ca32d4" containerName="registry-server" containerID="cri-o://a96e3d7eb8d401bf2049b822cda62f714f9d0e9aea743f8220923a51efa42215" gracePeriod=2 Dec 05 20:15:13 crc kubenswrapper[4904]: I1205 20:15:13.041778 4904 generic.go:334] "Generic (PLEG): container finished" podID="3aaaf574-a341-4efb-beae-169d69ca32d4" containerID="a96e3d7eb8d401bf2049b822cda62f714f9d0e9aea743f8220923a51efa42215" exitCode=0 Dec 05 20:15:13 crc kubenswrapper[4904]: I1205 20:15:13.041851 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g4tfw" event={"ID":"3aaaf574-a341-4efb-beae-169d69ca32d4","Type":"ContainerDied","Data":"a96e3d7eb8d401bf2049b822cda62f714f9d0e9aea743f8220923a51efa42215"} Dec 05 20:15:13 crc kubenswrapper[4904]: I1205 20:15:13.339096 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g4tfw" Dec 05 20:15:13 crc kubenswrapper[4904]: I1205 20:15:13.422176 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aaaf574-a341-4efb-beae-169d69ca32d4-catalog-content\") pod \"3aaaf574-a341-4efb-beae-169d69ca32d4\" (UID: \"3aaaf574-a341-4efb-beae-169d69ca32d4\") " Dec 05 20:15:13 crc kubenswrapper[4904]: I1205 20:15:13.422504 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aaaf574-a341-4efb-beae-169d69ca32d4-utilities\") pod \"3aaaf574-a341-4efb-beae-169d69ca32d4\" (UID: \"3aaaf574-a341-4efb-beae-169d69ca32d4\") " Dec 05 20:15:13 crc kubenswrapper[4904]: I1205 20:15:13.422622 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpjhg\" (UniqueName: \"kubernetes.io/projected/3aaaf574-a341-4efb-beae-169d69ca32d4-kube-api-access-wpjhg\") pod \"3aaaf574-a341-4efb-beae-169d69ca32d4\" (UID: \"3aaaf574-a341-4efb-beae-169d69ca32d4\") " Dec 05 20:15:13 crc kubenswrapper[4904]: I1205 20:15:13.423607 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3aaaf574-a341-4efb-beae-169d69ca32d4-utilities" (OuterVolumeSpecName: "utilities") pod "3aaaf574-a341-4efb-beae-169d69ca32d4" (UID: "3aaaf574-a341-4efb-beae-169d69ca32d4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:15:13 crc kubenswrapper[4904]: I1205 20:15:13.427924 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3aaaf574-a341-4efb-beae-169d69ca32d4-kube-api-access-wpjhg" (OuterVolumeSpecName: "kube-api-access-wpjhg") pod "3aaaf574-a341-4efb-beae-169d69ca32d4" (UID: "3aaaf574-a341-4efb-beae-169d69ca32d4"). InnerVolumeSpecName "kube-api-access-wpjhg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:15:13 crc kubenswrapper[4904]: I1205 20:15:13.493547 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3aaaf574-a341-4efb-beae-169d69ca32d4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3aaaf574-a341-4efb-beae-169d69ca32d4" (UID: "3aaaf574-a341-4efb-beae-169d69ca32d4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:15:13 crc kubenswrapper[4904]: I1205 20:15:13.524747 4904 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aaaf574-a341-4efb-beae-169d69ca32d4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:13 crc kubenswrapper[4904]: I1205 20:15:13.524809 4904 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aaaf574-a341-4efb-beae-169d69ca32d4-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:13 crc kubenswrapper[4904]: I1205 20:15:13.524824 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpjhg\" (UniqueName: \"kubernetes.io/projected/3aaaf574-a341-4efb-beae-169d69ca32d4-kube-api-access-wpjhg\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:14 crc kubenswrapper[4904]: I1205 20:15:14.050756 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g4tfw" event={"ID":"3aaaf574-a341-4efb-beae-169d69ca32d4","Type":"ContainerDied","Data":"9545307ffec1bda8606b993f7ddd633ccec45fb01e3dfc75013fd7b3fb805dfa"} Dec 05 20:15:14 crc kubenswrapper[4904]: I1205 20:15:14.050812 4904 scope.go:117] "RemoveContainer" containerID="a96e3d7eb8d401bf2049b822cda62f714f9d0e9aea743f8220923a51efa42215" Dec 05 20:15:14 crc kubenswrapper[4904]: I1205 20:15:14.050925 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g4tfw" Dec 05 20:15:14 crc kubenswrapper[4904]: I1205 20:15:14.076094 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g4tfw"] Dec 05 20:15:14 crc kubenswrapper[4904]: I1205 20:15:14.080734 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-g4tfw"] Dec 05 20:15:15 crc kubenswrapper[4904]: I1205 20:15:15.691978 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3aaaf574-a341-4efb-beae-169d69ca32d4" path="/var/lib/kubelet/pods/3aaaf574-a341-4efb-beae-169d69ca32d4/volumes" Dec 05 20:15:15 crc kubenswrapper[4904]: I1205 20:15:15.916369 4904 scope.go:117] "RemoveContainer" containerID="beb8ef10802bde44a616822095c3c115322d6837446db6f5b0bdeaf4b0ac345d" Dec 05 20:15:15 crc kubenswrapper[4904]: I1205 20:15:15.951659 4904 scope.go:117] "RemoveContainer" containerID="db89cfbfcabbe0200a4a253b56413e0a972accdb4b58d4ac264251899c28d6d5" Dec 05 20:15:17 crc kubenswrapper[4904]: I1205 20:15:17.078571 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-plmk6" event={"ID":"9ef986b1-43e3-4c24-87d0-f385d1ae80bc","Type":"ContainerStarted","Data":"57eca7a550c07302289048bf4f61bb10a85f37651d40975a2cdad3f7da617145"} Dec 05 20:15:17 crc kubenswrapper[4904]: I1205 20:15:17.081864 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gpt4n" event={"ID":"d5752678-2a85-49a8-b6d4-63b4adb96277","Type":"ContainerStarted","Data":"12f899828ea0dba6e2cc29abd341150ca72d34bf99f980d1679501bb2886a658"} Dec 05 20:15:17 crc kubenswrapper[4904]: I1205 20:15:17.085030 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jrg9c" event={"ID":"59e498ac-1ef4-4984-956a-f1d7614d5dba","Type":"ContainerStarted","Data":"f36add3a4b076caaf31ac77eb1777155323138e98b1c9556f976fdac3b900e59"} Dec 05 20:15:17 crc kubenswrapper[4904]: I1205 20:15:17.087857 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-87qk8" event={"ID":"ed3e3607-e411-47d4-8a20-cbd6623ec89e","Type":"ContainerStarted","Data":"e6010bc02dc1cb0b219d7f027ab1224de30a26e0cb126f4aa2da7b56e659be8f"} Dec 05 20:15:17 crc kubenswrapper[4904]: I1205 20:15:17.103191 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-plmk6" podStartSLOduration=4.542247531 podStartE2EDuration="1m9.103166698s" podCreationTimestamp="2025-12-05 20:14:08 +0000 UTC" firstStartedPulling="2025-12-05 20:14:11.355217145 +0000 UTC m=+150.166433254" lastFinishedPulling="2025-12-05 20:15:15.916136302 +0000 UTC m=+214.727352421" observedRunningTime="2025-12-05 20:15:17.098879344 +0000 UTC m=+215.910095453" watchObservedRunningTime="2025-12-05 20:15:17.103166698 +0000 UTC m=+215.914382827" Dec 05 20:15:17 crc kubenswrapper[4904]: I1205 20:15:17.124605 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jrg9c" podStartSLOduration=6.269390744 podStartE2EDuration="1m7.124572357s" podCreationTimestamp="2025-12-05 20:14:10 +0000 UTC" firstStartedPulling="2025-12-05 20:14:12.430093088 +0000 UTC m=+151.241309187" lastFinishedPulling="2025-12-05 20:15:13.285274681 +0000 UTC m=+212.096490800" observedRunningTime="2025-12-05 20:15:17.123464358 +0000 UTC m=+215.934680487" watchObservedRunningTime="2025-12-05 
20:15:17.124572357 +0000 UTC m=+215.935788466" Dec 05 20:15:17 crc kubenswrapper[4904]: I1205 20:15:17.147636 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-87qk8" podStartSLOduration=3.6673353779999998 podStartE2EDuration="1m6.14761155s" podCreationTimestamp="2025-12-05 20:14:11 +0000 UTC" firstStartedPulling="2025-12-05 20:14:13.45772014 +0000 UTC m=+152.268936249" lastFinishedPulling="2025-12-05 20:15:15.937996302 +0000 UTC m=+214.749212421" observedRunningTime="2025-12-05 20:15:17.146889201 +0000 UTC m=+215.958105310" watchObservedRunningTime="2025-12-05 20:15:17.14761155 +0000 UTC m=+215.958827659" Dec 05 20:15:17 crc kubenswrapper[4904]: I1205 20:15:17.169042 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gpt4n" podStartSLOduration=5.650355023 podStartE2EDuration="1m10.169025829s" podCreationTimestamp="2025-12-05 20:14:07 +0000 UTC" firstStartedPulling="2025-12-05 20:14:11.403485376 +0000 UTC m=+150.214701485" lastFinishedPulling="2025-12-05 20:15:15.922156192 +0000 UTC m=+214.733372291" observedRunningTime="2025-12-05 20:15:17.16603559 +0000 UTC m=+215.977251689" watchObservedRunningTime="2025-12-05 20:15:17.169025829 +0000 UTC m=+215.980241948" Dec 05 20:15:18 crc kubenswrapper[4904]: I1205 20:15:18.990698 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gpt4n" Dec 05 20:15:18 crc kubenswrapper[4904]: I1205 20:15:18.990743 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gpt4n" Dec 05 20:15:18 crc kubenswrapper[4904]: I1205 20:15:18.999295 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-plmk6" Dec 05 20:15:18 crc kubenswrapper[4904]: I1205 20:15:18.999377 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-plmk6" Dec 05 20:15:19 crc kubenswrapper[4904]: I1205 20:15:19.045210 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gpt4n" Dec 05 20:15:19 crc kubenswrapper[4904]: I1205 20:15:19.065396 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-plmk6" Dec 05 20:15:20 crc kubenswrapper[4904]: I1205 20:15:20.353701 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sr8r7" Dec 05 20:15:20 crc kubenswrapper[4904]: I1205 20:15:20.354611 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sr8r7" Dec 05 20:15:20 crc kubenswrapper[4904]: I1205 20:15:20.392432 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sr8r7" Dec 05 20:15:21 crc kubenswrapper[4904]: I1205 20:15:21.152632 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sr8r7" Dec 05 20:15:21 crc kubenswrapper[4904]: I1205 20:15:21.184565 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jrg9c" Dec 05 20:15:21 crc kubenswrapper[4904]: I1205 20:15:21.184642 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-jrg9c" Dec 05 20:15:21 crc kubenswrapper[4904]: I1205 20:15:21.659558 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-87qk8" Dec 05 20:15:21 crc kubenswrapper[4904]: I1205 20:15:21.660311 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-87qk8" Dec 05 20:15:21 crc kubenswrapper[4904]: I1205 20:15:21.694113 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-87qk8" Dec 05 20:15:22 crc kubenswrapper[4904]: I1205 20:15:22.165157 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-87qk8" Dec 05 20:15:22 crc kubenswrapper[4904]: I1205 20:15:22.221081 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jrg9c" podUID="59e498ac-1ef4-4984-956a-f1d7614d5dba" containerName="registry-server" probeResult="failure" output=< Dec 05 20:15:22 crc kubenswrapper[4904]: timeout: failed to connect service ":50051" within 1s Dec 05 20:15:22 crc kubenswrapper[4904]: > Dec 05 20:15:22 crc kubenswrapper[4904]: I1205 20:15:22.803096 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-fbc996c8d-z4fml"] Dec 05 20:15:22 crc kubenswrapper[4904]: I1205 20:15:22.803361 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-fbc996c8d-z4fml" podUID="1b936de0-3def-401f-9706-2d10eba86b7b" containerName="controller-manager" containerID="cri-o://993c306b0a53db6e3f095d6cb493ab627a455ffb534af763631364edb6d9c407" gracePeriod=30 Dec 05 20:15:22 crc kubenswrapper[4904]: I1205 20:15:22.837900 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67485fb985-q9s95"] Dec 05 20:15:22 crc kubenswrapper[4904]: I1205 20:15:22.838446 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-67485fb985-q9s95" podUID="afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9" containerName="route-controller-manager" containerID="cri-o://e845aeee616123ba7f1219a35910039f481168339ef05a1c1a770d71f5b2d461" gracePeriod=30 Dec 05 20:15:23 crc kubenswrapper[4904]: I1205 20:15:23.127453 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sr8r7"] Dec 05 20:15:24 crc kubenswrapper[4904]: I1205 20:15:24.022692 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-s87dr" podUID="65a40f5b-e76f-4c3b-a50a-ef1641ea6a68" containerName="oauth-openshift" containerID="cri-o://b700e027309894d7611a6b1df2a865acb52645725c69a51680fc9c4970925830" gracePeriod=15 Dec 05 20:15:24 crc kubenswrapper[4904]: I1205 20:15:24.127847 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sr8r7" podUID="2ea52ad5-455a-4f4b-a576-fa76c35bcbea" containerName="registry-server" containerID="cri-o://5f4c80db8f3cd3cda5f0383f2b0955799e80a6e381295eb277d59e5136603413" gracePeriod=2 Dec 05 20:15:24 crc kubenswrapper[4904]: I1205 20:15:24.513224 4904 patch_prober.go:28] interesting pod/route-controller-manager-67485fb985-q9s95 container/route-controller-manager 
Dec 05 20:15:24 crc kubenswrapper[4904]: I1205 20:15:24.513224 4904 patch_prober.go:28] interesting pod/route-controller-manager-67485fb985-q9s95 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.57:8443/healthz\": dial tcp 10.217.0.57:8443: connect: connection refused" start-of-body=
Dec 05 20:15:24 crc kubenswrapper[4904]: I1205 20:15:24.513352 4904 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-67485fb985-q9s95" podUID="afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.57:8443/healthz\": dial tcp 10.217.0.57:8443: connect: connection refused"
Dec 05 20:15:24 crc kubenswrapper[4904]: I1205 20:15:24.524990 4904 patch_prober.go:28] interesting pod/controller-manager-fbc996c8d-z4fml container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" start-of-body=
Dec 05 20:15:24 crc kubenswrapper[4904]: I1205 20:15:24.525120 4904 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-fbc996c8d-z4fml" podUID="1b936de0-3def-401f-9706-2d10eba86b7b" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused"
Dec 05 20:15:25 crc kubenswrapper[4904]: I1205 20:15:25.330099 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-87qk8"]
Dec 05 20:15:25 crc kubenswrapper[4904]: I1205 20:15:25.330765 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-87qk8" podUID="ed3e3607-e411-47d4-8a20-cbd6623ec89e" containerName="registry-server" containerID="cri-o://e6010bc02dc1cb0b219d7f027ab1224de30a26e0cb126f4aa2da7b56e659be8f" gracePeriod=2
Dec 05 20:15:28 crc kubenswrapper[4904]: I1205 20:15:28.965397 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-fbc996c8d-z4fml"
Dec 05 20:15:28 crc kubenswrapper[4904]: I1205 20:15:28.971867 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-s87dr"
Dec 05 20:15:28 crc kubenswrapper[4904]: I1205 20:15:28.979256 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67485fb985-q9s95"
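The readiness failures against https://10.217.0.57:8443/healthz and https://10.217.0.58:8443/healthz ("connect: connection refused") are expected here: both controller-manager pods were just DELETEd and their containers are shutting down, so the failing probes pull them out of service endpoints before the pods disappear. Roughly what the kubelet's HTTP prober does for such a check (a simplified sketch: one-second budget, no client certificates, and certificate verification skipped, as HTTPS probes do not verify the serving cert):

package main

import (
	"crypto/tls"
	"fmt"
	"net/http"
	"time"
)

// ready performs a probe-style GET: any transport error or a status
// outside 2xx/3xx counts as a readiness failure.
func ready(url string) error {
	client := &http.Client{
		Timeout: time.Second,
		Transport: &http.Transport{
			TLSClientConfig: &tls.Config{InsecureSkipVerify: true}, // probe-style: don't verify the serving cert
		},
	}
	resp, err := client.Get(url)
	if err != nil {
		return err // e.g. "connect: connection refused" while shutting down
	}
	defer resp.Body.Close()
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		return fmt.Errorf("probe failed: status %d", resp.StatusCode)
	}
	return nil
}

func main() {
	fmt.Println(ready("https://10.217.0.57:8443/healthz"))
}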
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67485fb985-q9s95" Dec 05 20:15:28 crc kubenswrapper[4904]: I1205 20:15:28.985540 4904 generic.go:334] "Generic (PLEG): container finished" podID="afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9" containerID="e845aeee616123ba7f1219a35910039f481168339ef05a1c1a770d71f5b2d461" exitCode=0 Dec 05 20:15:28 crc kubenswrapper[4904]: I1205 20:15:28.985620 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67485fb985-q9s95" event={"ID":"afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9","Type":"ContainerDied","Data":"e845aeee616123ba7f1219a35910039f481168339ef05a1c1a770d71f5b2d461"} Dec 05 20:15:28 crc kubenswrapper[4904]: I1205 20:15:28.985653 4904 scope.go:117] "RemoveContainer" containerID="e845aeee616123ba7f1219a35910039f481168339ef05a1c1a770d71f5b2d461" Dec 05 20:15:28 crc kubenswrapper[4904]: I1205 20:15:28.988518 4904 generic.go:334] "Generic (PLEG): container finished" podID="1b936de0-3def-401f-9706-2d10eba86b7b" containerID="993c306b0a53db6e3f095d6cb493ab627a455ffb534af763631364edb6d9c407" exitCode=0 Dec 05 20:15:28 crc kubenswrapper[4904]: I1205 20:15:28.988574 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-fbc996c8d-z4fml" event={"ID":"1b936de0-3def-401f-9706-2d10eba86b7b","Type":"ContainerDied","Data":"993c306b0a53db6e3f095d6cb493ab627a455ffb534af763631364edb6d9c407"} Dec 05 20:15:28 crc kubenswrapper[4904]: I1205 20:15:28.988653 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-fbc996c8d-z4fml" Dec 05 20:15:28 crc kubenswrapper[4904]: I1205 20:15:28.996838 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-c577b6ccf-qwzn4"] Dec 05 20:15:28 crc kubenswrapper[4904]: E1205 20:15:28.997052 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aaaf574-a341-4efb-beae-169d69ca32d4" containerName="extract-utilities" Dec 05 20:15:28 crc kubenswrapper[4904]: I1205 20:15:28.997079 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aaaf574-a341-4efb-beae-169d69ca32d4" containerName="extract-utilities" Dec 05 20:15:28 crc kubenswrapper[4904]: E1205 20:15:28.997090 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b936de0-3def-401f-9706-2d10eba86b7b" containerName="controller-manager" Dec 05 20:15:28 crc kubenswrapper[4904]: I1205 20:15:28.997096 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b936de0-3def-401f-9706-2d10eba86b7b" containerName="controller-manager" Dec 05 20:15:28 crc kubenswrapper[4904]: E1205 20:15:28.997104 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9" containerName="route-controller-manager" Dec 05 20:15:28 crc kubenswrapper[4904]: I1205 20:15:28.997109 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9" containerName="route-controller-manager" Dec 05 20:15:28 crc kubenswrapper[4904]: E1205 20:15:28.997118 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65a40f5b-e76f-4c3b-a50a-ef1641ea6a68" containerName="oauth-openshift" Dec 05 20:15:28 crc kubenswrapper[4904]: I1205 20:15:28.997123 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="65a40f5b-e76f-4c3b-a50a-ef1641ea6a68" containerName="oauth-openshift" Dec 05 20:15:28 crc kubenswrapper[4904]: E1205 
20:15:28.997138 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aaaf574-a341-4efb-beae-169d69ca32d4" containerName="registry-server" Dec 05 20:15:28 crc kubenswrapper[4904]: I1205 20:15:28.997144 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aaaf574-a341-4efb-beae-169d69ca32d4" containerName="registry-server" Dec 05 20:15:28 crc kubenswrapper[4904]: E1205 20:15:28.997154 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28256b97-09a9-4ddf-a730-d2fc4d926310" containerName="collect-profiles" Dec 05 20:15:28 crc kubenswrapper[4904]: I1205 20:15:28.997160 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="28256b97-09a9-4ddf-a730-d2fc4d926310" containerName="collect-profiles" Dec 05 20:15:28 crc kubenswrapper[4904]: E1205 20:15:28.997169 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aaaf574-a341-4efb-beae-169d69ca32d4" containerName="extract-content" Dec 05 20:15:28 crc kubenswrapper[4904]: I1205 20:15:28.997176 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aaaf574-a341-4efb-beae-169d69ca32d4" containerName="extract-content" Dec 05 20:15:28 crc kubenswrapper[4904]: I1205 20:15:28.997256 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="3aaaf574-a341-4efb-beae-169d69ca32d4" containerName="registry-server" Dec 05 20:15:28 crc kubenswrapper[4904]: I1205 20:15:28.997266 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="65a40f5b-e76f-4c3b-a50a-ef1641ea6a68" containerName="oauth-openshift" Dec 05 20:15:28 crc kubenswrapper[4904]: I1205 20:15:28.997274 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9" containerName="route-controller-manager" Dec 05 20:15:28 crc kubenswrapper[4904]: I1205 20:15:28.997284 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="28256b97-09a9-4ddf-a730-d2fc4d926310" containerName="collect-profiles" Dec 05 20:15:28 crc kubenswrapper[4904]: I1205 20:15:28.997297 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b936de0-3def-401f-9706-2d10eba86b7b" containerName="controller-manager" Dec 05 20:15:28 crc kubenswrapper[4904]: I1205 20:15:28.997611 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-c577b6ccf-qwzn4" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.014792 4904 scope.go:117] "RemoveContainer" containerID="993c306b0a53db6e3f095d6cb493ab627a455ffb534af763631364edb6d9c407" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.018566 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-c577b6ccf-qwzn4"] Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.053598 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-plmk6" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.067970 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gpt4n" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.080017 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9-config\") pod \"afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9\" (UID: \"afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9\") " Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.080092 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-system-ocp-branding-template\") pod \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.080119 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1b936de0-3def-401f-9706-2d10eba86b7b-proxy-ca-bundles\") pod \"1b936de0-3def-401f-9706-2d10eba86b7b\" (UID: \"1b936de0-3def-401f-9706-2d10eba86b7b\") " Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.080151 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-system-service-ca\") pod \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.080168 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b936de0-3def-401f-9706-2d10eba86b7b-config\") pod \"1b936de0-3def-401f-9706-2d10eba86b7b\" (UID: \"1b936de0-3def-401f-9706-2d10eba86b7b\") " Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.080204 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7nmn\" (UniqueName: \"kubernetes.io/projected/1b936de0-3def-401f-9706-2d10eba86b7b-kube-api-access-m7nmn\") pod \"1b936de0-3def-401f-9706-2d10eba86b7b\" (UID: \"1b936de0-3def-401f-9706-2d10eba86b7b\") " Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.080230 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9-serving-cert\") pod \"afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9\" (UID: \"afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9\") " Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.080264 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-user-template-login\") pod \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.080294 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-user-template-error\") pod \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.080319 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-audit-dir\") pod \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.080341 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-system-cliconfig\") pod \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.080367 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-system-session\") pod \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.080402 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-system-trusted-ca-bundle\") pod \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.080426 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b936de0-3def-401f-9706-2d10eba86b7b-client-ca\") pod \"1b936de0-3def-401f-9706-2d10eba86b7b\" (UID: \"1b936de0-3def-401f-9706-2d10eba86b7b\") " Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.080450 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-system-router-certs\") pod \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.080478 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-audit-policies\") pod \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.080510 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-system-serving-cert\") pod \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\" (UID: 
\"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.080538 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9-client-ca\") pod \"afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9\" (UID: \"afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9\") " Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.080574 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jptjx\" (UniqueName: \"kubernetes.io/projected/afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9-kube-api-access-jptjx\") pod \"afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9\" (UID: \"afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9\") " Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.080597 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-user-template-provider-selection\") pod \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.080625 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b936de0-3def-401f-9706-2d10eba86b7b-serving-cert\") pod \"1b936de0-3def-401f-9706-2d10eba86b7b\" (UID: \"1b936de0-3def-401f-9706-2d10eba86b7b\") " Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.080651 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-user-idp-0-file-data\") pod \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.080676 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svtxh\" (UniqueName: \"kubernetes.io/projected/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-kube-api-access-svtxh\") pod \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\" (UID: \"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68\") " Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.081513 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "65a40f5b-e76f-4c3b-a50a-ef1641ea6a68" (UID: "65a40f5b-e76f-4c3b-a50a-ef1641ea6a68"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.082169 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9-client-ca" (OuterVolumeSpecName: "client-ca") pod "afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9" (UID: "afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.082603 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9-config" (OuterVolumeSpecName: "config") pod "afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9" (UID: "afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.083293 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b936de0-3def-401f-9706-2d10eba86b7b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "1b936de0-3def-401f-9706-2d10eba86b7b" (UID: "1b936de0-3def-401f-9706-2d10eba86b7b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.083557 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "65a40f5b-e76f-4c3b-a50a-ef1641ea6a68" (UID: "65a40f5b-e76f-4c3b-a50a-ef1641ea6a68"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.083921 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b936de0-3def-401f-9706-2d10eba86b7b-client-ca" (OuterVolumeSpecName: "client-ca") pod "1b936de0-3def-401f-9706-2d10eba86b7b" (UID: "1b936de0-3def-401f-9706-2d10eba86b7b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.084492 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "65a40f5b-e76f-4c3b-a50a-ef1641ea6a68" (UID: "65a40f5b-e76f-4c3b-a50a-ef1641ea6a68"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.088476 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b936de0-3def-401f-9706-2d10eba86b7b-kube-api-access-m7nmn" (OuterVolumeSpecName: "kube-api-access-m7nmn") pod "1b936de0-3def-401f-9706-2d10eba86b7b" (UID: "1b936de0-3def-401f-9706-2d10eba86b7b"). InnerVolumeSpecName "kube-api-access-m7nmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.088986 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "65a40f5b-e76f-4c3b-a50a-ef1641ea6a68" (UID: "65a40f5b-e76f-4c3b-a50a-ef1641ea6a68"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.089415 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "65a40f5b-e76f-4c3b-a50a-ef1641ea6a68" (UID: "65a40f5b-e76f-4c3b-a50a-ef1641ea6a68"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.089734 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "65a40f5b-e76f-4c3b-a50a-ef1641ea6a68" (UID: "65a40f5b-e76f-4c3b-a50a-ef1641ea6a68"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.090080 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "65a40f5b-e76f-4c3b-a50a-ef1641ea6a68" (UID: "65a40f5b-e76f-4c3b-a50a-ef1641ea6a68"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.090504 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b936de0-3def-401f-9706-2d10eba86b7b-config" (OuterVolumeSpecName: "config") pod "1b936de0-3def-401f-9706-2d10eba86b7b" (UID: "1b936de0-3def-401f-9706-2d10eba86b7b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.090622 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b936de0-3def-401f-9706-2d10eba86b7b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1b936de0-3def-401f-9706-2d10eba86b7b" (UID: "1b936de0-3def-401f-9706-2d10eba86b7b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.091130 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9" (UID: "afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.091195 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "65a40f5b-e76f-4c3b-a50a-ef1641ea6a68" (UID: "65a40f5b-e76f-4c3b-a50a-ef1641ea6a68"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.097435 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "65a40f5b-e76f-4c3b-a50a-ef1641ea6a68" (UID: "65a40f5b-e76f-4c3b-a50a-ef1641ea6a68"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.097993 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "65a40f5b-e76f-4c3b-a50a-ef1641ea6a68" (UID: "65a40f5b-e76f-4c3b-a50a-ef1641ea6a68"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.098491 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9-kube-api-access-jptjx" (OuterVolumeSpecName: "kube-api-access-jptjx") pod "afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9" (UID: "afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9"). InnerVolumeSpecName "kube-api-access-jptjx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.099069 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "65a40f5b-e76f-4c3b-a50a-ef1641ea6a68" (UID: "65a40f5b-e76f-4c3b-a50a-ef1641ea6a68"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.099539 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-kube-api-access-svtxh" (OuterVolumeSpecName: "kube-api-access-svtxh") pod "65a40f5b-e76f-4c3b-a50a-ef1641ea6a68" (UID: "65a40f5b-e76f-4c3b-a50a-ef1641ea6a68"). InnerVolumeSpecName "kube-api-access-svtxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.106164 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "65a40f5b-e76f-4c3b-a50a-ef1641ea6a68" (UID: "65a40f5b-e76f-4c3b-a50a-ef1641ea6a68"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.106438 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "65a40f5b-e76f-4c3b-a50a-ef1641ea6a68" (UID: "65a40f5b-e76f-4c3b-a50a-ef1641ea6a68"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.183192 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d607a691-844f-42f7-a9cf-e40f72b57220-config\") pod \"controller-manager-c577b6ccf-qwzn4\" (UID: \"d607a691-844f-42f7-a9cf-e40f72b57220\") " pod="openshift-controller-manager/controller-manager-c577b6ccf-qwzn4" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.183271 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d607a691-844f-42f7-a9cf-e40f72b57220-serving-cert\") pod \"controller-manager-c577b6ccf-qwzn4\" (UID: \"d607a691-844f-42f7-a9cf-e40f72b57220\") " pod="openshift-controller-manager/controller-manager-c577b6ccf-qwzn4" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.183304 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d607a691-844f-42f7-a9cf-e40f72b57220-client-ca\") pod \"controller-manager-c577b6ccf-qwzn4\" (UID: \"d607a691-844f-42f7-a9cf-e40f72b57220\") " pod="openshift-controller-manager/controller-manager-c577b6ccf-qwzn4" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.183343 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d607a691-844f-42f7-a9cf-e40f72b57220-proxy-ca-bundles\") pod \"controller-manager-c577b6ccf-qwzn4\" (UID: \"d607a691-844f-42f7-a9cf-e40f72b57220\") " pod="openshift-controller-manager/controller-manager-c577b6ccf-qwzn4" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.183387 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvxbl\" (UniqueName: \"kubernetes.io/projected/d607a691-844f-42f7-a9cf-e40f72b57220-kube-api-access-vvxbl\") pod \"controller-manager-c577b6ccf-qwzn4\" (UID: \"d607a691-844f-42f7-a9cf-e40f72b57220\") " pod="openshift-controller-manager/controller-manager-c577b6ccf-qwzn4" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.183477 4904 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.183517 4904 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.183549 4904 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.183566 4904 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.183578 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jptjx\" (UniqueName: 
\"kubernetes.io/projected/afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9-kube-api-access-jptjx\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.183597 4904 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.183609 4904 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b936de0-3def-401f-9706-2d10eba86b7b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.183619 4904 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.183629 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svtxh\" (UniqueName: \"kubernetes.io/projected/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-kube-api-access-svtxh\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.183638 4904 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.183648 4904 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.183657 4904 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1b936de0-3def-401f-9706-2d10eba86b7b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.183667 4904 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b936de0-3def-401f-9706-2d10eba86b7b-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.183676 4904 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.183686 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7nmn\" (UniqueName: \"kubernetes.io/projected/1b936de0-3def-401f-9706-2d10eba86b7b-kube-api-access-m7nmn\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.183694 4904 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.183704 4904 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:29 crc 
kubenswrapper[4904]: I1205 20:15:29.183719 4904 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.183728 4904 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.183737 4904 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.183746 4904 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.183755 4904 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.183764 4904 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b936de0-3def-401f-9706-2d10eba86b7b-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.284639 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d607a691-844f-42f7-a9cf-e40f72b57220-config\") pod \"controller-manager-c577b6ccf-qwzn4\" (UID: \"d607a691-844f-42f7-a9cf-e40f72b57220\") " pod="openshift-controller-manager/controller-manager-c577b6ccf-qwzn4" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.284695 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d607a691-844f-42f7-a9cf-e40f72b57220-serving-cert\") pod \"controller-manager-c577b6ccf-qwzn4\" (UID: \"d607a691-844f-42f7-a9cf-e40f72b57220\") " pod="openshift-controller-manager/controller-manager-c577b6ccf-qwzn4" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.284725 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d607a691-844f-42f7-a9cf-e40f72b57220-client-ca\") pod \"controller-manager-c577b6ccf-qwzn4\" (UID: \"d607a691-844f-42f7-a9cf-e40f72b57220\") " pod="openshift-controller-manager/controller-manager-c577b6ccf-qwzn4" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.284749 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d607a691-844f-42f7-a9cf-e40f72b57220-proxy-ca-bundles\") pod \"controller-manager-c577b6ccf-qwzn4\" (UID: \"d607a691-844f-42f7-a9cf-e40f72b57220\") " pod="openshift-controller-manager/controller-manager-c577b6ccf-qwzn4" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.284792 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvxbl\" (UniqueName: 
\"kubernetes.io/projected/d607a691-844f-42f7-a9cf-e40f72b57220-kube-api-access-vvxbl\") pod \"controller-manager-c577b6ccf-qwzn4\" (UID: \"d607a691-844f-42f7-a9cf-e40f72b57220\") " pod="openshift-controller-manager/controller-manager-c577b6ccf-qwzn4" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.286138 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d607a691-844f-42f7-a9cf-e40f72b57220-proxy-ca-bundles\") pod \"controller-manager-c577b6ccf-qwzn4\" (UID: \"d607a691-844f-42f7-a9cf-e40f72b57220\") " pod="openshift-controller-manager/controller-manager-c577b6ccf-qwzn4" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.286258 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d607a691-844f-42f7-a9cf-e40f72b57220-client-ca\") pod \"controller-manager-c577b6ccf-qwzn4\" (UID: \"d607a691-844f-42f7-a9cf-e40f72b57220\") " pod="openshift-controller-manager/controller-manager-c577b6ccf-qwzn4" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.287300 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d607a691-844f-42f7-a9cf-e40f72b57220-config\") pod \"controller-manager-c577b6ccf-qwzn4\" (UID: \"d607a691-844f-42f7-a9cf-e40f72b57220\") " pod="openshift-controller-manager/controller-manager-c577b6ccf-qwzn4" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.291117 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d607a691-844f-42f7-a9cf-e40f72b57220-serving-cert\") pod \"controller-manager-c577b6ccf-qwzn4\" (UID: \"d607a691-844f-42f7-a9cf-e40f72b57220\") " pod="openshift-controller-manager/controller-manager-c577b6ccf-qwzn4" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.301145 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvxbl\" (UniqueName: \"kubernetes.io/projected/d607a691-844f-42f7-a9cf-e40f72b57220-kube-api-access-vvxbl\") pod \"controller-manager-c577b6ccf-qwzn4\" (UID: \"d607a691-844f-42f7-a9cf-e40f72b57220\") " pod="openshift-controller-manager/controller-manager-c577b6ccf-qwzn4" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.310042 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sr8r7_2ea52ad5-455a-4f4b-a576-fa76c35bcbea/registry-server/0.log" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.310748 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sr8r7" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.322070 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-fbc996c8d-z4fml"] Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.324686 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-fbc996c8d-z4fml"] Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.339815 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-c577b6ccf-qwzn4" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.487660 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghsvd\" (UniqueName: \"kubernetes.io/projected/2ea52ad5-455a-4f4b-a576-fa76c35bcbea-kube-api-access-ghsvd\") pod \"2ea52ad5-455a-4f4b-a576-fa76c35bcbea\" (UID: \"2ea52ad5-455a-4f4b-a576-fa76c35bcbea\") " Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.487931 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ea52ad5-455a-4f4b-a576-fa76c35bcbea-catalog-content\") pod \"2ea52ad5-455a-4f4b-a576-fa76c35bcbea\" (UID: \"2ea52ad5-455a-4f4b-a576-fa76c35bcbea\") " Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.487996 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ea52ad5-455a-4f4b-a576-fa76c35bcbea-utilities\") pod \"2ea52ad5-455a-4f4b-a576-fa76c35bcbea\" (UID: \"2ea52ad5-455a-4f4b-a576-fa76c35bcbea\") " Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.489181 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ea52ad5-455a-4f4b-a576-fa76c35bcbea-utilities" (OuterVolumeSpecName: "utilities") pod "2ea52ad5-455a-4f4b-a576-fa76c35bcbea" (UID: "2ea52ad5-455a-4f4b-a576-fa76c35bcbea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.492647 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ea52ad5-455a-4f4b-a576-fa76c35bcbea-kube-api-access-ghsvd" (OuterVolumeSpecName: "kube-api-access-ghsvd") pod "2ea52ad5-455a-4f4b-a576-fa76c35bcbea" (UID: "2ea52ad5-455a-4f4b-a576-fa76c35bcbea"). InnerVolumeSpecName "kube-api-access-ghsvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.506890 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ea52ad5-455a-4f4b-a576-fa76c35bcbea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2ea52ad5-455a-4f4b-a576-fa76c35bcbea" (UID: "2ea52ad5-455a-4f4b-a576-fa76c35bcbea"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.549996 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-87qk8_ed3e3607-e411-47d4-8a20-cbd6623ec89e/registry-server/0.log" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.551509 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-87qk8" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.597482 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed3e3607-e411-47d4-8a20-cbd6623ec89e-catalog-content\") pod \"ed3e3607-e411-47d4-8a20-cbd6623ec89e\" (UID: \"ed3e3607-e411-47d4-8a20-cbd6623ec89e\") " Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.597546 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed3e3607-e411-47d4-8a20-cbd6623ec89e-utilities\") pod \"ed3e3607-e411-47d4-8a20-cbd6623ec89e\" (UID: \"ed3e3607-e411-47d4-8a20-cbd6623ec89e\") " Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.597587 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d24f9\" (UniqueName: \"kubernetes.io/projected/ed3e3607-e411-47d4-8a20-cbd6623ec89e-kube-api-access-d24f9\") pod \"ed3e3607-e411-47d4-8a20-cbd6623ec89e\" (UID: \"ed3e3607-e411-47d4-8a20-cbd6623ec89e\") " Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.597926 4904 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ea52ad5-455a-4f4b-a576-fa76c35bcbea-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.597948 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghsvd\" (UniqueName: \"kubernetes.io/projected/2ea52ad5-455a-4f4b-a576-fa76c35bcbea-kube-api-access-ghsvd\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.597961 4904 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ea52ad5-455a-4f4b-a576-fa76c35bcbea-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.600503 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed3e3607-e411-47d4-8a20-cbd6623ec89e-utilities" (OuterVolumeSpecName: "utilities") pod "ed3e3607-e411-47d4-8a20-cbd6623ec89e" (UID: "ed3e3607-e411-47d4-8a20-cbd6623ec89e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.603029 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed3e3607-e411-47d4-8a20-cbd6623ec89e-kube-api-access-d24f9" (OuterVolumeSpecName: "kube-api-access-d24f9") pod "ed3e3607-e411-47d4-8a20-cbd6623ec89e" (UID: "ed3e3607-e411-47d4-8a20-cbd6623ec89e"). InnerVolumeSpecName "kube-api-access-d24f9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.689606 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b936de0-3def-401f-9706-2d10eba86b7b" path="/var/lib/kubelet/pods/1b936de0-3def-401f-9706-2d10eba86b7b/volumes" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.698994 4904 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed3e3607-e411-47d4-8a20-cbd6623ec89e-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.699033 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d24f9\" (UniqueName: \"kubernetes.io/projected/ed3e3607-e411-47d4-8a20-cbd6623ec89e-kube-api-access-d24f9\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.716448 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed3e3607-e411-47d4-8a20-cbd6623ec89e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ed3e3607-e411-47d4-8a20-cbd6623ec89e" (UID: "ed3e3607-e411-47d4-8a20-cbd6623ec89e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.738035 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-c577b6ccf-qwzn4"] Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.799515 4904 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed3e3607-e411-47d4-8a20-cbd6623ec89e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.955718 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.955768 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.955810 4904 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.956346 4904 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"08368507f1a5c9ee1b93c9a2b111939cbe36a99f87758d0de78b146309871438"} pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.956404 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" containerID="cri-o://08368507f1a5c9ee1b93c9a2b111939cbe36a99f87758d0de78b146309871438" gracePeriod=600 Dec 05 20:15:29 crc 
kubenswrapper[4904]: I1205 20:15:29.995170 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-87qk8_ed3e3607-e411-47d4-8a20-cbd6623ec89e/registry-server/0.log" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.996344 4904 generic.go:334] "Generic (PLEG): container finished" podID="ed3e3607-e411-47d4-8a20-cbd6623ec89e" containerID="e6010bc02dc1cb0b219d7f027ab1224de30a26e0cb126f4aa2da7b56e659be8f" exitCode=137 Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.996455 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-87qk8" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.996642 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-87qk8" event={"ID":"ed3e3607-e411-47d4-8a20-cbd6623ec89e","Type":"ContainerDied","Data":"e6010bc02dc1cb0b219d7f027ab1224de30a26e0cb126f4aa2da7b56e659be8f"} Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.996693 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-87qk8" event={"ID":"ed3e3607-e411-47d4-8a20-cbd6623ec89e","Type":"ContainerDied","Data":"e076caa621902641fc9067f112bc170eb600d6c903052cd16f79c7f825301b38"} Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.996716 4904 scope.go:117] "RemoveContainer" containerID="e6010bc02dc1cb0b219d7f027ab1224de30a26e0cb126f4aa2da7b56e659be8f" Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.999145 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c577b6ccf-qwzn4" event={"ID":"d607a691-844f-42f7-a9cf-e40f72b57220","Type":"ContainerStarted","Data":"92193c90bc4f33d315a7d4e7edecf0b4029dd86dcd8b11fea3be710e000d8f69"} Dec 05 20:15:29 crc kubenswrapper[4904]: I1205 20:15:29.999171 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c577b6ccf-qwzn4" event={"ID":"d607a691-844f-42f7-a9cf-e40f72b57220","Type":"ContainerStarted","Data":"3fe02e0d4815df0208698984e1064634abad8d3441dfc57745a5e5497b246ccd"} Dec 05 20:15:30 crc kubenswrapper[4904]: I1205 20:15:30.001121 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-c577b6ccf-qwzn4" Dec 05 20:15:30 crc kubenswrapper[4904]: I1205 20:15:30.003467 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sr8r7_2ea52ad5-455a-4f4b-a576-fa76c35bcbea/registry-server/0.log" Dec 05 20:15:30 crc kubenswrapper[4904]: I1205 20:15:30.004154 4904 generic.go:334] "Generic (PLEG): container finished" podID="2ea52ad5-455a-4f4b-a576-fa76c35bcbea" containerID="5f4c80db8f3cd3cda5f0383f2b0955799e80a6e381295eb277d59e5136603413" exitCode=137 Dec 05 20:15:30 crc kubenswrapper[4904]: I1205 20:15:30.004212 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sr8r7" event={"ID":"2ea52ad5-455a-4f4b-a576-fa76c35bcbea","Type":"ContainerDied","Data":"5f4c80db8f3cd3cda5f0383f2b0955799e80a6e381295eb277d59e5136603413"} Dec 05 20:15:30 crc kubenswrapper[4904]: I1205 20:15:30.004231 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sr8r7" event={"ID":"2ea52ad5-455a-4f4b-a576-fa76c35bcbea","Type":"ContainerDied","Data":"0f0a396484963c3ad377880a609ba618648358d34310c7a5ad7b34ccf37082c5"} Dec 05 20:15:30 crc kubenswrapper[4904]: I1205 
20:15:30.004301 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sr8r7" Dec 05 20:15:30 crc kubenswrapper[4904]: I1205 20:15:30.006428 4904 generic.go:334] "Generic (PLEG): container finished" podID="65a40f5b-e76f-4c3b-a50a-ef1641ea6a68" containerID="b700e027309894d7611a6b1df2a865acb52645725c69a51680fc9c4970925830" exitCode=0 Dec 05 20:15:30 crc kubenswrapper[4904]: I1205 20:15:30.006479 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-s87dr" event={"ID":"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68","Type":"ContainerDied","Data":"b700e027309894d7611a6b1df2a865acb52645725c69a51680fc9c4970925830"} Dec 05 20:15:30 crc kubenswrapper[4904]: I1205 20:15:30.006505 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-s87dr" event={"ID":"65a40f5b-e76f-4c3b-a50a-ef1641ea6a68","Type":"ContainerDied","Data":"fc46a78b2952ed56578fc9e52b3638fcdff71d357c2f296460a6fb47d9ce9c9f"} Dec 05 20:15:30 crc kubenswrapper[4904]: I1205 20:15:30.006580 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-s87dr" Dec 05 20:15:30 crc kubenswrapper[4904]: I1205 20:15:30.013842 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67485fb985-q9s95" event={"ID":"afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9","Type":"ContainerDied","Data":"650993f34820ae08acc7c3c35d88a9fc038b529ed5b962f47af6fc7425a9b466"} Dec 05 20:15:30 crc kubenswrapper[4904]: I1205 20:15:30.013943 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67485fb985-q9s95" Dec 05 20:15:30 crc kubenswrapper[4904]: I1205 20:15:30.014257 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-c577b6ccf-qwzn4" Dec 05 20:15:30 crc kubenswrapper[4904]: I1205 20:15:30.019952 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-c577b6ccf-qwzn4" podStartSLOduration=8.019916265 podStartE2EDuration="8.019916265s" podCreationTimestamp="2025-12-05 20:15:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:15:30.018305452 +0000 UTC m=+228.829521581" watchObservedRunningTime="2025-12-05 20:15:30.019916265 +0000 UTC m=+228.831132374" Dec 05 20:15:30 crc kubenswrapper[4904]: I1205 20:15:30.029754 4904 scope.go:117] "RemoveContainer" containerID="9b71ed19dcdc77a65753fb99d5e5e18b9b1751fe81374aab28e6d61b7727514f" Dec 05 20:15:30 crc kubenswrapper[4904]: I1205 20:15:30.073222 4904 scope.go:117] "RemoveContainer" containerID="245ee7f061f7e59aefe537b88c4f00508c01b4c0a49a7a61731bbcf9c1f24bac" Dec 05 20:15:30 crc kubenswrapper[4904]: I1205 20:15:30.094218 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-s87dr"] Dec 05 20:15:30 crc kubenswrapper[4904]: I1205 20:15:30.098114 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-s87dr"] Dec 05 20:15:30 crc kubenswrapper[4904]: I1205 20:15:30.111492 4904 scope.go:117] "RemoveContainer" containerID="e6010bc02dc1cb0b219d7f027ab1224de30a26e0cb126f4aa2da7b56e659be8f" 
Dec 05 20:15:30 crc kubenswrapper[4904]: E1205 20:15:30.112008 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6010bc02dc1cb0b219d7f027ab1224de30a26e0cb126f4aa2da7b56e659be8f\": container with ID starting with e6010bc02dc1cb0b219d7f027ab1224de30a26e0cb126f4aa2da7b56e659be8f not found: ID does not exist" containerID="e6010bc02dc1cb0b219d7f027ab1224de30a26e0cb126f4aa2da7b56e659be8f" Dec 05 20:15:30 crc kubenswrapper[4904]: I1205 20:15:30.112031 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6010bc02dc1cb0b219d7f027ab1224de30a26e0cb126f4aa2da7b56e659be8f"} err="failed to get container status \"e6010bc02dc1cb0b219d7f027ab1224de30a26e0cb126f4aa2da7b56e659be8f\": rpc error: code = NotFound desc = could not find container \"e6010bc02dc1cb0b219d7f027ab1224de30a26e0cb126f4aa2da7b56e659be8f\": container with ID starting with e6010bc02dc1cb0b219d7f027ab1224de30a26e0cb126f4aa2da7b56e659be8f not found: ID does not exist" Dec 05 20:15:30 crc kubenswrapper[4904]: I1205 20:15:30.112048 4904 scope.go:117] "RemoveContainer" containerID="9b71ed19dcdc77a65753fb99d5e5e18b9b1751fe81374aab28e6d61b7727514f" Dec 05 20:15:30 crc kubenswrapper[4904]: E1205 20:15:30.112882 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b71ed19dcdc77a65753fb99d5e5e18b9b1751fe81374aab28e6d61b7727514f\": container with ID starting with 9b71ed19dcdc77a65753fb99d5e5e18b9b1751fe81374aab28e6d61b7727514f not found: ID does not exist" containerID="9b71ed19dcdc77a65753fb99d5e5e18b9b1751fe81374aab28e6d61b7727514f" Dec 05 20:15:30 crc kubenswrapper[4904]: I1205 20:15:30.112907 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b71ed19dcdc77a65753fb99d5e5e18b9b1751fe81374aab28e6d61b7727514f"} err="failed to get container status \"9b71ed19dcdc77a65753fb99d5e5e18b9b1751fe81374aab28e6d61b7727514f\": rpc error: code = NotFound desc = could not find container \"9b71ed19dcdc77a65753fb99d5e5e18b9b1751fe81374aab28e6d61b7727514f\": container with ID starting with 9b71ed19dcdc77a65753fb99d5e5e18b9b1751fe81374aab28e6d61b7727514f not found: ID does not exist" Dec 05 20:15:30 crc kubenswrapper[4904]: I1205 20:15:30.112922 4904 scope.go:117] "RemoveContainer" containerID="245ee7f061f7e59aefe537b88c4f00508c01b4c0a49a7a61731bbcf9c1f24bac" Dec 05 20:15:30 crc kubenswrapper[4904]: E1205 20:15:30.113331 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"245ee7f061f7e59aefe537b88c4f00508c01b4c0a49a7a61731bbcf9c1f24bac\": container with ID starting with 245ee7f061f7e59aefe537b88c4f00508c01b4c0a49a7a61731bbcf9c1f24bac not found: ID does not exist" containerID="245ee7f061f7e59aefe537b88c4f00508c01b4c0a49a7a61731bbcf9c1f24bac" Dec 05 20:15:30 crc kubenswrapper[4904]: I1205 20:15:30.113360 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"245ee7f061f7e59aefe537b88c4f00508c01b4c0a49a7a61731bbcf9c1f24bac"} err="failed to get container status \"245ee7f061f7e59aefe537b88c4f00508c01b4c0a49a7a61731bbcf9c1f24bac\": rpc error: code = NotFound desc = could not find container \"245ee7f061f7e59aefe537b88c4f00508c01b4c0a49a7a61731bbcf9c1f24bac\": container with ID starting with 245ee7f061f7e59aefe537b88c4f00508c01b4c0a49a7a61731bbcf9c1f24bac not found: ID does not exist" Dec 05 20:15:30 crc 
kubenswrapper[4904]: I1205 20:15:30.113378 4904 scope.go:117] "RemoveContainer" containerID="5f4c80db8f3cd3cda5f0383f2b0955799e80a6e381295eb277d59e5136603413" Dec 05 20:15:30 crc kubenswrapper[4904]: I1205 20:15:30.121924 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sr8r7"] Dec 05 20:15:30 crc kubenswrapper[4904]: I1205 20:15:30.124170 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sr8r7"] Dec 05 20:15:30 crc kubenswrapper[4904]: I1205 20:15:30.129099 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-87qk8"] Dec 05 20:15:30 crc kubenswrapper[4904]: I1205 20:15:30.129502 4904 scope.go:117] "RemoveContainer" containerID="b1b2fe9492212318d9bda408230a8dbdfdc493c061b9e5275f9bd08381363cf3" Dec 05 20:15:30 crc kubenswrapper[4904]: I1205 20:15:30.130815 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-87qk8"] Dec 05 20:15:30 crc kubenswrapper[4904]: I1205 20:15:30.141873 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67485fb985-q9s95"] Dec 05 20:15:30 crc kubenswrapper[4904]: I1205 20:15:30.148560 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67485fb985-q9s95"] Dec 05 20:15:30 crc kubenswrapper[4904]: I1205 20:15:30.154469 4904 scope.go:117] "RemoveContainer" containerID="ccbd208577e829a7a598ede70f204b8b3897c115da383d0f00919ddb82f87bd7" Dec 05 20:15:30 crc kubenswrapper[4904]: I1205 20:15:30.169382 4904 scope.go:117] "RemoveContainer" containerID="5f4c80db8f3cd3cda5f0383f2b0955799e80a6e381295eb277d59e5136603413" Dec 05 20:15:30 crc kubenswrapper[4904]: E1205 20:15:30.170051 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f4c80db8f3cd3cda5f0383f2b0955799e80a6e381295eb277d59e5136603413\": container with ID starting with 5f4c80db8f3cd3cda5f0383f2b0955799e80a6e381295eb277d59e5136603413 not found: ID does not exist" containerID="5f4c80db8f3cd3cda5f0383f2b0955799e80a6e381295eb277d59e5136603413" Dec 05 20:15:30 crc kubenswrapper[4904]: I1205 20:15:30.170106 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f4c80db8f3cd3cda5f0383f2b0955799e80a6e381295eb277d59e5136603413"} err="failed to get container status \"5f4c80db8f3cd3cda5f0383f2b0955799e80a6e381295eb277d59e5136603413\": rpc error: code = NotFound desc = could not find container \"5f4c80db8f3cd3cda5f0383f2b0955799e80a6e381295eb277d59e5136603413\": container with ID starting with 5f4c80db8f3cd3cda5f0383f2b0955799e80a6e381295eb277d59e5136603413 not found: ID does not exist" Dec 05 20:15:30 crc kubenswrapper[4904]: I1205 20:15:30.170137 4904 scope.go:117] "RemoveContainer" containerID="b1b2fe9492212318d9bda408230a8dbdfdc493c061b9e5275f9bd08381363cf3" Dec 05 20:15:30 crc kubenswrapper[4904]: E1205 20:15:30.170841 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1b2fe9492212318d9bda408230a8dbdfdc493c061b9e5275f9bd08381363cf3\": container with ID starting with b1b2fe9492212318d9bda408230a8dbdfdc493c061b9e5275f9bd08381363cf3 not found: ID does not exist" containerID="b1b2fe9492212318d9bda408230a8dbdfdc493c061b9e5275f9bd08381363cf3" Dec 05 20:15:30 crc kubenswrapper[4904]: I1205 20:15:30.170885 4904 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1b2fe9492212318d9bda408230a8dbdfdc493c061b9e5275f9bd08381363cf3"} err="failed to get container status \"b1b2fe9492212318d9bda408230a8dbdfdc493c061b9e5275f9bd08381363cf3\": rpc error: code = NotFound desc = could not find container \"b1b2fe9492212318d9bda408230a8dbdfdc493c061b9e5275f9bd08381363cf3\": container with ID starting with b1b2fe9492212318d9bda408230a8dbdfdc493c061b9e5275f9bd08381363cf3 not found: ID does not exist" Dec 05 20:15:30 crc kubenswrapper[4904]: I1205 20:15:30.170913 4904 scope.go:117] "RemoveContainer" containerID="ccbd208577e829a7a598ede70f204b8b3897c115da383d0f00919ddb82f87bd7" Dec 05 20:15:30 crc kubenswrapper[4904]: E1205 20:15:30.172299 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccbd208577e829a7a598ede70f204b8b3897c115da383d0f00919ddb82f87bd7\": container with ID starting with ccbd208577e829a7a598ede70f204b8b3897c115da383d0f00919ddb82f87bd7 not found: ID does not exist" containerID="ccbd208577e829a7a598ede70f204b8b3897c115da383d0f00919ddb82f87bd7" Dec 05 20:15:30 crc kubenswrapper[4904]: I1205 20:15:30.172332 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccbd208577e829a7a598ede70f204b8b3897c115da383d0f00919ddb82f87bd7"} err="failed to get container status \"ccbd208577e829a7a598ede70f204b8b3897c115da383d0f00919ddb82f87bd7\": rpc error: code = NotFound desc = could not find container \"ccbd208577e829a7a598ede70f204b8b3897c115da383d0f00919ddb82f87bd7\": container with ID starting with ccbd208577e829a7a598ede70f204b8b3897c115da383d0f00919ddb82f87bd7 not found: ID does not exist" Dec 05 20:15:30 crc kubenswrapper[4904]: I1205 20:15:30.172354 4904 scope.go:117] "RemoveContainer" containerID="b700e027309894d7611a6b1df2a865acb52645725c69a51680fc9c4970925830" Dec 05 20:15:30 crc kubenswrapper[4904]: I1205 20:15:30.192013 4904 scope.go:117] "RemoveContainer" containerID="b700e027309894d7611a6b1df2a865acb52645725c69a51680fc9c4970925830" Dec 05 20:15:30 crc kubenswrapper[4904]: E1205 20:15:30.192670 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b700e027309894d7611a6b1df2a865acb52645725c69a51680fc9c4970925830\": container with ID starting with b700e027309894d7611a6b1df2a865acb52645725c69a51680fc9c4970925830 not found: ID does not exist" containerID="b700e027309894d7611a6b1df2a865acb52645725c69a51680fc9c4970925830" Dec 05 20:15:30 crc kubenswrapper[4904]: I1205 20:15:30.192709 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b700e027309894d7611a6b1df2a865acb52645725c69a51680fc9c4970925830"} err="failed to get container status \"b700e027309894d7611a6b1df2a865acb52645725c69a51680fc9c4970925830\": rpc error: code = NotFound desc = could not find container \"b700e027309894d7611a6b1df2a865acb52645725c69a51680fc9c4970925830\": container with ID starting with b700e027309894d7611a6b1df2a865acb52645725c69a51680fc9c4970925830 not found: ID does not exist" Dec 05 20:15:31 crc kubenswrapper[4904]: I1205 20:15:31.021441 4904 generic.go:334] "Generic (PLEG): container finished" podID="1cc24b64-e25f-4b55-9123-295388685e7a" containerID="08368507f1a5c9ee1b93c9a2b111939cbe36a99f87758d0de78b146309871438" exitCode=0 Dec 05 20:15:31 crc kubenswrapper[4904]: I1205 20:15:31.021550 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" event={"ID":"1cc24b64-e25f-4b55-9123-295388685e7a","Type":"ContainerDied","Data":"08368507f1a5c9ee1b93c9a2b111939cbe36a99f87758d0de78b146309871438"} Dec 05 20:15:31 crc kubenswrapper[4904]: I1205 20:15:31.021906 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" event={"ID":"1cc24b64-e25f-4b55-9123-295388685e7a","Type":"ContainerStarted","Data":"124e7b18b562ea9691adfe9edd2ebe0554085fb149cd56e77b617510a754d1be"} Dec 05 20:15:31 crc kubenswrapper[4904]: I1205 20:15:31.166568 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c94b5c6d-w5gqn"] Dec 05 20:15:31 crc kubenswrapper[4904]: E1205 20:15:31.167178 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed3e3607-e411-47d4-8a20-cbd6623ec89e" containerName="extract-content" Dec 05 20:15:31 crc kubenswrapper[4904]: I1205 20:15:31.167201 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed3e3607-e411-47d4-8a20-cbd6623ec89e" containerName="extract-content" Dec 05 20:15:31 crc kubenswrapper[4904]: E1205 20:15:31.167214 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ea52ad5-455a-4f4b-a576-fa76c35bcbea" containerName="registry-server" Dec 05 20:15:31 crc kubenswrapper[4904]: I1205 20:15:31.167223 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ea52ad5-455a-4f4b-a576-fa76c35bcbea" containerName="registry-server" Dec 05 20:15:31 crc kubenswrapper[4904]: E1205 20:15:31.167237 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ea52ad5-455a-4f4b-a576-fa76c35bcbea" containerName="extract-utilities" Dec 05 20:15:31 crc kubenswrapper[4904]: I1205 20:15:31.167245 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ea52ad5-455a-4f4b-a576-fa76c35bcbea" containerName="extract-utilities" Dec 05 20:15:31 crc kubenswrapper[4904]: E1205 20:15:31.167259 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed3e3607-e411-47d4-8a20-cbd6623ec89e" containerName="registry-server" Dec 05 20:15:31 crc kubenswrapper[4904]: I1205 20:15:31.167266 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed3e3607-e411-47d4-8a20-cbd6623ec89e" containerName="registry-server" Dec 05 20:15:31 crc kubenswrapper[4904]: E1205 20:15:31.167280 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed3e3607-e411-47d4-8a20-cbd6623ec89e" containerName="extract-utilities" Dec 05 20:15:31 crc kubenswrapper[4904]: I1205 20:15:31.167288 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed3e3607-e411-47d4-8a20-cbd6623ec89e" containerName="extract-utilities" Dec 05 20:15:31 crc kubenswrapper[4904]: E1205 20:15:31.167298 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ea52ad5-455a-4f4b-a576-fa76c35bcbea" containerName="extract-content" Dec 05 20:15:31 crc kubenswrapper[4904]: I1205 20:15:31.167306 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ea52ad5-455a-4f4b-a576-fa76c35bcbea" containerName="extract-content" Dec 05 20:15:31 crc kubenswrapper[4904]: I1205 20:15:31.167426 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ea52ad5-455a-4f4b-a576-fa76c35bcbea" containerName="registry-server" Dec 05 20:15:31 crc kubenswrapper[4904]: I1205 20:15:31.167436 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed3e3607-e411-47d4-8a20-cbd6623ec89e" containerName="registry-server" Dec 05 
20:15:31 crc kubenswrapper[4904]: I1205 20:15:31.167875 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c94b5c6d-w5gqn" Dec 05 20:15:31 crc kubenswrapper[4904]: I1205 20:15:31.169896 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 05 20:15:31 crc kubenswrapper[4904]: I1205 20:15:31.171743 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 05 20:15:31 crc kubenswrapper[4904]: I1205 20:15:31.172024 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 05 20:15:31 crc kubenswrapper[4904]: I1205 20:15:31.172616 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 05 20:15:31 crc kubenswrapper[4904]: I1205 20:15:31.172865 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 05 20:15:31 crc kubenswrapper[4904]: I1205 20:15:31.173054 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 05 20:15:31 crc kubenswrapper[4904]: I1205 20:15:31.182190 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c94b5c6d-w5gqn"] Dec 05 20:15:31 crc kubenswrapper[4904]: I1205 20:15:31.220927 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4e8305b0-4e62-4d33-b963-02b559fdb553-client-ca\") pod \"route-controller-manager-c94b5c6d-w5gqn\" (UID: \"4e8305b0-4e62-4d33-b963-02b559fdb553\") " pod="openshift-route-controller-manager/route-controller-manager-c94b5c6d-w5gqn" Dec 05 20:15:31 crc kubenswrapper[4904]: I1205 20:15:31.221019 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e8305b0-4e62-4d33-b963-02b559fdb553-serving-cert\") pod \"route-controller-manager-c94b5c6d-w5gqn\" (UID: \"4e8305b0-4e62-4d33-b963-02b559fdb553\") " pod="openshift-route-controller-manager/route-controller-manager-c94b5c6d-w5gqn" Dec 05 20:15:31 crc kubenswrapper[4904]: I1205 20:15:31.234863 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e8305b0-4e62-4d33-b963-02b559fdb553-config\") pod \"route-controller-manager-c94b5c6d-w5gqn\" (UID: \"4e8305b0-4e62-4d33-b963-02b559fdb553\") " pod="openshift-route-controller-manager/route-controller-manager-c94b5c6d-w5gqn" Dec 05 20:15:31 crc kubenswrapper[4904]: I1205 20:15:31.235032 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk7bs\" (UniqueName: \"kubernetes.io/projected/4e8305b0-4e62-4d33-b963-02b559fdb553-kube-api-access-mk7bs\") pod \"route-controller-manager-c94b5c6d-w5gqn\" (UID: \"4e8305b0-4e62-4d33-b963-02b559fdb553\") " pod="openshift-route-controller-manager/route-controller-manager-c94b5c6d-w5gqn" Dec 05 20:15:31 crc kubenswrapper[4904]: I1205 20:15:31.245267 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jrg9c" Dec 05 
20:15:31 crc kubenswrapper[4904]: I1205 20:15:31.285812 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jrg9c" Dec 05 20:15:31 crc kubenswrapper[4904]: I1205 20:15:31.336177 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mk7bs\" (UniqueName: \"kubernetes.io/projected/4e8305b0-4e62-4d33-b963-02b559fdb553-kube-api-access-mk7bs\") pod \"route-controller-manager-c94b5c6d-w5gqn\" (UID: \"4e8305b0-4e62-4d33-b963-02b559fdb553\") " pod="openshift-route-controller-manager/route-controller-manager-c94b5c6d-w5gqn" Dec 05 20:15:31 crc kubenswrapper[4904]: I1205 20:15:31.336270 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4e8305b0-4e62-4d33-b963-02b559fdb553-client-ca\") pod \"route-controller-manager-c94b5c6d-w5gqn\" (UID: \"4e8305b0-4e62-4d33-b963-02b559fdb553\") " pod="openshift-route-controller-manager/route-controller-manager-c94b5c6d-w5gqn" Dec 05 20:15:31 crc kubenswrapper[4904]: I1205 20:15:31.336333 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e8305b0-4e62-4d33-b963-02b559fdb553-serving-cert\") pod \"route-controller-manager-c94b5c6d-w5gqn\" (UID: \"4e8305b0-4e62-4d33-b963-02b559fdb553\") " pod="openshift-route-controller-manager/route-controller-manager-c94b5c6d-w5gqn" Dec 05 20:15:31 crc kubenswrapper[4904]: I1205 20:15:31.337314 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4e8305b0-4e62-4d33-b963-02b559fdb553-client-ca\") pod \"route-controller-manager-c94b5c6d-w5gqn\" (UID: \"4e8305b0-4e62-4d33-b963-02b559fdb553\") " pod="openshift-route-controller-manager/route-controller-manager-c94b5c6d-w5gqn" Dec 05 20:15:31 crc kubenswrapper[4904]: I1205 20:15:31.337407 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e8305b0-4e62-4d33-b963-02b559fdb553-config\") pod \"route-controller-manager-c94b5c6d-w5gqn\" (UID: \"4e8305b0-4e62-4d33-b963-02b559fdb553\") " pod="openshift-route-controller-manager/route-controller-manager-c94b5c6d-w5gqn" Dec 05 20:15:31 crc kubenswrapper[4904]: I1205 20:15:31.338364 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e8305b0-4e62-4d33-b963-02b559fdb553-config\") pod \"route-controller-manager-c94b5c6d-w5gqn\" (UID: \"4e8305b0-4e62-4d33-b963-02b559fdb553\") " pod="openshift-route-controller-manager/route-controller-manager-c94b5c6d-w5gqn" Dec 05 20:15:31 crc kubenswrapper[4904]: I1205 20:15:31.342683 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e8305b0-4e62-4d33-b963-02b559fdb553-serving-cert\") pod \"route-controller-manager-c94b5c6d-w5gqn\" (UID: \"4e8305b0-4e62-4d33-b963-02b559fdb553\") " pod="openshift-route-controller-manager/route-controller-manager-c94b5c6d-w5gqn" Dec 05 20:15:31 crc kubenswrapper[4904]: I1205 20:15:31.355500 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk7bs\" (UniqueName: \"kubernetes.io/projected/4e8305b0-4e62-4d33-b963-02b559fdb553-kube-api-access-mk7bs\") pod \"route-controller-manager-c94b5c6d-w5gqn\" (UID: \"4e8305b0-4e62-4d33-b963-02b559fdb553\") " 
pod="openshift-route-controller-manager/route-controller-manager-c94b5c6d-w5gqn" Dec 05 20:15:31 crc kubenswrapper[4904]: I1205 20:15:31.519341 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c94b5c6d-w5gqn" Dec 05 20:15:31 crc kubenswrapper[4904]: I1205 20:15:31.700265 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ea52ad5-455a-4f4b-a576-fa76c35bcbea" path="/var/lib/kubelet/pods/2ea52ad5-455a-4f4b-a576-fa76c35bcbea/volumes" Dec 05 20:15:31 crc kubenswrapper[4904]: I1205 20:15:31.703652 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65a40f5b-e76f-4c3b-a50a-ef1641ea6a68" path="/var/lib/kubelet/pods/65a40f5b-e76f-4c3b-a50a-ef1641ea6a68/volumes" Dec 05 20:15:31 crc kubenswrapper[4904]: I1205 20:15:31.704674 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9" path="/var/lib/kubelet/pods/afe37ec9-fff6-4d9b-a7ae-37114bfcd9c9/volumes" Dec 05 20:15:31 crc kubenswrapper[4904]: I1205 20:15:31.705656 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed3e3607-e411-47d4-8a20-cbd6623ec89e" path="/var/lib/kubelet/pods/ed3e3607-e411-47d4-8a20-cbd6623ec89e/volumes" Dec 05 20:15:31 crc kubenswrapper[4904]: I1205 20:15:31.912760 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c94b5c6d-w5gqn"] Dec 05 20:15:31 crc kubenswrapper[4904]: W1205 20:15:31.924186 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e8305b0_4e62_4d33_b963_02b559fdb553.slice/crio-1df3ab6fb7cf6ec5df1c9db255ddee4f80360a3e0a31a3300e2b181fc32bd1ac WatchSource:0}: Error finding container 1df3ab6fb7cf6ec5df1c9db255ddee4f80360a3e0a31a3300e2b181fc32bd1ac: Status 404 returned error can't find the container with id 1df3ab6fb7cf6ec5df1c9db255ddee4f80360a3e0a31a3300e2b181fc32bd1ac Dec 05 20:15:32 crc kubenswrapper[4904]: I1205 20:15:32.035080 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c94b5c6d-w5gqn" event={"ID":"4e8305b0-4e62-4d33-b963-02b559fdb553","Type":"ContainerStarted","Data":"1df3ab6fb7cf6ec5df1c9db255ddee4f80360a3e0a31a3300e2b181fc32bd1ac"} Dec 05 20:15:32 crc kubenswrapper[4904]: I1205 20:15:32.875507 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-plmk6"] Dec 05 20:15:32 crc kubenswrapper[4904]: I1205 20:15:32.876103 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-plmk6" podUID="9ef986b1-43e3-4c24-87d0-f385d1ae80bc" containerName="registry-server" containerID="cri-o://57eca7a550c07302289048bf4f61bb10a85f37651d40975a2cdad3f7da617145" gracePeriod=2 Dec 05 20:15:33 crc kubenswrapper[4904]: I1205 20:15:33.043643 4904 generic.go:334] "Generic (PLEG): container finished" podID="9ef986b1-43e3-4c24-87d0-f385d1ae80bc" containerID="57eca7a550c07302289048bf4f61bb10a85f37651d40975a2cdad3f7da617145" exitCode=0 Dec 05 20:15:33 crc kubenswrapper[4904]: I1205 20:15:33.043708 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-plmk6" event={"ID":"9ef986b1-43e3-4c24-87d0-f385d1ae80bc","Type":"ContainerDied","Data":"57eca7a550c07302289048bf4f61bb10a85f37651d40975a2cdad3f7da617145"} Dec 05 20:15:33 crc kubenswrapper[4904]: 
I1205 20:15:33.045904 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c94b5c6d-w5gqn" event={"ID":"4e8305b0-4e62-4d33-b963-02b559fdb553","Type":"ContainerStarted","Data":"e8a91724deaa0d7a5cc55c3393636d50ac2149b310be36205786aebc76645b16"} Dec 05 20:15:33 crc kubenswrapper[4904]: I1205 20:15:33.046199 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-c94b5c6d-w5gqn" Dec 05 20:15:33 crc kubenswrapper[4904]: I1205 20:15:33.053084 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-c94b5c6d-w5gqn" Dec 05 20:15:33 crc kubenswrapper[4904]: I1205 20:15:33.066348 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-c94b5c6d-w5gqn" podStartSLOduration=11.066325273 podStartE2EDuration="11.066325273s" podCreationTimestamp="2025-12-05 20:15:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:15:33.065328186 +0000 UTC m=+231.876544315" watchObservedRunningTime="2025-12-05 20:15:33.066325273 +0000 UTC m=+231.877541382" Dec 05 20:15:33 crc kubenswrapper[4904]: I1205 20:15:33.418518 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-plmk6" Dec 05 20:15:33 crc kubenswrapper[4904]: I1205 20:15:33.581860 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vl7gj\" (UniqueName: \"kubernetes.io/projected/9ef986b1-43e3-4c24-87d0-f385d1ae80bc-kube-api-access-vl7gj\") pod \"9ef986b1-43e3-4c24-87d0-f385d1ae80bc\" (UID: \"9ef986b1-43e3-4c24-87d0-f385d1ae80bc\") " Dec 05 20:15:33 crc kubenswrapper[4904]: I1205 20:15:33.581988 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ef986b1-43e3-4c24-87d0-f385d1ae80bc-utilities\") pod \"9ef986b1-43e3-4c24-87d0-f385d1ae80bc\" (UID: \"9ef986b1-43e3-4c24-87d0-f385d1ae80bc\") " Dec 05 20:15:33 crc kubenswrapper[4904]: I1205 20:15:33.582032 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ef986b1-43e3-4c24-87d0-f385d1ae80bc-catalog-content\") pod \"9ef986b1-43e3-4c24-87d0-f385d1ae80bc\" (UID: \"9ef986b1-43e3-4c24-87d0-f385d1ae80bc\") " Dec 05 20:15:33 crc kubenswrapper[4904]: I1205 20:15:33.585037 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ef986b1-43e3-4c24-87d0-f385d1ae80bc-utilities" (OuterVolumeSpecName: "utilities") pod "9ef986b1-43e3-4c24-87d0-f385d1ae80bc" (UID: "9ef986b1-43e3-4c24-87d0-f385d1ae80bc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:15:33 crc kubenswrapper[4904]: I1205 20:15:33.598262 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ef986b1-43e3-4c24-87d0-f385d1ae80bc-kube-api-access-vl7gj" (OuterVolumeSpecName: "kube-api-access-vl7gj") pod "9ef986b1-43e3-4c24-87d0-f385d1ae80bc" (UID: "9ef986b1-43e3-4c24-87d0-f385d1ae80bc"). InnerVolumeSpecName "kube-api-access-vl7gj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:15:33 crc kubenswrapper[4904]: I1205 20:15:33.635357 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ef986b1-43e3-4c24-87d0-f385d1ae80bc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9ef986b1-43e3-4c24-87d0-f385d1ae80bc" (UID: "9ef986b1-43e3-4c24-87d0-f385d1ae80bc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:15:33 crc kubenswrapper[4904]: I1205 20:15:33.683959 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vl7gj\" (UniqueName: \"kubernetes.io/projected/9ef986b1-43e3-4c24-87d0-f385d1ae80bc-kube-api-access-vl7gj\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:33 crc kubenswrapper[4904]: I1205 20:15:33.684015 4904 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ef986b1-43e3-4c24-87d0-f385d1ae80bc-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:33 crc kubenswrapper[4904]: I1205 20:15:33.684029 4904 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ef986b1-43e3-4c24-87d0-f385d1ae80bc-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:34 crc kubenswrapper[4904]: I1205 20:15:34.054774 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-plmk6" event={"ID":"9ef986b1-43e3-4c24-87d0-f385d1ae80bc","Type":"ContainerDied","Data":"d40ba7389acd9ba88690544b173451d098a0f047d7b9d1e8ae942d479cff1bad"} Dec 05 20:15:34 crc kubenswrapper[4904]: I1205 20:15:34.054821 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-plmk6" Dec 05 20:15:34 crc kubenswrapper[4904]: I1205 20:15:34.055046 4904 scope.go:117] "RemoveContainer" containerID="57eca7a550c07302289048bf4f61bb10a85f37651d40975a2cdad3f7da617145" Dec 05 20:15:34 crc kubenswrapper[4904]: I1205 20:15:34.071791 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-plmk6"] Dec 05 20:15:34 crc kubenswrapper[4904]: I1205 20:15:34.076693 4904 scope.go:117] "RemoveContainer" containerID="f547825116d98eb4340141392d9e288a77375530b1d6f9182462333f4f615c40" Dec 05 20:15:34 crc kubenswrapper[4904]: I1205 20:15:34.078347 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-plmk6"] Dec 05 20:15:34 crc kubenswrapper[4904]: I1205 20:15:34.090777 4904 scope.go:117] "RemoveContainer" containerID="5bfd10e848c36742e6a05d12f89ad1b0b1a399b6abaefa93658d5a28398d8f64" Dec 05 20:15:35 crc kubenswrapper[4904]: I1205 20:15:35.687782 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ef986b1-43e3-4c24-87d0-f385d1ae80bc" path="/var/lib/kubelet/pods/9ef986b1-43e3-4c24-87d0-f385d1ae80bc/volumes" Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.147837 4904 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 05 20:15:36 crc kubenswrapper[4904]: E1205 20:15:36.148306 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ef986b1-43e3-4c24-87d0-f385d1ae80bc" containerName="extract-content" Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.148318 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ef986b1-43e3-4c24-87d0-f385d1ae80bc" containerName="extract-content" Dec 05 
Dec 05 20:15:36 crc kubenswrapper[4904]: E1205 20:15:36.148329 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ef986b1-43e3-4c24-87d0-f385d1ae80bc" containerName="registry-server"
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.148334 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ef986b1-43e3-4c24-87d0-f385d1ae80bc" containerName="registry-server"
Dec 05 20:15:36 crc kubenswrapper[4904]: E1205 20:15:36.148348 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ef986b1-43e3-4c24-87d0-f385d1ae80bc" containerName="extract-utilities"
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.148361 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ef986b1-43e3-4c24-87d0-f385d1ae80bc" containerName="extract-utilities"
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.148451 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ef986b1-43e3-4c24-87d0-f385d1ae80bc" containerName="registry-server"
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.148781 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.149746 4904 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.150047 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899" gracePeriod=15
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.150127 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46" gracePeriod=15
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.150116 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9" gracePeriod=15
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.150164 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://a972523cbe1d769fb13c365f030eab38907c5f877d667e034c62c5f3a0e43316" gracePeriod=15
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.150188 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173" gracePeriod=15
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.152410 4904 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Dec 05 20:15:36 crc kubenswrapper[4904]: E1205 20:15:36.152663 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.152680 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Dec 05 20:15:36 crc kubenswrapper[4904]: E1205 20:15:36.152697 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.152706 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Dec 05 20:15:36 crc kubenswrapper[4904]: E1205 20:15:36.152721 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.152729 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Dec 05 20:15:36 crc kubenswrapper[4904]: E1205 20:15:36.152737 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.152744 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Dec 05 20:15:36 crc kubenswrapper[4904]: E1205 20:15:36.152754 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.152763 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Dec 05 20:15:36 crc kubenswrapper[4904]: E1205 20:15:36.152779 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.152786 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Dec 05 20:15:36 crc kubenswrapper[4904]: E1205 20:15:36.152796 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.152803 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Dec 05 20:15:36 crc kubenswrapper[4904]: E1205 20:15:36.152818 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.152827 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.152962 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.152979 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.152990 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.153000 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.153008 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.153019 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.153240 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.175342 4904 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Dec 05 20:15:36 crc kubenswrapper[4904]: [+]log ok
Dec 05 20:15:36 crc kubenswrapper[4904]: [+]api-openshift-apiserver-available ok
Dec 05 20:15:36 crc kubenswrapper[4904]: [+]api-openshift-oauth-apiserver-available ok
Dec 05 20:15:36 crc kubenswrapper[4904]: [+]informer-sync ok
Dec 05 20:15:36 crc kubenswrapper[4904]: [+]poststarthook/start-apiserver-admission-initializer ok
Dec 05 20:15:36 crc kubenswrapper[4904]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Dec 05 20:15:36 crc kubenswrapper[4904]: [+]poststarthook/openshift.io-api-request-count-filter ok
Dec 05 20:15:36 crc kubenswrapper[4904]: [+]poststarthook/openshift.io-startkubeinformers ok
Dec 05 20:15:36 crc kubenswrapper[4904]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok
Dec 05 20:15:36 crc kubenswrapper[4904]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok
Dec 05 20:15:36 crc kubenswrapper[4904]: [+]poststarthook/generic-apiserver-start-informers ok
Dec 05 20:15:36 crc kubenswrapper[4904]: [+]poststarthook/priority-and-fairness-config-consumer ok
Dec 05 20:15:36 crc kubenswrapper[4904]: [+]poststarthook/priority-and-fairness-filter ok
Dec 05 20:15:36 crc kubenswrapper[4904]: [+]poststarthook/storage-object-count-tracker-hook ok
Dec 05 20:15:36 crc kubenswrapper[4904]: [+]poststarthook/start-apiextensions-informers ok
Dec 05 20:15:36 crc kubenswrapper[4904]: [+]poststarthook/start-apiextensions-controllers ok
Dec 05 20:15:36 crc kubenswrapper[4904]: [+]poststarthook/crd-informer-synced ok
Dec 05 20:15:36 crc kubenswrapper[4904]: [+]poststarthook/start-system-namespaces-controller ok
Dec 05 20:15:36 crc kubenswrapper[4904]: [+]poststarthook/start-cluster-authentication-info-controller ok
Dec 05 20:15:36 crc kubenswrapper[4904]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok
Dec 05 20:15:36 crc kubenswrapper[4904]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
Dec 05 20:15:36 crc kubenswrapper[4904]: [+]poststarthook/start-legacy-token-tracking-controller ok
Dec 05 20:15:36 crc kubenswrapper[4904]: [+]poststarthook/start-service-ip-repair-controllers ok
Dec 05 20:15:36 crc kubenswrapper[4904]: [+]poststarthook/rbac/bootstrap-roles ok
Dec 05 20:15:36 crc kubenswrapper[4904]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok
Dec 05 20:15:36 crc kubenswrapper[4904]: [+]poststarthook/priority-and-fairness-config-producer ok
Dec 05 20:15:36 crc kubenswrapper[4904]: [+]poststarthook/bootstrap-controller ok
Dec 05 20:15:36 crc kubenswrapper[4904]: [+]poststarthook/aggregator-reload-proxy-client-cert ok
Dec 05 20:15:36 crc kubenswrapper[4904]: [+]poststarthook/start-kube-aggregator-informers ok
Dec 05 20:15:36 crc kubenswrapper[4904]: [+]poststarthook/apiservice-status-local-available-controller ok
Dec 05 20:15:36 crc kubenswrapper[4904]: [+]poststarthook/apiservice-status-remote-available-controller ok
Dec 05 20:15:36 crc kubenswrapper[4904]: [+]poststarthook/apiservice-registration-controller ok
Dec 05 20:15:36 crc kubenswrapper[4904]: [+]poststarthook/apiservice-wait-for-first-sync ok
Dec 05 20:15:36 crc kubenswrapper[4904]: [+]poststarthook/apiservice-discovery-controller ok
Dec 05 20:15:36 crc kubenswrapper[4904]: [+]poststarthook/kube-apiserver-autoregistration ok
Dec 05 20:15:36 crc kubenswrapper[4904]: [+]autoregister-completion ok
Dec 05 20:15:36 crc kubenswrapper[4904]: [+]poststarthook/apiservice-openapi-controller ok
Dec 05 20:15:36 crc kubenswrapper[4904]: [+]poststarthook/apiservice-openapiv3-controller ok
Dec 05 20:15:36 crc kubenswrapper[4904]: [-]shutdown failed: reason withheld
Dec 05 20:15:36 crc kubenswrapper[4904]: readyz check failed
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.175412 4904 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.191732 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.217407 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.217459 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.217482 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
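Note: the probe failure above is the apiserver's aggregated readyz report. Each individual check is printed as [+]name ok or [-]name failed, and a single failing check (here [-]shutdown, during the graceful restart) turns the whole probe into an HTTP 500. A minimal sketch of querying a verbose readyz endpoint and listing the failing checks; the URL and the insecure TLS config are illustrative assumptions for a throwaway diagnostic, not production settings:

```go
package main

import (
	"bufio"
	"crypto/tls"
	"fmt"
	"net/http"
	"strings"
)

// checkReadyz fetches a verbose readyz endpoint and reports which
// individual checks failed, mirroring how the kubelet's probe output
// above should be read.
func checkReadyz(url string) ([]string, int, error) {
	client := &http.Client{Transport: &http.Transport{
		TLSClientConfig: &tls.Config{InsecureSkipVerify: true}, // demo only
	}}
	resp, err := client.Get(url)
	if err != nil {
		return nil, 0, err
	}
	defer resp.Body.Close()

	var failed []string
	sc := bufio.NewScanner(resp.Body)
	for sc.Scan() {
		line := strings.TrimSpace(sc.Text())
		// Passing checks are "[+]name ok"; failing ones start "[-]".
		if strings.HasPrefix(line, "[-]") {
			failed = append(failed, line)
		}
	}
	return failed, resp.StatusCode, sc.Err()
}

func main() {
	// Endpoint is an assumption; any kube-apiserver exposes /readyz?verbose.
	failed, code, err := checkReadyz("https://api-int.crc.testing:6443/readyz?verbose")
	if err != nil {
		fmt.Println("probe error:", err)
		return
	}
	// As in the log: any [-] check yields a non-200 status.
	fmt.Printf("status=%d failing=%v\n", code, failed)
}
```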
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.217497 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.217516 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.217537 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.217563 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.217595 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.320142 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.320139 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.320245 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.320277 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.320283 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.320305 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.320382 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.320355 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.320460 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.320477 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.320506 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.320490 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.320578 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.320525 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.320721 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.320932 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 20:15:36 crc kubenswrapper[4904]: I1205 20:15:36.487924 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 20:15:36 crc kubenswrapper[4904]: W1205 20:15:36.505516 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-310b311d5e23f73e1fa1914b3df262071e00b13a402efabc4c9a947b817c9ca8 WatchSource:0}: Error finding container 310b311d5e23f73e1fa1914b3df262071e00b13a402efabc4c9a947b817c9ca8: Status 404 returned error can't find the container with id 310b311d5e23f73e1fa1914b3df262071e00b13a402efabc4c9a947b817c9ca8
Dec 05 20:15:36 crc kubenswrapper[4904]: E1205 20:15:36.508087 4904 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.166:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187e6afc7543c910 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-05 20:15:36.507492624 +0000 UTC m=+235.318708733,LastTimestamp:2025-12-05 20:15:36.507492624 +0000 UTC m=+235.318708733,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Dec 05 20:15:37 crc kubenswrapper[4904]: I1205 20:15:37.074732 4904 generic.go:334] "Generic (PLEG): container finished" podID="01612d5b-cc85-4376-a8a4-c1fbe2ccabd9" containerID="11c3fb8aab8fb03ee3058cc38f7911521e7aea209a63e9b7ddff1afd7a72e300" exitCode=0
Dec 05 20:15:37 crc kubenswrapper[4904]: I1205 20:15:37.074965 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"01612d5b-cc85-4376-a8a4-c1fbe2ccabd9","Type":"ContainerDied","Data":"11c3fb8aab8fb03ee3058cc38f7911521e7aea209a63e9b7ddff1afd7a72e300"}
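Note: the "Unable to write event (may retry after sleeping)" error above is the kubelet failing to POST an Event while the apiserver is down; the identical event is posted again a few seconds later (see 20:15:42 below). A minimal sketch of that record-then-retry-after-sleep pattern; the function names, attempt count, and sleep duration are illustrative assumptions, not the client-go implementation:

```go
package main

import (
	"errors"
	"fmt"
	"time"
)

// postEvent stands in for the HTTP POST to
// /api/v1/namespaces/<ns>/events that fails above while the
// apiserver restarts.
func postEvent(attempt int) error {
	if attempt < 2 { // pretend the apiserver is still down
		return errors.New("dial tcp: connect: connection refused")
	}
	return nil
}

// recordEvent retries a bounded number of times, sleeping between
// attempts, which is the behaviour the log message hints at.
func recordEvent(maxAttempts int, sleep time.Duration) error {
	var err error
	for attempt := 0; attempt < maxAttempts; attempt++ {
		if err = postEvent(attempt); err == nil {
			return nil
		}
		fmt.Printf("Unable to write event (may retry after sleeping): %v\n", err)
		time.Sleep(sleep)
	}
	return err
}

func main() {
	if err := recordEvent(3, 100*time.Millisecond); err != nil {
		fmt.Println("giving up:", err)
	} else {
		fmt.Println("event recorded")
	}
}
```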
Dec 05 20:15:37 crc kubenswrapper[4904]: I1205 20:15:37.075696 4904 status_manager.go:851] "Failed to get status for pod" podUID="01612d5b-cc85-4376-a8a4-c1fbe2ccabd9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused"
Dec 05 20:15:37 crc kubenswrapper[4904]: I1205 20:15:37.076128 4904 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.166:6443: connect: connection refused"
Dec 05 20:15:37 crc kubenswrapper[4904]: I1205 20:15:37.076441 4904 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused"
Dec 05 20:15:37 crc kubenswrapper[4904]: I1205 20:15:37.077966 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Dec 05 20:15:37 crc kubenswrapper[4904]: I1205 20:15:37.079591 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Dec 05 20:15:37 crc kubenswrapper[4904]: I1205 20:15:37.080588 4904 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a972523cbe1d769fb13c365f030eab38907c5f877d667e034c62c5f3a0e43316" exitCode=0
Dec 05 20:15:37 crc kubenswrapper[4904]: I1205 20:15:37.080612 4904 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899" exitCode=0
Dec 05 20:15:37 crc kubenswrapper[4904]: I1205 20:15:37.080622 4904 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9" exitCode=0
Dec 05 20:15:37 crc kubenswrapper[4904]: I1205 20:15:37.080630 4904 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173" exitCode=2
Dec 05 20:15:37 crc kubenswrapper[4904]: I1205 20:15:37.080691 4904 scope.go:117] "RemoveContainer" containerID="eaedab35f18f416d6f9be2e86206bdd1fff404a401ba12d1b5efce32349a1ffe"
Dec 05 20:15:37 crc kubenswrapper[4904]: I1205 20:15:37.082693 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"f9a9ab07f8c32a7927a4def344ba7542179305cde70f06a0316d4edd3f91620b"}
Dec 05 20:15:37 crc kubenswrapper[4904]: I1205 20:15:37.082720 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"310b311d5e23f73e1fa1914b3df262071e00b13a402efabc4c9a947b817c9ca8"}
Dec 05 20:15:37 crc kubenswrapper[4904]: I1205 20:15:37.083913 4904 status_manager.go:851] "Failed to get status for pod" podUID="01612d5b-cc85-4376-a8a4-c1fbe2ccabd9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused"
Dec 05 20:15:37 crc kubenswrapper[4904]: I1205 20:15:37.084095 4904 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.166:6443: connect: connection refused"
Dec 05 20:15:37 crc kubenswrapper[4904]: I1205 20:15:37.084308 4904 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused"
Dec 05 20:15:38 crc kubenswrapper[4904]: I1205 20:15:38.091759 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Dec 05 20:15:38 crc kubenswrapper[4904]: I1205 20:15:38.520270 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Dec 05 20:15:38 crc kubenswrapper[4904]: I1205 20:15:38.521342 4904 status_manager.go:851] "Failed to get status for pod" podUID="01612d5b-cc85-4376-a8a4-c1fbe2ccabd9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused"
Dec 05 20:15:38 crc kubenswrapper[4904]: I1205 20:15:38.521681 4904 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused"
Dec 05 20:15:38 crc kubenswrapper[4904]: I1205 20:15:38.525963 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Dec 05 20:15:38 crc kubenswrapper[4904]: I1205 20:15:38.527898 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 20:15:38 crc kubenswrapper[4904]: I1205 20:15:38.528581 4904 status_manager.go:851] "Failed to get status for pod" podUID="01612d5b-cc85-4376-a8a4-c1fbe2ccabd9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused"
Dec 05 20:15:38 crc kubenswrapper[4904]: I1205 20:15:38.529045 4904 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.166:6443: connect: connection refused"
Dec 05 20:15:38 crc kubenswrapper[4904]: I1205 20:15:38.529547 4904 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused"
Dec 05 20:15:38 crc kubenswrapper[4904]: I1205 20:15:38.547149 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/01612d5b-cc85-4376-a8a4-c1fbe2ccabd9-kubelet-dir\") pod \"01612d5b-cc85-4376-a8a4-c1fbe2ccabd9\" (UID: \"01612d5b-cc85-4376-a8a4-c1fbe2ccabd9\") "
Dec 05 20:15:38 crc kubenswrapper[4904]: I1205 20:15:38.547198 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Dec 05 20:15:38 crc kubenswrapper[4904]: I1205 20:15:38.547240 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/01612d5b-cc85-4376-a8a4-c1fbe2ccabd9-var-lock\") pod \"01612d5b-cc85-4376-a8a4-c1fbe2ccabd9\" (UID: \"01612d5b-cc85-4376-a8a4-c1fbe2ccabd9\") "
Dec 05 20:15:38 crc kubenswrapper[4904]: I1205 20:15:38.547236 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01612d5b-cc85-4376-a8a4-c1fbe2ccabd9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "01612d5b-cc85-4376-a8a4-c1fbe2ccabd9" (UID: "01612d5b-cc85-4376-a8a4-c1fbe2ccabd9"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 05 20:15:38 crc kubenswrapper[4904]: I1205 20:15:38.547254 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Dec 05 20:15:38 crc kubenswrapper[4904]: I1205 20:15:38.547279 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 05 20:15:38 crc kubenswrapper[4904]: I1205 20:15:38.547306 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01612d5b-cc85-4376-a8a4-c1fbe2ccabd9-var-lock" (OuterVolumeSpecName: "var-lock") pod "01612d5b-cc85-4376-a8a4-c1fbe2ccabd9" (UID: "01612d5b-cc85-4376-a8a4-c1fbe2ccabd9"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 05 20:15:38 crc kubenswrapper[4904]: I1205 20:15:38.547293 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 05 20:15:38 crc kubenswrapper[4904]: I1205 20:15:38.547363 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/01612d5b-cc85-4376-a8a4-c1fbe2ccabd9-kube-api-access\") pod \"01612d5b-cc85-4376-a8a4-c1fbe2ccabd9\" (UID: \"01612d5b-cc85-4376-a8a4-c1fbe2ccabd9\") "
Dec 05 20:15:38 crc kubenswrapper[4904]: I1205 20:15:38.547407 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Dec 05 20:15:38 crc kubenswrapper[4904]: I1205 20:15:38.547495 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 05 20:15:38 crc kubenswrapper[4904]: I1205 20:15:38.547768 4904 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/01612d5b-cc85-4376-a8a4-c1fbe2ccabd9-var-lock\") on node \"crc\" DevicePath \"\""
Dec 05 20:15:38 crc kubenswrapper[4904]: I1205 20:15:38.547804 4904 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\""
Dec 05 20:15:38 crc kubenswrapper[4904]: I1205 20:15:38.547815 4904 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\""
Dec 05 20:15:38 crc kubenswrapper[4904]: I1205 20:15:38.547823 4904 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/01612d5b-cc85-4376-a8a4-c1fbe2ccabd9-kubelet-dir\") on node \"crc\" DevicePath \"\""
Dec 05 20:15:38 crc kubenswrapper[4904]: I1205 20:15:38.547830 4904 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\""
Dec 05 20:15:38 crc kubenswrapper[4904]: I1205 20:15:38.552242 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01612d5b-cc85-4376-a8a4-c1fbe2ccabd9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "01612d5b-cc85-4376-a8a4-c1fbe2ccabd9" (UID: "01612d5b-cc85-4376-a8a4-c1fbe2ccabd9"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:15:38 crc kubenswrapper[4904]: I1205 20:15:38.648272 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/01612d5b-cc85-4376-a8a4-c1fbe2ccabd9-kube-api-access\") on node \"crc\" DevicePath \"\""
Dec 05 20:15:39 crc kubenswrapper[4904]: I1205 20:15:39.102844 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Dec 05 20:15:39 crc kubenswrapper[4904]: I1205 20:15:39.103773 4904 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46" exitCode=0
Dec 05 20:15:39 crc kubenswrapper[4904]: I1205 20:15:39.103850 4904 scope.go:117] "RemoveContainer" containerID="a972523cbe1d769fb13c365f030eab38907c5f877d667e034c62c5f3a0e43316"
Dec 05 20:15:39 crc kubenswrapper[4904]: I1205 20:15:39.103902 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 20:15:39 crc kubenswrapper[4904]: I1205 20:15:39.105755 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"01612d5b-cc85-4376-a8a4-c1fbe2ccabd9","Type":"ContainerDied","Data":"12a3952f587f09f806145f7b7303b190ebf667b5f3b8b9800ccf915c39c7b7dd"}
Dec 05 20:15:39 crc kubenswrapper[4904]: I1205 20:15:39.105783 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12a3952f587f09f806145f7b7303b190ebf667b5f3b8b9800ccf915c39c7b7dd"
Dec 05 20:15:39 crc kubenswrapper[4904]: I1205 20:15:39.105833 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Dec 05 20:15:39 crc kubenswrapper[4904]: I1205 20:15:39.122001 4904 scope.go:117] "RemoveContainer" containerID="b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899"
Dec 05 20:15:39 crc kubenswrapper[4904]: I1205 20:15:39.126149 4904 status_manager.go:851] "Failed to get status for pod" podUID="01612d5b-cc85-4376-a8a4-c1fbe2ccabd9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused"
Dec 05 20:15:39 crc kubenswrapper[4904]: I1205 20:15:39.126641 4904 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.166:6443: connect: connection refused"
Dec 05 20:15:39 crc kubenswrapper[4904]: I1205 20:15:39.126998 4904 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused"
Dec 05 20:15:39 crc kubenswrapper[4904]: I1205 20:15:39.132434 4904 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.166:6443: connect: connection refused"
Dec 05 20:15:39 crc kubenswrapper[4904]: I1205 20:15:39.133089 4904 status_manager.go:851] "Failed to get status for pod" podUID="01612d5b-cc85-4376-a8a4-c1fbe2ccabd9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused"
Dec 05 20:15:39 crc kubenswrapper[4904]: I1205 20:15:39.133691 4904 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused"
Dec 05 20:15:39 crc kubenswrapper[4904]: I1205 20:15:39.136376 4904 scope.go:117] "RemoveContainer" containerID="07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9"
Dec 05 20:15:39 crc kubenswrapper[4904]: I1205 20:15:39.152522 4904 scope.go:117] "RemoveContainer" containerID="9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173"
Dec 05 20:15:39 crc kubenswrapper[4904]: I1205 20:15:39.169432 4904 scope.go:117] "RemoveContainer" containerID="097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46"
Dec 05 20:15:39 crc kubenswrapper[4904]: I1205 20:15:39.187425 4904 scope.go:117] "RemoveContainer" containerID="645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969"
Dec 05 20:15:39 crc kubenswrapper[4904]: I1205 20:15:39.207988 4904 scope.go:117] "RemoveContainer" containerID="a972523cbe1d769fb13c365f030eab38907c5f877d667e034c62c5f3a0e43316"
Dec 05 20:15:39 crc kubenswrapper[4904]: E1205 20:15:39.208472 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a972523cbe1d769fb13c365f030eab38907c5f877d667e034c62c5f3a0e43316\": container with ID starting with a972523cbe1d769fb13c365f030eab38907c5f877d667e034c62c5f3a0e43316 not found: ID does not exist" containerID="a972523cbe1d769fb13c365f030eab38907c5f877d667e034c62c5f3a0e43316"
Dec 05 20:15:39 crc kubenswrapper[4904]: I1205 20:15:39.208543 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a972523cbe1d769fb13c365f030eab38907c5f877d667e034c62c5f3a0e43316"} err="failed to get container status \"a972523cbe1d769fb13c365f030eab38907c5f877d667e034c62c5f3a0e43316\": rpc error: code = NotFound desc = could not find container \"a972523cbe1d769fb13c365f030eab38907c5f877d667e034c62c5f3a0e43316\": container with ID starting with a972523cbe1d769fb13c365f030eab38907c5f877d667e034c62c5f3a0e43316 not found: ID does not exist"
Dec 05 20:15:39 crc kubenswrapper[4904]: I1205 20:15:39.208571 4904 scope.go:117] "RemoveContainer" containerID="b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899"
Dec 05 20:15:39 crc kubenswrapper[4904]: E1205 20:15:39.208898 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899\": container with ID starting with b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899 not found: ID does not exist" containerID="b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899"
Dec 05 20:15:39 crc kubenswrapper[4904]: I1205 20:15:39.208922 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899"} err="failed to get container status \"b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899\": rpc error: code = NotFound desc = could not find container \"b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899\": container with ID starting with b22d909d883896d45316ef7882eff8eb83356385d9b75add19f2f6539bc6b899 not found: ID does not exist"
Dec 05 20:15:39 crc kubenswrapper[4904]: I1205 20:15:39.208938 4904 scope.go:117] "RemoveContainer" containerID="07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9"
Dec 05 20:15:39 crc kubenswrapper[4904]: E1205 20:15:39.209324 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9\": container with ID starting with 07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9 not found: ID does not exist" containerID="07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9"
Dec 05 20:15:39 crc kubenswrapper[4904]: I1205 20:15:39.209352 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9"} err="failed to get container status \"07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9\": rpc error: code = NotFound desc = could not find container \"07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9\": container with ID starting with 07ea6290fb6d9c45c6ee91e7b2afc46b9056839cc16737c1aa1b5c63fa2ab2e9 not found: ID does not exist"
Dec 05 20:15:39 crc kubenswrapper[4904]: I1205 20:15:39.209367 4904 scope.go:117] "RemoveContainer" containerID="9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173"
Dec 05 20:15:39 crc kubenswrapper[4904]: E1205 20:15:39.209586 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173\": container with ID starting with 9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173 not found: ID does not exist" containerID="9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173"
Dec 05 20:15:39 crc kubenswrapper[4904]: I1205 20:15:39.209624 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173"} err="failed to get container status \"9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173\": rpc error: code = NotFound desc = could not find container \"9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173\": container with ID starting with 9dd5c7d377d405e6b467d584c63979d87eeb2ae602988030cdd315560b962173 not found: ID does not exist"
Dec 05 20:15:39 crc kubenswrapper[4904]: I1205 20:15:39.209650 4904 scope.go:117] "RemoveContainer" containerID="097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46"
Dec 05 20:15:39 crc kubenswrapper[4904]: E1205 20:15:39.209886 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46\": container with ID starting with 097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46 not found: ID does not exist" containerID="097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46"
Dec 05 20:15:39 crc kubenswrapper[4904]: I1205 20:15:39.209914 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46"} err="failed to get container status \"097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46\": rpc error: code = NotFound desc = could not find container \"097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46\": container with ID starting with 097b431165a52aa94e07b28bb0accf6749414684b30ad916faeb201f5b6a9a46 not found: ID does not exist"
Dec 05 20:15:39 crc kubenswrapper[4904]: I1205 20:15:39.209931 4904 scope.go:117] "RemoveContainer" containerID="645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969"
Dec 05 20:15:39 crc kubenswrapper[4904]: E1205 20:15:39.210376 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\": container with ID starting with 645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969 not found: ID does not exist" containerID="645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969"
Dec 05 20:15:39 crc kubenswrapper[4904]: I1205 20:15:39.210412 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969"} err="failed to get container status \"645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\": rpc error: code = NotFound desc = could not find container \"645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969\": container with ID starting with 645c600236c5ea8c5a445ad6cb05f98aa74345c47ec3947fbe0e14fc96f98969 not found: ID does not exist"
Dec 05 20:15:39 crc kubenswrapper[4904]: I1205 20:15:39.690289 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes"
Dec 05 20:15:41 crc kubenswrapper[4904]: I1205 20:15:41.683237 4904 status_manager.go:851] "Failed to get status for pod" podUID="01612d5b-cc85-4376-a8a4-c1fbe2ccabd9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused"
Dec 05 20:15:41 crc kubenswrapper[4904]: I1205 20:15:41.684179 4904 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused"
Dec 05 20:15:42 crc kubenswrapper[4904]: E1205 20:15:42.167377 4904 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.166:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187e6afc7543c910 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-05 20:15:36.507492624 +0000 UTC m=+235.318708733,LastTimestamp:2025-12-05 20:15:36.507492624 +0000 UTC m=+235.318708733,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Dec 05 20:15:46 crc kubenswrapper[4904]: E1205 20:15:46.249296 4904 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.166:6443: connect: connection refused"
Dec 05 20:15:46 crc kubenswrapper[4904]: E1205 20:15:46.250294 4904 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.166:6443: connect: connection refused"
Dec 05 20:15:46 crc kubenswrapper[4904]: E1205 20:15:46.251184 4904 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.166:6443: connect: connection refused"
Dec 05 20:15:46 crc kubenswrapper[4904]: E1205 20:15:46.251488 4904 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.166:6443: connect: connection refused"
Dec 05 20:15:46 crc kubenswrapper[4904]: E1205 20:15:46.251756 4904 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.166:6443: connect: connection refused"
Dec 05 20:15:46 crc kubenswrapper[4904]: I1205 20:15:46.251787 4904 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Dec 05 20:15:46 crc kubenswrapper[4904]: E1205 20:15:46.252079 4904 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.166:6443: connect: connection refused" interval="200ms"
Dec 05 20:15:46 crc kubenswrapper[4904]: E1205 20:15:46.453736 4904 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.166:6443: connect: connection refused" interval="400ms"
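Note: the lease entries above show the node lease controller failing five Put attempts, falling back to "ensure lease", and then retrying with a doubling interval (interval="200ms", then "400ms", then "800ms" below). A minimal sketch of that capped doubling backoff; the starting interval and the doubling come from the log, while the cap and the attempt limit are illustrative assumptions, not the kubelet's exact constants:

```go
package main

import (
	"errors"
	"fmt"
	"time"
)

// ensureLease stands in for the GET/create of the node's Lease in
// kube-node-lease that fails above with "connection refused".
func ensureLease() error {
	return errors.New("dial tcp 38.102.83.166:6443: connect: connection refused")
}

func main() {
	interval := 200 * time.Millisecond // first retry interval, as logged
	maxInterval := 7 * time.Second     // assumed cap for the sketch
	for attempt := 1; attempt <= 4; attempt++ {
		if err := ensureLease(); err == nil {
			fmt.Println("lease ensured")
			return
		} else {
			fmt.Printf("Failed to ensure lease exists, will retry: %v interval=%q\n", err, interval.String())
		}
		time.Sleep(interval)
		// Double the interval after each failure, up to the cap.
		interval *= 2
		if interval > maxInterval {
			interval = maxInterval
		}
	}
	fmt.Println("still failing; the kubelet keeps retrying until the apiserver is reachable")
}
```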
Dec 05 20:15:46 crc kubenswrapper[4904]: I1205 20:15:46.680703 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 20:15:46 crc kubenswrapper[4904]: I1205 20:15:46.681605 4904 status_manager.go:851] "Failed to get status for pod" podUID="01612d5b-cc85-4376-a8a4-c1fbe2ccabd9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused"
Dec 05 20:15:46 crc kubenswrapper[4904]: I1205 20:15:46.682374 4904 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused"
Dec 05 20:15:46 crc kubenswrapper[4904]: I1205 20:15:46.695190 4904 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="519e4f29-5762-4a86-9f31-eae681598fd0"
Dec 05 20:15:46 crc kubenswrapper[4904]: I1205 20:15:46.695221 4904 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="519e4f29-5762-4a86-9f31-eae681598fd0"
Dec 05 20:15:46 crc kubenswrapper[4904]: E1205 20:15:46.695643 4904 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 20:15:46 crc kubenswrapper[4904]: I1205 20:15:46.696338 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 20:15:46 crc kubenswrapper[4904]: W1205 20:15:46.724486 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-e96bfffa0ca3a8086497a3e6c744930444be7bf1d4533fc4a6e517da05f369cc WatchSource:0}: Error finding container e96bfffa0ca3a8086497a3e6c744930444be7bf1d4533fc4a6e517da05f369cc: Status 404 returned error can't find the container with id e96bfffa0ca3a8086497a3e6c744930444be7bf1d4533fc4a6e517da05f369cc
Dec 05 20:15:46 crc kubenswrapper[4904]: E1205 20:15:46.854661 4904 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.166:6443: connect: connection refused" interval="800ms"
Dec 05 20:15:47 crc kubenswrapper[4904]: I1205 20:15:47.150083 4904 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="c5700bd9a311cfa8d6a00ef615b9c208041788739651f14efb82eb623c801f6f" exitCode=0
Dec 05 20:15:47 crc kubenswrapper[4904]: I1205 20:15:47.150163 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"c5700bd9a311cfa8d6a00ef615b9c208041788739651f14efb82eb623c801f6f"}
Dec 05 20:15:47 crc kubenswrapper[4904]: I1205 20:15:47.150457 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e96bfffa0ca3a8086497a3e6c744930444be7bf1d4533fc4a6e517da05f369cc"}
Dec 05 20:15:47 crc kubenswrapper[4904]: I1205 20:15:47.150759 4904 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="519e4f29-5762-4a86-9f31-eae681598fd0"
Dec 05 20:15:47 crc kubenswrapper[4904]: I1205 20:15:47.150776 4904 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="519e4f29-5762-4a86-9f31-eae681598fd0"
Dec 05 20:15:47 crc kubenswrapper[4904]: E1205 20:15:47.151252 4904 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 20:15:47 crc kubenswrapper[4904]: I1205 20:15:47.151322 4904 status_manager.go:851] "Failed to get status for pod" podUID="01612d5b-cc85-4376-a8a4-c1fbe2ccabd9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused"
Dec 05 20:15:47 crc kubenswrapper[4904]: I1205 20:15:47.151498 4904 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused"
Dec 05 20:15:48 crc kubenswrapper[4904]: I1205 20:15:48.162505 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5f1e5194c63f0670d8176f234c5bcebd8d6651adcbd1a19715e7de493b537a06"}
Dec 05 20:15:48 crc kubenswrapper[4904]: I1205 20:15:48.162791 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"23b5a31f4a50a7da9fbcc3f20c72569bf581709028f11a9599daaa2ce326c200"}
Dec 05 20:15:48 crc kubenswrapper[4904]: I1205 20:15:48.162801 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c05cad7691adddb362e51c653d765f690d314a8efb93e776db5061cde6366ac2"}
Dec 05 20:15:48 crc kubenswrapper[4904]: I1205 20:15:48.162811 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9299fafff045f4ca465eff793d8a292a536ea011fce0884e3be25d2bcc32c8f0"}
Dec 05 20:15:49 crc kubenswrapper[4904]: I1205 20:15:49.170379 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d6ea4d582ff1b8c9350e488c8a8f47c980c8db4ac9d9316f3259d11916c5e5d9"}
Dec 05 20:15:49 crc kubenswrapper[4904]: I1205 20:15:49.170761 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 20:15:49 crc kubenswrapper[4904]: I1205 20:15:49.170828 4904 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="519e4f29-5762-4a86-9f31-eae681598fd0"
Dec 05 20:15:49 crc kubenswrapper[4904]: I1205 20:15:49.170853 4904 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="519e4f29-5762-4a86-9f31-eae681598fd0"
Dec 05 20:15:49 crc kubenswrapper[4904]: I1205 20:15:49.173832 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Dec 05 20:15:49 crc kubenswrapper[4904]: I1205 20:15:49.173865 4904 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="08fb5cbfee6c3da72b94b14ed486036e65308444c961b583e691cc89e604a11a" exitCode=1
Dec 05 20:15:49 crc kubenswrapper[4904]: I1205 20:15:49.173885 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"08fb5cbfee6c3da72b94b14ed486036e65308444c961b583e691cc89e604a11a"}
Dec 05 20:15:49 crc kubenswrapper[4904]: I1205 20:15:49.174250 4904 scope.go:117] "RemoveContainer" containerID="08fb5cbfee6c3da72b94b14ed486036e65308444c961b583e691cc89e604a11a"
Dec 05 20:15:49 crc kubenswrapper[4904]: I1205 20:15:49.474691 4904 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 05 20:15:50 crc kubenswrapper[4904]: I1205 20:15:50.184872 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Dec 05 20:15:50 crc kubenswrapper[4904]: I1205 20:15:50.184934 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c5bde1e94a05f4bc776696c78cd1c1011b3ced1d5330e1be63db03a11d60bbf3"}
Dec 05 20:15:50 crc kubenswrapper[4904]: I1205 20:15:50.375145 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 05 20:15:50 crc kubenswrapper[4904]: I1205 20:15:50.375390 4904 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Dec 05 20:15:50 crc kubenswrapper[4904]: I1205 20:15:50.375447 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Dec 05 20:15:50 crc kubenswrapper[4904]: I1205 20:15:50.659956 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 05 20:15:51 crc kubenswrapper[4904]: I1205 20:15:51.696776 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 20:15:51 crc
kubenswrapper[4904]: I1205 20:15:51.697168 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:15:51 crc kubenswrapper[4904]: I1205 20:15:51.704356 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:15:54 crc kubenswrapper[4904]: I1205 20:15:54.184969 4904 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:15:54 crc kubenswrapper[4904]: I1205 20:15:54.208962 4904 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="519e4f29-5762-4a86-9f31-eae681598fd0" Dec 05 20:15:54 crc kubenswrapper[4904]: I1205 20:15:54.209010 4904 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="519e4f29-5762-4a86-9f31-eae681598fd0" Dec 05 20:15:54 crc kubenswrapper[4904]: I1205 20:15:54.216249 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:15:54 crc kubenswrapper[4904]: I1205 20:15:54.301113 4904 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="bd567287-36c2-4019-bf43-2336255ee12d" Dec 05 20:15:55 crc kubenswrapper[4904]: I1205 20:15:55.215699 4904 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="519e4f29-5762-4a86-9f31-eae681598fd0" Dec 05 20:15:55 crc kubenswrapper[4904]: I1205 20:15:55.216174 4904 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="519e4f29-5762-4a86-9f31-eae681598fd0" Dec 05 20:15:55 crc kubenswrapper[4904]: I1205 20:15:55.219557 4904 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="bd567287-36c2-4019-bf43-2336255ee12d" Dec 05 20:16:00 crc kubenswrapper[4904]: I1205 20:16:00.253228 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 05 20:16:00 crc kubenswrapper[4904]: I1205 20:16:00.298676 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 05 20:16:00 crc kubenswrapper[4904]: I1205 20:16:00.355149 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 05 20:16:00 crc kubenswrapper[4904]: I1205 20:16:00.375356 4904 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 05 20:16:00 crc kubenswrapper[4904]: I1205 20:16:00.375778 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 05 20:16:00 crc kubenswrapper[4904]: I1205 20:16:00.519093 4904 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 05 20:16:00 crc kubenswrapper[4904]: I1205 20:16:00.536557 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 05 20:16:00 crc kubenswrapper[4904]: I1205 20:16:00.567936 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 05 20:16:00 crc kubenswrapper[4904]: I1205 20:16:00.623320 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 05 20:16:00 crc kubenswrapper[4904]: I1205 20:16:00.732286 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 05 20:16:00 crc kubenswrapper[4904]: I1205 20:16:00.738132 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 05 20:16:00 crc kubenswrapper[4904]: I1205 20:16:00.756924 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 05 20:16:00 crc kubenswrapper[4904]: I1205 20:16:00.826009 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 05 20:16:00 crc kubenswrapper[4904]: I1205 20:16:00.918287 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 05 20:16:00 crc kubenswrapper[4904]: I1205 20:16:00.940789 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 05 20:16:01 crc kubenswrapper[4904]: I1205 20:16:01.020260 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 05 20:16:01 crc kubenswrapper[4904]: I1205 20:16:01.056490 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 05 20:16:01 crc kubenswrapper[4904]: I1205 20:16:01.130898 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 05 20:16:01 crc kubenswrapper[4904]: I1205 20:16:01.233359 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 05 20:16:01 crc kubenswrapper[4904]: I1205 20:16:01.347443 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 05 20:16:01 crc kubenswrapper[4904]: I1205 20:16:01.437770 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 05 20:16:01 crc kubenswrapper[4904]: I1205 20:16:01.590136 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 05 20:16:01 crc kubenswrapper[4904]: I1205 20:16:01.901234 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 05 20:16:02 crc kubenswrapper[4904]: I1205 20:16:02.091714 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 05 20:16:02 crc kubenswrapper[4904]: I1205 20:16:02.153974 4904 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 05 20:16:02 crc kubenswrapper[4904]: I1205 20:16:02.192188 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 05 20:16:02 crc kubenswrapper[4904]: I1205 20:16:02.227777 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 05 20:16:02 crc kubenswrapper[4904]: I1205 20:16:02.330853 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 05 20:16:02 crc kubenswrapper[4904]: I1205 20:16:02.461620 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 05 20:16:02 crc kubenswrapper[4904]: I1205 20:16:02.579902 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 05 20:16:02 crc kubenswrapper[4904]: I1205 20:16:02.636127 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 05 20:16:02 crc kubenswrapper[4904]: I1205 20:16:02.655135 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 05 20:16:02 crc kubenswrapper[4904]: I1205 20:16:02.702093 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 05 20:16:02 crc kubenswrapper[4904]: I1205 20:16:02.713222 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 05 20:16:02 crc kubenswrapper[4904]: I1205 20:16:02.773706 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 05 20:16:02 crc kubenswrapper[4904]: I1205 20:16:02.785397 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 05 20:16:02 crc kubenswrapper[4904]: I1205 20:16:02.816842 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 05 20:16:02 crc kubenswrapper[4904]: I1205 20:16:02.898535 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 05 20:16:02 crc kubenswrapper[4904]: I1205 20:16:02.960026 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 05 20:16:03 crc kubenswrapper[4904]: I1205 20:16:03.047256 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 05 20:16:03 crc kubenswrapper[4904]: I1205 20:16:03.325375 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 05 20:16:03 crc kubenswrapper[4904]: I1205 20:16:03.409701 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 05 20:16:03 crc kubenswrapper[4904]: I1205 20:16:03.418202 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 05 20:16:03 crc kubenswrapper[4904]: I1205 20:16:03.612941 4904 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 05 20:16:03 crc kubenswrapper[4904]: I1205 20:16:03.789986 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 05 20:16:03 crc kubenswrapper[4904]: I1205 20:16:03.893311 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 05 20:16:04 crc kubenswrapper[4904]: I1205 20:16:04.077769 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 05 20:16:04 crc kubenswrapper[4904]: I1205 20:16:04.115214 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 05 20:16:04 crc kubenswrapper[4904]: I1205 20:16:04.471770 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 05 20:16:04 crc kubenswrapper[4904]: I1205 20:16:04.494121 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 05 20:16:04 crc kubenswrapper[4904]: I1205 20:16:04.627002 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 05 20:16:04 crc kubenswrapper[4904]: I1205 20:16:04.639844 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 05 20:16:04 crc kubenswrapper[4904]: I1205 20:16:04.912868 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 05 20:16:05 crc kubenswrapper[4904]: I1205 20:16:05.138945 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 05 20:16:05 crc kubenswrapper[4904]: I1205 20:16:05.441416 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 05 20:16:05 crc kubenswrapper[4904]: I1205 20:16:05.527364 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 05 20:16:05 crc kubenswrapper[4904]: I1205 20:16:05.535229 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 05 20:16:05 crc kubenswrapper[4904]: I1205 20:16:05.551529 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 05 20:16:05 crc kubenswrapper[4904]: I1205 20:16:05.634696 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 05 20:16:05 crc kubenswrapper[4904]: I1205 20:16:05.784902 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 05 20:16:05 crc kubenswrapper[4904]: I1205 20:16:05.843022 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 05 20:16:05 crc kubenswrapper[4904]: I1205 20:16:05.947737 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 05 20:16:06 crc kubenswrapper[4904]: I1205 20:16:06.123764 4904 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca-operator"/"serving-cert" Dec 05 20:16:06 crc kubenswrapper[4904]: I1205 20:16:06.181554 4904 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 05 20:16:06 crc kubenswrapper[4904]: I1205 20:16:06.300949 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 05 20:16:06 crc kubenswrapper[4904]: I1205 20:16:06.342737 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 05 20:16:06 crc kubenswrapper[4904]: I1205 20:16:06.421957 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 05 20:16:06 crc kubenswrapper[4904]: I1205 20:16:06.441149 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 05 20:16:06 crc kubenswrapper[4904]: I1205 20:16:06.540540 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 05 20:16:07 crc kubenswrapper[4904]: I1205 20:16:07.712986 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 05 20:16:07 crc kubenswrapper[4904]: I1205 20:16:07.799013 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 05 20:16:07 crc kubenswrapper[4904]: I1205 20:16:07.898270 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 05 20:16:08 crc kubenswrapper[4904]: I1205 20:16:08.025891 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 05 20:16:08 crc kubenswrapper[4904]: I1205 20:16:08.470811 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 05 20:16:08 crc kubenswrapper[4904]: I1205 20:16:08.536170 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 05 20:16:08 crc kubenswrapper[4904]: I1205 20:16:08.536919 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 05 20:16:08 crc kubenswrapper[4904]: I1205 20:16:08.605241 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 05 20:16:08 crc kubenswrapper[4904]: I1205 20:16:08.634118 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 05 20:16:09 crc kubenswrapper[4904]: I1205 20:16:09.219082 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 05 20:16:09 crc kubenswrapper[4904]: I1205 20:16:09.270910 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 05 20:16:09 crc kubenswrapper[4904]: I1205 20:16:09.490559 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 05 20:16:09 crc kubenswrapper[4904]: I1205 20:16:09.493717 4904 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 05 20:16:09 crc kubenswrapper[4904]: I1205 20:16:09.684515 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 05 20:16:10 crc kubenswrapper[4904]: I1205 20:16:10.026166 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 05 20:16:10 crc kubenswrapper[4904]: I1205 20:16:10.056046 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 05 20:16:10 crc kubenswrapper[4904]: I1205 20:16:10.112142 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 05 20:16:10 crc kubenswrapper[4904]: I1205 20:16:10.208246 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 05 20:16:10 crc kubenswrapper[4904]: I1205 20:16:10.370730 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 05 20:16:10 crc kubenswrapper[4904]: I1205 20:16:10.374927 4904 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 05 20:16:10 crc kubenswrapper[4904]: I1205 20:16:10.374985 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 05 20:16:10 crc kubenswrapper[4904]: I1205 20:16:10.375029 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:16:10 crc kubenswrapper[4904]: I1205 20:16:10.375707 4904 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"c5bde1e94a05f4bc776696c78cd1c1011b3ced1d5330e1be63db03a11d60bbf3"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Dec 05 20:16:10 crc kubenswrapper[4904]: I1205 20:16:10.375810 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://c5bde1e94a05f4bc776696c78cd1c1011b3ced1d5330e1be63db03a11d60bbf3" gracePeriod=30 Dec 05 20:16:10 crc kubenswrapper[4904]: I1205 20:16:10.565736 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 05 20:16:10 crc kubenswrapper[4904]: I1205 20:16:10.599621 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 05 20:16:10 crc kubenswrapper[4904]: I1205 20:16:10.629702 4904 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 05 20:16:10 crc kubenswrapper[4904]: I1205 20:16:10.714903 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 05 20:16:10 crc kubenswrapper[4904]: I1205 20:16:10.925620 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 05 20:16:11 crc kubenswrapper[4904]: I1205 20:16:11.049912 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 05 20:16:11 crc kubenswrapper[4904]: I1205 20:16:11.062212 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 05 20:16:11 crc kubenswrapper[4904]: I1205 20:16:11.083005 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 05 20:16:11 crc kubenswrapper[4904]: I1205 20:16:11.194678 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 05 20:16:11 crc kubenswrapper[4904]: I1205 20:16:11.340511 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 05 20:16:11 crc kubenswrapper[4904]: I1205 20:16:11.421748 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 05 20:16:11 crc kubenswrapper[4904]: I1205 20:16:11.547164 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 05 20:16:11 crc kubenswrapper[4904]: I1205 20:16:11.692801 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 05 20:16:11 crc kubenswrapper[4904]: I1205 20:16:11.804874 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 05 20:16:11 crc kubenswrapper[4904]: I1205 20:16:11.842252 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 05 20:16:11 crc kubenswrapper[4904]: I1205 20:16:11.847493 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 05 20:16:11 crc kubenswrapper[4904]: I1205 20:16:11.912178 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 05 20:16:12 crc kubenswrapper[4904]: I1205 20:16:12.008658 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 05 20:16:12 crc kubenswrapper[4904]: I1205 20:16:12.008896 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 05 20:16:12 crc kubenswrapper[4904]: I1205 20:16:12.057745 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 05 20:16:12 crc kubenswrapper[4904]: I1205 20:16:12.124655 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 05 20:16:12 crc kubenswrapper[4904]: I1205 20:16:12.152951 4904 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 05 20:16:12 crc kubenswrapper[4904]: I1205 20:16:12.277445 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 05 20:16:12 crc kubenswrapper[4904]: I1205 20:16:12.311322 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 05 20:16:12 crc kubenswrapper[4904]: I1205 20:16:12.525149 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 05 20:16:12 crc kubenswrapper[4904]: I1205 20:16:12.571860 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 05 20:16:12 crc kubenswrapper[4904]: I1205 20:16:12.583093 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 05 20:16:12 crc kubenswrapper[4904]: I1205 20:16:12.667689 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 05 20:16:12 crc kubenswrapper[4904]: I1205 20:16:12.682126 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 05 20:16:12 crc kubenswrapper[4904]: I1205 20:16:12.780564 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 05 20:16:12 crc kubenswrapper[4904]: I1205 20:16:12.838089 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 05 20:16:12 crc kubenswrapper[4904]: I1205 20:16:12.905389 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 05 20:16:12 crc kubenswrapper[4904]: I1205 20:16:12.926237 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 05 20:16:13 crc kubenswrapper[4904]: I1205 20:16:13.057338 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 05 20:16:13 crc kubenswrapper[4904]: I1205 20:16:13.093300 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 05 20:16:13 crc kubenswrapper[4904]: I1205 20:16:13.292359 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 05 20:16:13 crc kubenswrapper[4904]: I1205 20:16:13.405196 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 05 20:16:13 crc kubenswrapper[4904]: I1205 20:16:13.453489 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 05 20:16:13 crc kubenswrapper[4904]: I1205 20:16:13.599250 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 05 20:16:13 crc kubenswrapper[4904]: I1205 20:16:13.629586 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 05 20:16:13 crc kubenswrapper[4904]: I1205 20:16:13.636683 4904 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 05 20:16:13 crc kubenswrapper[4904]: I1205 20:16:13.671997 4904 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 05 20:16:13 crc kubenswrapper[4904]: I1205 20:16:13.798235 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 05 20:16:13 crc kubenswrapper[4904]: I1205 20:16:13.841909 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 05 20:16:13 crc kubenswrapper[4904]: I1205 20:16:13.886360 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 05 20:16:14 crc kubenswrapper[4904]: I1205 20:16:14.016094 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 05 20:16:14 crc kubenswrapper[4904]: I1205 20:16:14.194232 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 05 20:16:14 crc kubenswrapper[4904]: I1205 20:16:14.345701 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 05 20:16:14 crc kubenswrapper[4904]: I1205 20:16:14.394837 4904 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 05 20:16:14 crc kubenswrapper[4904]: I1205 20:16:14.423005 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 05 20:16:14 crc kubenswrapper[4904]: I1205 20:16:14.475460 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 05 20:16:14 crc kubenswrapper[4904]: I1205 20:16:14.606934 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 05 20:16:14 crc kubenswrapper[4904]: I1205 20:16:14.616155 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 05 20:16:14 crc kubenswrapper[4904]: I1205 20:16:14.664134 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 05 20:16:14 crc kubenswrapper[4904]: I1205 20:16:14.674195 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 05 20:16:14 crc kubenswrapper[4904]: I1205 20:16:14.742691 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 05 20:16:14 crc kubenswrapper[4904]: I1205 20:16:14.823742 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 05 20:16:14 crc kubenswrapper[4904]: I1205 20:16:14.833733 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 05 20:16:14 crc kubenswrapper[4904]: I1205 20:16:14.922108 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 05 20:16:14 crc kubenswrapper[4904]: I1205 20:16:14.935227 4904 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"config" Dec 05 20:16:14 crc kubenswrapper[4904]: I1205 20:16:14.942820 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 05 20:16:14 crc kubenswrapper[4904]: I1205 20:16:14.999801 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 05 20:16:15 crc kubenswrapper[4904]: I1205 20:16:15.075438 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 05 20:16:15 crc kubenswrapper[4904]: I1205 20:16:15.313559 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 05 20:16:15 crc kubenswrapper[4904]: I1205 20:16:15.390813 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 05 20:16:15 crc kubenswrapper[4904]: I1205 20:16:15.430931 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 05 20:16:15 crc kubenswrapper[4904]: I1205 20:16:15.469866 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 05 20:16:15 crc kubenswrapper[4904]: I1205 20:16:15.551072 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 05 20:16:15 crc kubenswrapper[4904]: I1205 20:16:15.586605 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 05 20:16:15 crc kubenswrapper[4904]: I1205 20:16:15.606039 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 05 20:16:15 crc kubenswrapper[4904]: I1205 20:16:15.615902 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 05 20:16:15 crc kubenswrapper[4904]: I1205 20:16:15.616010 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 05 20:16:15 crc kubenswrapper[4904]: I1205 20:16:15.669821 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 05 20:16:15 crc kubenswrapper[4904]: I1205 20:16:15.698452 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 05 20:16:15 crc kubenswrapper[4904]: I1205 20:16:15.721988 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 05 20:16:15 crc kubenswrapper[4904]: I1205 20:16:15.892275 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 05 20:16:15 crc kubenswrapper[4904]: I1205 20:16:15.941927 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 05 20:16:15 crc kubenswrapper[4904]: I1205 20:16:15.942256 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 05 20:16:16 crc kubenswrapper[4904]: I1205 20:16:16.019365 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" 
Dec 05 20:16:16 crc kubenswrapper[4904]: I1205 20:16:16.151172 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 05 20:16:16 crc kubenswrapper[4904]: I1205 20:16:16.189380 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 05 20:16:16 crc kubenswrapper[4904]: I1205 20:16:16.237946 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 05 20:16:16 crc kubenswrapper[4904]: I1205 20:16:16.237946 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 05 20:16:16 crc kubenswrapper[4904]: I1205 20:16:16.346088 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 05 20:16:16 crc kubenswrapper[4904]: I1205 20:16:16.430254 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 05 20:16:16 crc kubenswrapper[4904]: I1205 20:16:16.455099 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 05 20:16:16 crc kubenswrapper[4904]: I1205 20:16:16.587822 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 05 20:16:16 crc kubenswrapper[4904]: I1205 20:16:16.734672 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 05 20:16:16 crc kubenswrapper[4904]: I1205 20:16:16.860629 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 05 20:16:16 crc kubenswrapper[4904]: I1205 20:16:16.894568 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 05 20:16:16 crc kubenswrapper[4904]: I1205 20:16:16.906870 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 05 20:16:16 crc kubenswrapper[4904]: I1205 20:16:16.952132 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 05 20:16:16 crc kubenswrapper[4904]: I1205 20:16:16.968647 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 05 20:16:16 crc kubenswrapper[4904]: I1205 20:16:16.998972 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 05 20:16:17 crc kubenswrapper[4904]: I1205 20:16:17.038671 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 05 20:16:17 crc kubenswrapper[4904]: I1205 20:16:17.520516 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 05 20:16:17 crc kubenswrapper[4904]: I1205 20:16:17.521533 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 05 20:16:17 crc kubenswrapper[4904]: I1205 20:16:17.743256 4904 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 05 20:16:17 crc kubenswrapper[4904]: I1205 20:16:17.818887 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 05 20:16:17 crc kubenswrapper[4904]: I1205 20:16:17.831134 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 05 20:16:17 crc kubenswrapper[4904]: I1205 20:16:17.994140 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 05 20:16:18 crc kubenswrapper[4904]: I1205 20:16:18.099462 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 05 20:16:18 crc kubenswrapper[4904]: I1205 20:16:18.129981 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 05 20:16:18 crc kubenswrapper[4904]: I1205 20:16:18.153905 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 05 20:16:18 crc kubenswrapper[4904]: I1205 20:16:18.169948 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 05 20:16:18 crc kubenswrapper[4904]: I1205 20:16:18.176029 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 05 20:16:18 crc kubenswrapper[4904]: I1205 20:16:18.240177 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 05 20:16:18 crc kubenswrapper[4904]: I1205 20:16:18.371082 4904 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 05 20:16:18 crc kubenswrapper[4904]: I1205 20:16:18.476748 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 05 20:16:18 crc kubenswrapper[4904]: I1205 20:16:18.519160 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 05 20:16:18 crc kubenswrapper[4904]: I1205 20:16:18.588700 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 05 20:16:18 crc kubenswrapper[4904]: I1205 20:16:18.640233 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 05 20:16:18 crc kubenswrapper[4904]: I1205 20:16:18.715891 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 05 20:16:18 crc kubenswrapper[4904]: I1205 20:16:18.736460 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 05 20:16:18 crc kubenswrapper[4904]: I1205 20:16:18.778790 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 05 20:16:18 crc kubenswrapper[4904]: I1205 20:16:18.788030 4904 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-config-operator"/"kube-root-ca.crt" Dec 05 20:16:18 crc kubenswrapper[4904]: I1205 20:16:18.789414 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 05 20:16:18 crc kubenswrapper[4904]: I1205 20:16:18.942240 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 05 20:16:18 crc kubenswrapper[4904]: I1205 20:16:18.951975 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 05 20:16:19 crc kubenswrapper[4904]: I1205 20:16:19.134946 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 05 20:16:19 crc kubenswrapper[4904]: I1205 20:16:19.155859 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 05 20:16:19 crc kubenswrapper[4904]: I1205 20:16:19.289999 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 05 20:16:19 crc kubenswrapper[4904]: I1205 20:16:19.338700 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 05 20:16:19 crc kubenswrapper[4904]: I1205 20:16:19.473477 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 05 20:16:19 crc kubenswrapper[4904]: I1205 20:16:19.520738 4904 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 05 20:16:19 crc kubenswrapper[4904]: I1205 20:16:19.550466 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 05 20:16:19 crc kubenswrapper[4904]: I1205 20:16:19.606890 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 05 20:16:19 crc kubenswrapper[4904]: I1205 20:16:19.714228 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 05 20:16:19 crc kubenswrapper[4904]: I1205 20:16:19.901892 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 05 20:16:20 crc kubenswrapper[4904]: I1205 20:16:20.013026 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 05 20:16:20 crc kubenswrapper[4904]: I1205 20:16:20.071307 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 05 20:16:20 crc kubenswrapper[4904]: I1205 20:16:20.330011 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 05 20:16:20 crc kubenswrapper[4904]: I1205 20:16:20.485671 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 05 20:16:20 crc kubenswrapper[4904]: I1205 20:16:20.495356 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 05 20:16:20 crc kubenswrapper[4904]: I1205 20:16:20.558210 4904 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 05 20:16:20 crc kubenswrapper[4904]: I1205 20:16:20.843608 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 05 20:16:20 crc kubenswrapper[4904]: I1205 20:16:20.856533 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 05 20:16:20 crc kubenswrapper[4904]: I1205 20:16:20.858163 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 05 20:16:20 crc kubenswrapper[4904]: I1205 20:16:20.961343 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 05 20:16:21 crc kubenswrapper[4904]: I1205 20:16:21.061698 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 05 20:16:21 crc kubenswrapper[4904]: I1205 20:16:21.518722 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 05 20:16:21 crc kubenswrapper[4904]: I1205 20:16:21.631196 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 05 20:16:22 crc kubenswrapper[4904]: I1205 20:16:22.331774 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 05 20:16:22 crc kubenswrapper[4904]: I1205 20:16:22.632534 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 05 20:16:22 crc kubenswrapper[4904]: I1205 20:16:22.665131 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 05 20:16:22 crc kubenswrapper[4904]: I1205 20:16:22.897170 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.122319 4904 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.128509 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=47.128484796 podStartE2EDuration="47.128484796s" podCreationTimestamp="2025-12-05 20:15:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:15:54.226707862 +0000 UTC m=+253.037923991" watchObservedRunningTime="2025-12-05 20:16:23.128484796 +0000 UTC m=+281.939700945" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.129809 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.129872 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-9d745f8b5-w7d9g"] Dec 05 20:16:23 crc kubenswrapper[4904]: E1205 20:16:23.130673 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01612d5b-cc85-4376-a8a4-c1fbe2ccabd9" containerName="installer" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.130720 4904 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="01612d5b-cc85-4376-a8a4-c1fbe2ccabd9" containerName="installer" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.130868 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="01612d5b-cc85-4376-a8a4-c1fbe2ccabd9" containerName="installer" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.131117 4904 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="519e4f29-5762-4a86-9f31-eae681598fd0" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.131153 4904 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="519e4f29-5762-4a86-9f31-eae681598fd0" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.132212 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.135137 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.137090 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.139533 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.140440 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.140582 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.140601 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.140877 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.141291 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.141464 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.141673 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.141991 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.144939 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.145273 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.151487 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 05 20:16:23 crc 
kubenswrapper[4904]: I1205 20:16:23.155135 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.157073 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.191414 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=29.191393525 podStartE2EDuration="29.191393525s" podCreationTimestamp="2025-12-05 20:15:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:16:23.169443572 +0000 UTC m=+281.980659701" watchObservedRunningTime="2025-12-05 20:16:23.191393525 +0000 UTC m=+282.002609634" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.209706 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9-v4-0-config-system-router-certs\") pod \"oauth-openshift-9d745f8b5-w7d9g\" (UID: \"4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.209761 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-9d745f8b5-w7d9g\" (UID: \"4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.209797 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9-audit-policies\") pod \"oauth-openshift-9d745f8b5-w7d9g\" (UID: \"4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.209849 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkkdz\" (UniqueName: \"kubernetes.io/projected/4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9-kube-api-access-kkkdz\") pod \"oauth-openshift-9d745f8b5-w7d9g\" (UID: \"4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.209897 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9-v4-0-config-system-service-ca\") pod \"oauth-openshift-9d745f8b5-w7d9g\" (UID: \"4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.209920 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9-v4-0-config-system-trusted-ca-bundle\") pod 
\"oauth-openshift-9d745f8b5-w7d9g\" (UID: \"4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.209945 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-9d745f8b5-w7d9g\" (UID: \"4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.209974 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9-audit-dir\") pod \"oauth-openshift-9d745f8b5-w7d9g\" (UID: \"4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.210093 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-9d745f8b5-w7d9g\" (UID: \"4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.210157 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9-v4-0-config-user-template-error\") pod \"oauth-openshift-9d745f8b5-w7d9g\" (UID: \"4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.210192 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9-v4-0-config-user-template-login\") pod \"oauth-openshift-9d745f8b5-w7d9g\" (UID: \"4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.210225 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9-v4-0-config-system-session\") pod \"oauth-openshift-9d745f8b5-w7d9g\" (UID: \"4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.210249 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-9d745f8b5-w7d9g\" (UID: \"4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.210265 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-9d745f8b5-w7d9g\" (UID: \"4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.311812 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9-audit-dir\") pod \"oauth-openshift-9d745f8b5-w7d9g\" (UID: \"4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.311908 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-9d745f8b5-w7d9g\" (UID: \"4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.311955 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9-v4-0-config-user-template-error\") pod \"oauth-openshift-9d745f8b5-w7d9g\" (UID: \"4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.311991 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9-v4-0-config-user-template-login\") pod \"oauth-openshift-9d745f8b5-w7d9g\" (UID: \"4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.311990 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9-audit-dir\") pod \"oauth-openshift-9d745f8b5-w7d9g\" (UID: \"4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.312032 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9-v4-0-config-system-session\") pod \"oauth-openshift-9d745f8b5-w7d9g\" (UID: \"4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.312107 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-9d745f8b5-w7d9g\" (UID: \"4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.312148 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9-v4-0-config-system-cliconfig\") pod 
\"oauth-openshift-9d745f8b5-w7d9g\" (UID: \"4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.312358 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9-v4-0-config-system-router-certs\") pod \"oauth-openshift-9d745f8b5-w7d9g\" (UID: \"4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.312417 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-9d745f8b5-w7d9g\" (UID: \"4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.312469 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9-audit-policies\") pod \"oauth-openshift-9d745f8b5-w7d9g\" (UID: \"4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.312515 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkkdz\" (UniqueName: \"kubernetes.io/projected/4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9-kube-api-access-kkkdz\") pod \"oauth-openshift-9d745f8b5-w7d9g\" (UID: \"4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.312565 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9-v4-0-config-system-service-ca\") pod \"oauth-openshift-9d745f8b5-w7d9g\" (UID: \"4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.312604 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-9d745f8b5-w7d9g\" (UID: \"4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.312649 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-9d745f8b5-w7d9g\" (UID: \"4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.313793 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9-audit-policies\") pod \"oauth-openshift-9d745f8b5-w7d9g\" (UID: 
\"4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.314728 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-9d745f8b5-w7d9g\" (UID: \"4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.315534 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-9d745f8b5-w7d9g\" (UID: \"4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.316377 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9-v4-0-config-system-service-ca\") pod \"oauth-openshift-9d745f8b5-w7d9g\" (UID: \"4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.319243 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9-v4-0-config-system-router-certs\") pod \"oauth-openshift-9d745f8b5-w7d9g\" (UID: \"4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.320046 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9-v4-0-config-user-template-error\") pod \"oauth-openshift-9d745f8b5-w7d9g\" (UID: \"4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.320318 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-9d745f8b5-w7d9g\" (UID: \"4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.320565 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-9d745f8b5-w7d9g\" (UID: \"4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.320923 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-9d745f8b5-w7d9g\" (UID: \"4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9\") " 
pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.321413 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9-v4-0-config-system-session\") pod \"oauth-openshift-9d745f8b5-w7d9g\" (UID: \"4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.321417 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-9d745f8b5-w7d9g\" (UID: \"4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.323479 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9-v4-0-config-user-template-login\") pod \"oauth-openshift-9d745f8b5-w7d9g\" (UID: \"4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.332680 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkkdz\" (UniqueName: \"kubernetes.io/projected/4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9-kube-api-access-kkkdz\") pod \"oauth-openshift-9d745f8b5-w7d9g\" (UID: \"4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.459634 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.831197 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.859157 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 05 20:16:23 crc kubenswrapper[4904]: I1205 20:16:23.872128 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-9d745f8b5-w7d9g"] Dec 05 20:16:24 crc kubenswrapper[4904]: I1205 20:16:24.117673 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 05 20:16:24 crc kubenswrapper[4904]: I1205 20:16:24.331713 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 05 20:16:24 crc kubenswrapper[4904]: I1205 20:16:24.396551 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" event={"ID":"4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9","Type":"ContainerStarted","Data":"4e10ebc61f9d91784c35e6bc9bf8c6b0215237d436ad1783958ec41183cf9fc3"} Dec 05 20:16:24 crc kubenswrapper[4904]: I1205 20:16:24.396619 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" event={"ID":"4706ef23-a6b5-4558-9e38-0bcf3bc0b0b9","Type":"ContainerStarted","Data":"1b9d0b59e90bdfc05dca5377c2a1a97ae6fddda75c414a0af288f82c4303a57b"} Dec 05 20:16:24 crc kubenswrapper[4904]: I1205 20:16:24.396962 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" Dec 05 20:16:24 crc kubenswrapper[4904]: I1205 20:16:24.429313 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" podStartSLOduration=85.429285029 podStartE2EDuration="1m25.429285029s" podCreationTimestamp="2025-12-05 20:14:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:16:24.424890828 +0000 UTC m=+283.236107017" watchObservedRunningTime="2025-12-05 20:16:24.429285029 +0000 UTC m=+283.240501168" Dec 05 20:16:24 crc kubenswrapper[4904]: I1205 20:16:24.539477 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-9d745f8b5-w7d9g" Dec 05 20:16:28 crc kubenswrapper[4904]: I1205 20:16:28.005471 4904 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 05 20:16:28 crc kubenswrapper[4904]: I1205 20:16:28.006039 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://f9a9ab07f8c32a7927a4def344ba7542179305cde70f06a0316d4edd3f91620b" gracePeriod=5 Dec 05 20:16:33 crc kubenswrapper[4904]: I1205 20:16:33.454968 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 05 20:16:33 crc kubenswrapper[4904]: I1205 
20:16:33.455242 4904 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="f9a9ab07f8c32a7927a4def344ba7542179305cde70f06a0316d4edd3f91620b" exitCode=137 Dec 05 20:16:33 crc kubenswrapper[4904]: I1205 20:16:33.573278 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 05 20:16:33 crc kubenswrapper[4904]: I1205 20:16:33.573647 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:16:33 crc kubenswrapper[4904]: I1205 20:16:33.646043 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 20:16:33 crc kubenswrapper[4904]: I1205 20:16:33.646123 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 20:16:33 crc kubenswrapper[4904]: I1205 20:16:33.646192 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 20:16:33 crc kubenswrapper[4904]: I1205 20:16:33.646213 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 20:16:33 crc kubenswrapper[4904]: I1205 20:16:33.646265 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:16:33 crc kubenswrapper[4904]: I1205 20:16:33.646335 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 20:16:33 crc kubenswrapper[4904]: I1205 20:16:33.646339 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:16:33 crc kubenswrapper[4904]: I1205 20:16:33.646340 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:16:33 crc kubenswrapper[4904]: I1205 20:16:33.646374 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:16:33 crc kubenswrapper[4904]: I1205 20:16:33.646734 4904 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 05 20:16:33 crc kubenswrapper[4904]: I1205 20:16:33.646759 4904 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 05 20:16:33 crc kubenswrapper[4904]: I1205 20:16:33.646768 4904 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 05 20:16:33 crc kubenswrapper[4904]: I1205 20:16:33.646776 4904 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 05 20:16:33 crc kubenswrapper[4904]: I1205 20:16:33.656401 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:16:33 crc kubenswrapper[4904]: I1205 20:16:33.689840 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 05 20:16:33 crc kubenswrapper[4904]: I1205 20:16:33.690190 4904 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Dec 05 20:16:33 crc kubenswrapper[4904]: I1205 20:16:33.699796 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 05 20:16:33 crc kubenswrapper[4904]: I1205 20:16:33.699825 4904 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="1c160352-df96-48d9-b3d4-ab07fa464e09" Dec 05 20:16:33 crc kubenswrapper[4904]: I1205 20:16:33.712298 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 05 20:16:33 crc kubenswrapper[4904]: I1205 20:16:33.712344 4904 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="1c160352-df96-48d9-b3d4-ab07fa464e09" Dec 05 20:16:33 crc kubenswrapper[4904]: I1205 20:16:33.749323 4904 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 05 20:16:34 crc kubenswrapper[4904]: I1205 20:16:34.462800 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 05 20:16:34 crc kubenswrapper[4904]: I1205 20:16:34.462866 4904 scope.go:117] "RemoveContainer" containerID="f9a9ab07f8c32a7927a4def344ba7542179305cde70f06a0316d4edd3f91620b" Dec 05 20:16:34 crc kubenswrapper[4904]: I1205 20:16:34.462952 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:16:37 crc kubenswrapper[4904]: I1205 20:16:37.485371 4904 generic.go:334] "Generic (PLEG): container finished" podID="6aa0efa0-bd09-4388-b42c-11550e28712e" containerID="8a28c10c2ce3fae8fdf507eda181000a380eafffe5debb71086e40ddaeedfb0b" exitCode=0 Dec 05 20:16:37 crc kubenswrapper[4904]: I1205 20:16:37.485451 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5srbq" event={"ID":"6aa0efa0-bd09-4388-b42c-11550e28712e","Type":"ContainerDied","Data":"8a28c10c2ce3fae8fdf507eda181000a380eafffe5debb71086e40ddaeedfb0b"} Dec 05 20:16:37 crc kubenswrapper[4904]: I1205 20:16:37.486560 4904 scope.go:117] "RemoveContainer" containerID="8a28c10c2ce3fae8fdf507eda181000a380eafffe5debb71086e40ddaeedfb0b" Dec 05 20:16:38 crc kubenswrapper[4904]: I1205 20:16:38.494662 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5srbq" event={"ID":"6aa0efa0-bd09-4388-b42c-11550e28712e","Type":"ContainerStarted","Data":"1c74bddebe6047618f860c473887d26acb21c3f134d3f98bb84140dd9169acde"} Dec 05 20:16:38 crc kubenswrapper[4904]: I1205 20:16:38.495816 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-5srbq" Dec 05 20:16:38 crc kubenswrapper[4904]: I1205 20:16:38.497433 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-5srbq" Dec 05 20:16:40 crc kubenswrapper[4904]: I1205 20:16:40.509250 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 05 20:16:40 crc kubenswrapper[4904]: I1205 20:16:40.512846 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 05 20:16:40 crc kubenswrapper[4904]: I1205 20:16:40.512908 4904 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="c5bde1e94a05f4bc776696c78cd1c1011b3ced1d5330e1be63db03a11d60bbf3" exitCode=137 Dec 05 20:16:40 crc kubenswrapper[4904]: I1205 20:16:40.512984 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"c5bde1e94a05f4bc776696c78cd1c1011b3ced1d5330e1be63db03a11d60bbf3"} Dec 05 20:16:40 crc kubenswrapper[4904]: I1205 20:16:40.513031 4904 scope.go:117] "RemoveContainer" containerID="08fb5cbfee6c3da72b94b14ed486036e65308444c961b583e691cc89e604a11a" Dec 05 20:16:41 crc kubenswrapper[4904]: I1205 20:16:41.519124 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 05 20:16:41 crc kubenswrapper[4904]: I1205 20:16:41.521333 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"74eb14f0ba18245cc34a822c8accdc1ef8713932598bf991fd5ca26664709ebb"} Dec 05 20:16:50 crc kubenswrapper[4904]: I1205 20:16:50.374766 4904 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:16:50 crc kubenswrapper[4904]: I1205 20:16:50.381592 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:16:50 crc kubenswrapper[4904]: I1205 20:16:50.569946 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:16:50 crc kubenswrapper[4904]: I1205 20:16:50.574215 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:17:10 crc kubenswrapper[4904]: I1205 20:17:10.119007 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-k9sjb"] Dec 05 20:17:10 crc kubenswrapper[4904]: E1205 20:17:10.119893 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 05 20:17:10 crc kubenswrapper[4904]: I1205 20:17:10.119905 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 05 20:17:10 crc kubenswrapper[4904]: I1205 20:17:10.119997 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 05 20:17:10 crc kubenswrapper[4904]: I1205 20:17:10.120387 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-k9sjb" Dec 05 20:17:10 crc kubenswrapper[4904]: I1205 20:17:10.131889 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-k9sjb"] Dec 05 20:17:10 crc kubenswrapper[4904]: I1205 20:17:10.217154 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2ba29d6-90e6-42af-a396-96a68a2d1b31-registry-tls\") pod \"image-registry-66df7c8f76-k9sjb\" (UID: \"e2ba29d6-90e6-42af-a396-96a68a2d1b31\") " pod="openshift-image-registry/image-registry-66df7c8f76-k9sjb" Dec 05 20:17:10 crc kubenswrapper[4904]: I1205 20:17:10.217234 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2ba29d6-90e6-42af-a396-96a68a2d1b31-installation-pull-secrets\") pod \"image-registry-66df7c8f76-k9sjb\" (UID: \"e2ba29d6-90e6-42af-a396-96a68a2d1b31\") " pod="openshift-image-registry/image-registry-66df7c8f76-k9sjb" Dec 05 20:17:10 crc kubenswrapper[4904]: I1205 20:17:10.217284 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2ba29d6-90e6-42af-a396-96a68a2d1b31-trusted-ca\") pod \"image-registry-66df7c8f76-k9sjb\" (UID: \"e2ba29d6-90e6-42af-a396-96a68a2d1b31\") " pod="openshift-image-registry/image-registry-66df7c8f76-k9sjb" Dec 05 20:17:10 crc kubenswrapper[4904]: I1205 20:17:10.217311 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2ba29d6-90e6-42af-a396-96a68a2d1b31-bound-sa-token\") pod \"image-registry-66df7c8f76-k9sjb\" (UID: \"e2ba29d6-90e6-42af-a396-96a68a2d1b31\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-k9sjb" Dec 05 20:17:10 crc kubenswrapper[4904]: I1205 20:17:10.217415 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-k9sjb\" (UID: \"e2ba29d6-90e6-42af-a396-96a68a2d1b31\") " pod="openshift-image-registry/image-registry-66df7c8f76-k9sjb" Dec 05 20:17:10 crc kubenswrapper[4904]: I1205 20:17:10.217462 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2ba29d6-90e6-42af-a396-96a68a2d1b31-ca-trust-extracted\") pod \"image-registry-66df7c8f76-k9sjb\" (UID: \"e2ba29d6-90e6-42af-a396-96a68a2d1b31\") " pod="openshift-image-registry/image-registry-66df7c8f76-k9sjb" Dec 05 20:17:10 crc kubenswrapper[4904]: I1205 20:17:10.217477 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2ba29d6-90e6-42af-a396-96a68a2d1b31-registry-certificates\") pod \"image-registry-66df7c8f76-k9sjb\" (UID: \"e2ba29d6-90e6-42af-a396-96a68a2d1b31\") " pod="openshift-image-registry/image-registry-66df7c8f76-k9sjb" Dec 05 20:17:10 crc kubenswrapper[4904]: I1205 20:17:10.217655 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pclnx\" (UniqueName: \"kubernetes.io/projected/e2ba29d6-90e6-42af-a396-96a68a2d1b31-kube-api-access-pclnx\") pod \"image-registry-66df7c8f76-k9sjb\" (UID: \"e2ba29d6-90e6-42af-a396-96a68a2d1b31\") " pod="openshift-image-registry/image-registry-66df7c8f76-k9sjb" Dec 05 20:17:10 crc kubenswrapper[4904]: I1205 20:17:10.238434 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-k9sjb\" (UID: \"e2ba29d6-90e6-42af-a396-96a68a2d1b31\") " pod="openshift-image-registry/image-registry-66df7c8f76-k9sjb" Dec 05 20:17:10 crc kubenswrapper[4904]: I1205 20:17:10.318534 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2ba29d6-90e6-42af-a396-96a68a2d1b31-ca-trust-extracted\") pod \"image-registry-66df7c8f76-k9sjb\" (UID: \"e2ba29d6-90e6-42af-a396-96a68a2d1b31\") " pod="openshift-image-registry/image-registry-66df7c8f76-k9sjb" Dec 05 20:17:10 crc kubenswrapper[4904]: I1205 20:17:10.318595 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2ba29d6-90e6-42af-a396-96a68a2d1b31-registry-certificates\") pod \"image-registry-66df7c8f76-k9sjb\" (UID: \"e2ba29d6-90e6-42af-a396-96a68a2d1b31\") " pod="openshift-image-registry/image-registry-66df7c8f76-k9sjb" Dec 05 20:17:10 crc kubenswrapper[4904]: I1205 20:17:10.318676 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pclnx\" (UniqueName: \"kubernetes.io/projected/e2ba29d6-90e6-42af-a396-96a68a2d1b31-kube-api-access-pclnx\") pod \"image-registry-66df7c8f76-k9sjb\" (UID: \"e2ba29d6-90e6-42af-a396-96a68a2d1b31\") " pod="openshift-image-registry/image-registry-66df7c8f76-k9sjb" Dec 05 
20:17:10 crc kubenswrapper[4904]: I1205 20:17:10.318708 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2ba29d6-90e6-42af-a396-96a68a2d1b31-registry-tls\") pod \"image-registry-66df7c8f76-k9sjb\" (UID: \"e2ba29d6-90e6-42af-a396-96a68a2d1b31\") " pod="openshift-image-registry/image-registry-66df7c8f76-k9sjb" Dec 05 20:17:10 crc kubenswrapper[4904]: I1205 20:17:10.318742 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2ba29d6-90e6-42af-a396-96a68a2d1b31-installation-pull-secrets\") pod \"image-registry-66df7c8f76-k9sjb\" (UID: \"e2ba29d6-90e6-42af-a396-96a68a2d1b31\") " pod="openshift-image-registry/image-registry-66df7c8f76-k9sjb" Dec 05 20:17:10 crc kubenswrapper[4904]: I1205 20:17:10.318782 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2ba29d6-90e6-42af-a396-96a68a2d1b31-trusted-ca\") pod \"image-registry-66df7c8f76-k9sjb\" (UID: \"e2ba29d6-90e6-42af-a396-96a68a2d1b31\") " pod="openshift-image-registry/image-registry-66df7c8f76-k9sjb" Dec 05 20:17:10 crc kubenswrapper[4904]: I1205 20:17:10.318856 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2ba29d6-90e6-42af-a396-96a68a2d1b31-bound-sa-token\") pod \"image-registry-66df7c8f76-k9sjb\" (UID: \"e2ba29d6-90e6-42af-a396-96a68a2d1b31\") " pod="openshift-image-registry/image-registry-66df7c8f76-k9sjb" Dec 05 20:17:10 crc kubenswrapper[4904]: I1205 20:17:10.319153 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2ba29d6-90e6-42af-a396-96a68a2d1b31-ca-trust-extracted\") pod \"image-registry-66df7c8f76-k9sjb\" (UID: \"e2ba29d6-90e6-42af-a396-96a68a2d1b31\") " pod="openshift-image-registry/image-registry-66df7c8f76-k9sjb" Dec 05 20:17:10 crc kubenswrapper[4904]: I1205 20:17:10.320003 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2ba29d6-90e6-42af-a396-96a68a2d1b31-registry-certificates\") pod \"image-registry-66df7c8f76-k9sjb\" (UID: \"e2ba29d6-90e6-42af-a396-96a68a2d1b31\") " pod="openshift-image-registry/image-registry-66df7c8f76-k9sjb" Dec 05 20:17:10 crc kubenswrapper[4904]: I1205 20:17:10.320804 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2ba29d6-90e6-42af-a396-96a68a2d1b31-trusted-ca\") pod \"image-registry-66df7c8f76-k9sjb\" (UID: \"e2ba29d6-90e6-42af-a396-96a68a2d1b31\") " pod="openshift-image-registry/image-registry-66df7c8f76-k9sjb" Dec 05 20:17:10 crc kubenswrapper[4904]: I1205 20:17:10.324862 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2ba29d6-90e6-42af-a396-96a68a2d1b31-installation-pull-secrets\") pod \"image-registry-66df7c8f76-k9sjb\" (UID: \"e2ba29d6-90e6-42af-a396-96a68a2d1b31\") " pod="openshift-image-registry/image-registry-66df7c8f76-k9sjb" Dec 05 20:17:10 crc kubenswrapper[4904]: I1205 20:17:10.331720 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2ba29d6-90e6-42af-a396-96a68a2d1b31-registry-tls\") pod 
\"image-registry-66df7c8f76-k9sjb\" (UID: \"e2ba29d6-90e6-42af-a396-96a68a2d1b31\") " pod="openshift-image-registry/image-registry-66df7c8f76-k9sjb" Dec 05 20:17:10 crc kubenswrapper[4904]: I1205 20:17:10.339012 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2ba29d6-90e6-42af-a396-96a68a2d1b31-bound-sa-token\") pod \"image-registry-66df7c8f76-k9sjb\" (UID: \"e2ba29d6-90e6-42af-a396-96a68a2d1b31\") " pod="openshift-image-registry/image-registry-66df7c8f76-k9sjb" Dec 05 20:17:10 crc kubenswrapper[4904]: I1205 20:17:10.350402 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pclnx\" (UniqueName: \"kubernetes.io/projected/e2ba29d6-90e6-42af-a396-96a68a2d1b31-kube-api-access-pclnx\") pod \"image-registry-66df7c8f76-k9sjb\" (UID: \"e2ba29d6-90e6-42af-a396-96a68a2d1b31\") " pod="openshift-image-registry/image-registry-66df7c8f76-k9sjb" Dec 05 20:17:10 crc kubenswrapper[4904]: I1205 20:17:10.438460 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-k9sjb" Dec 05 20:17:10 crc kubenswrapper[4904]: I1205 20:17:10.855012 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-k9sjb"] Dec 05 20:17:11 crc kubenswrapper[4904]: I1205 20:17:11.707394 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-k9sjb" event={"ID":"e2ba29d6-90e6-42af-a396-96a68a2d1b31","Type":"ContainerStarted","Data":"3c21c199e061e6301eb8bba8fbcaca5239d8efea5808891b6e1cebd7934b97b6"} Dec 05 20:17:11 crc kubenswrapper[4904]: I1205 20:17:11.707737 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-k9sjb" event={"ID":"e2ba29d6-90e6-42af-a396-96a68a2d1b31","Type":"ContainerStarted","Data":"4d0ab80bea20de3e74a69e4241ae6786df8d020397daea6ceff61f7bb942798a"} Dec 05 20:17:11 crc kubenswrapper[4904]: I1205 20:17:11.708736 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-k9sjb" Dec 05 20:17:11 crc kubenswrapper[4904]: I1205 20:17:11.731413 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-k9sjb" podStartSLOduration=1.731388613 podStartE2EDuration="1.731388613s" podCreationTimestamp="2025-12-05 20:17:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:17:11.726554529 +0000 UTC m=+330.537770718" watchObservedRunningTime="2025-12-05 20:17:11.731388613 +0000 UTC m=+330.542604732" Dec 05 20:17:30 crc kubenswrapper[4904]: I1205 20:17:30.446185 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-k9sjb" Dec 05 20:17:30 crc kubenswrapper[4904]: I1205 20:17:30.512087 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hdk86"] Dec 05 20:17:43 crc kubenswrapper[4904]: I1205 20:17:43.884133 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vjj47"] Dec 05 20:17:43 crc kubenswrapper[4904]: I1205 20:17:43.890119 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vjj47" 
podUID="9cff387b-9124-46db-8d06-5cb5839e0a12" containerName="registry-server" containerID="cri-o://3857454a603debc7ed441d64d6ea69c0a82233e06ca0962436b9b55840218031" gracePeriod=30 Dec 05 20:17:43 crc kubenswrapper[4904]: I1205 20:17:43.902463 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gpt4n"] Dec 05 20:17:43 crc kubenswrapper[4904]: I1205 20:17:43.909363 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gpt4n" podUID="d5752678-2a85-49a8-b6d4-63b4adb96277" containerName="registry-server" containerID="cri-o://12f899828ea0dba6e2cc29abd341150ca72d34bf99f980d1679501bb2886a658" gracePeriod=30 Dec 05 20:17:43 crc kubenswrapper[4904]: I1205 20:17:43.912235 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5srbq"] Dec 05 20:17:43 crc kubenswrapper[4904]: I1205 20:17:43.912573 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-5srbq" podUID="6aa0efa0-bd09-4388-b42c-11550e28712e" containerName="marketplace-operator" containerID="cri-o://1c74bddebe6047618f860c473887d26acb21c3f134d3f98bb84140dd9169acde" gracePeriod=30 Dec 05 20:17:43 crc kubenswrapper[4904]: I1205 20:17:43.916012 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vd9gt"] Dec 05 20:17:43 crc kubenswrapper[4904]: I1205 20:17:43.916383 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vd9gt" podUID="757462be-80d7-44c2-a193-48f78ac5b80e" containerName="registry-server" containerID="cri-o://187ddb95b4aed8be8728e5b1505ceccd045b34a03add4a4b80305497694410e9" gracePeriod=30 Dec 05 20:17:43 crc kubenswrapper[4904]: I1205 20:17:43.931367 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7wzh2"] Dec 05 20:17:43 crc kubenswrapper[4904]: I1205 20:17:43.932456 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7wzh2" Dec 05 20:17:43 crc kubenswrapper[4904]: I1205 20:17:43.947689 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jrg9c"] Dec 05 20:17:43 crc kubenswrapper[4904]: I1205 20:17:43.951635 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jrg9c" podUID="59e498ac-1ef4-4984-956a-f1d7614d5dba" containerName="registry-server" containerID="cri-o://f36add3a4b076caaf31ac77eb1777155323138e98b1c9556f976fdac3b900e59" gracePeriod=30 Dec 05 20:17:43 crc kubenswrapper[4904]: I1205 20:17:43.959659 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7wzh2"] Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.090435 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k244r\" (UniqueName: \"kubernetes.io/projected/425f2b0f-3e5c-4db8-95f2-e0ae6581a443-kube-api-access-k244r\") pod \"marketplace-operator-79b997595-7wzh2\" (UID: \"425f2b0f-3e5c-4db8-95f2-e0ae6581a443\") " pod="openshift-marketplace/marketplace-operator-79b997595-7wzh2" Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.090483 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/425f2b0f-3e5c-4db8-95f2-e0ae6581a443-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7wzh2\" (UID: \"425f2b0f-3e5c-4db8-95f2-e0ae6581a443\") " pod="openshift-marketplace/marketplace-operator-79b997595-7wzh2" Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.090548 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/425f2b0f-3e5c-4db8-95f2-e0ae6581a443-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7wzh2\" (UID: \"425f2b0f-3e5c-4db8-95f2-e0ae6581a443\") " pod="openshift-marketplace/marketplace-operator-79b997595-7wzh2" Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.191889 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/425f2b0f-3e5c-4db8-95f2-e0ae6581a443-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7wzh2\" (UID: \"425f2b0f-3e5c-4db8-95f2-e0ae6581a443\") " pod="openshift-marketplace/marketplace-operator-79b997595-7wzh2" Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.191955 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k244r\" (UniqueName: \"kubernetes.io/projected/425f2b0f-3e5c-4db8-95f2-e0ae6581a443-kube-api-access-k244r\") pod \"marketplace-operator-79b997595-7wzh2\" (UID: \"425f2b0f-3e5c-4db8-95f2-e0ae6581a443\") " pod="openshift-marketplace/marketplace-operator-79b997595-7wzh2" Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.191979 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/425f2b0f-3e5c-4db8-95f2-e0ae6581a443-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7wzh2\" (UID: \"425f2b0f-3e5c-4db8-95f2-e0ae6581a443\") " pod="openshift-marketplace/marketplace-operator-79b997595-7wzh2" Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.193365 4904 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/425f2b0f-3e5c-4db8-95f2-e0ae6581a443-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7wzh2\" (UID: \"425f2b0f-3e5c-4db8-95f2-e0ae6581a443\") " pod="openshift-marketplace/marketplace-operator-79b997595-7wzh2"
Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.201900 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/425f2b0f-3e5c-4db8-95f2-e0ae6581a443-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7wzh2\" (UID: \"425f2b0f-3e5c-4db8-95f2-e0ae6581a443\") " pod="openshift-marketplace/marketplace-operator-79b997595-7wzh2"
Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.212897 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k244r\" (UniqueName: \"kubernetes.io/projected/425f2b0f-3e5c-4db8-95f2-e0ae6581a443-kube-api-access-k244r\") pod \"marketplace-operator-79b997595-7wzh2\" (UID: \"425f2b0f-3e5c-4db8-95f2-e0ae6581a443\") " pod="openshift-marketplace/marketplace-operator-79b997595-7wzh2"
Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.303145 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7wzh2"
Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.727261 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7wzh2"]
Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.761626 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jrg9c"
Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.841962 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vjj47"
Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.850271 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vd9gt"
Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.869790 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gpt4n"
Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.880994 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-5srbq"
Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.914539 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59e498ac-1ef4-4984-956a-f1d7614d5dba-utilities\") pod \"59e498ac-1ef4-4984-956a-f1d7614d5dba\" (UID: \"59e498ac-1ef4-4984-956a-f1d7614d5dba\") "
Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.914638 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59e498ac-1ef4-4984-956a-f1d7614d5dba-catalog-content\") pod \"59e498ac-1ef4-4984-956a-f1d7614d5dba\" (UID: \"59e498ac-1ef4-4984-956a-f1d7614d5dba\") "
Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.914675 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9z6g\" (UniqueName: \"kubernetes.io/projected/59e498ac-1ef4-4984-956a-f1d7614d5dba-kube-api-access-p9z6g\") pod \"59e498ac-1ef4-4984-956a-f1d7614d5dba\" (UID: \"59e498ac-1ef4-4984-956a-f1d7614d5dba\") "
Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.916992 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59e498ac-1ef4-4984-956a-f1d7614d5dba-utilities" (OuterVolumeSpecName: "utilities") pod "59e498ac-1ef4-4984-956a-f1d7614d5dba" (UID: "59e498ac-1ef4-4984-956a-f1d7614d5dba"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.929998 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59e498ac-1ef4-4984-956a-f1d7614d5dba-kube-api-access-p9z6g" (OuterVolumeSpecName: "kube-api-access-p9z6g") pod "59e498ac-1ef4-4984-956a-f1d7614d5dba" (UID: "59e498ac-1ef4-4984-956a-f1d7614d5dba"). InnerVolumeSpecName "kube-api-access-p9z6g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.945970 4904 generic.go:334] "Generic (PLEG): container finished" podID="9cff387b-9124-46db-8d06-5cb5839e0a12" containerID="3857454a603debc7ed441d64d6ea69c0a82233e06ca0962436b9b55840218031" exitCode=0
Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.946083 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vjj47" event={"ID":"9cff387b-9124-46db-8d06-5cb5839e0a12","Type":"ContainerDied","Data":"3857454a603debc7ed441d64d6ea69c0a82233e06ca0962436b9b55840218031"}
Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.946127 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vjj47" event={"ID":"9cff387b-9124-46db-8d06-5cb5839e0a12","Type":"ContainerDied","Data":"761b868471a5292b238d3864d425c1886e65246338f19b02771ecfeee8ffb7a5"}
Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.946147 4904 scope.go:117] "RemoveContainer" containerID="3857454a603debc7ed441d64d6ea69c0a82233e06ca0962436b9b55840218031"
Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.946303 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vjj47"
Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.948671 4904 generic.go:334] "Generic (PLEG): container finished" podID="6aa0efa0-bd09-4388-b42c-11550e28712e" containerID="1c74bddebe6047618f860c473887d26acb21c3f134d3f98bb84140dd9169acde" exitCode=0
Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.948774 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-5srbq"
Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.948717 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5srbq" event={"ID":"6aa0efa0-bd09-4388-b42c-11550e28712e","Type":"ContainerDied","Data":"1c74bddebe6047618f860c473887d26acb21c3f134d3f98bb84140dd9169acde"}
Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.948852 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5srbq" event={"ID":"6aa0efa0-bd09-4388-b42c-11550e28712e","Type":"ContainerDied","Data":"d0f777344c89dfe481cdd7016f8f2826e1366d7ea172b722ccbd9170fe01063d"}
Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.955003 4904 generic.go:334] "Generic (PLEG): container finished" podID="d5752678-2a85-49a8-b6d4-63b4adb96277" containerID="12f899828ea0dba6e2cc29abd341150ca72d34bf99f980d1679501bb2886a658" exitCode=0
Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.955086 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gpt4n" event={"ID":"d5752678-2a85-49a8-b6d4-63b4adb96277","Type":"ContainerDied","Data":"12f899828ea0dba6e2cc29abd341150ca72d34bf99f980d1679501bb2886a658"}
Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.955112 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gpt4n" event={"ID":"d5752678-2a85-49a8-b6d4-63b4adb96277","Type":"ContainerDied","Data":"9eabb38cb62231225654c17f80fdc3d5f0d0ccfbbab74c2f46ceea3a1e95bee7"}
Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.955216 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gpt4n"
Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.959866 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7wzh2" event={"ID":"425f2b0f-3e5c-4db8-95f2-e0ae6581a443","Type":"ContainerStarted","Data":"8336deb97302adeaec4a3ff4b60cf8ec827b946150a8f265b995727ca0813845"}
Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.959897 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7wzh2" event={"ID":"425f2b0f-3e5c-4db8-95f2-e0ae6581a443","Type":"ContainerStarted","Data":"2d4854c83694e6a1335c0e6b0c7017055575f6e163c1df1d216ca55dd7122fc3"}
Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.960396 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-7wzh2"
Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.969614 4904 scope.go:117] "RemoveContainer" containerID="fa7b902a3f95d490bd72cc5c5a624342cd0c12293084bb3758bdb0fe8f222b37"
Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.970908 4904 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-7wzh2 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.64:8080/healthz\": dial tcp 10.217.0.64:8080: connect: connection refused" start-of-body=
Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.970960 4904 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-7wzh2" podUID="425f2b0f-3e5c-4db8-95f2-e0ae6581a443" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.64:8080/healthz\": dial tcp 10.217.0.64:8080: connect: connection refused"
Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.980889 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-7wzh2" podStartSLOduration=1.98086605 podStartE2EDuration="1.98086605s" podCreationTimestamp="2025-12-05 20:17:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:17:44.980827739 +0000 UTC m=+363.792043858" watchObservedRunningTime="2025-12-05 20:17:44.98086605 +0000 UTC m=+363.792082169"
Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.984384 4904 generic.go:334] "Generic (PLEG): container finished" podID="59e498ac-1ef4-4984-956a-f1d7614d5dba" containerID="f36add3a4b076caaf31ac77eb1777155323138e98b1c9556f976fdac3b900e59" exitCode=0
Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.984466 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jrg9c" event={"ID":"59e498ac-1ef4-4984-956a-f1d7614d5dba","Type":"ContainerDied","Data":"f36add3a4b076caaf31ac77eb1777155323138e98b1c9556f976fdac3b900e59"}
Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.984500 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jrg9c" event={"ID":"59e498ac-1ef4-4984-956a-f1d7614d5dba","Type":"ContainerDied","Data":"c74ed644ca8a064934a2bd6b76abb866a8121827effc2f7f2d9c1a087d500723"}
Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.984569 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jrg9c"
Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.994575 4904 generic.go:334] "Generic (PLEG): container finished" podID="757462be-80d7-44c2-a193-48f78ac5b80e" containerID="187ddb95b4aed8be8728e5b1505ceccd045b34a03add4a4b80305497694410e9" exitCode=0
Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.994626 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vd9gt" event={"ID":"757462be-80d7-44c2-a193-48f78ac5b80e","Type":"ContainerDied","Data":"187ddb95b4aed8be8728e5b1505ceccd045b34a03add4a4b80305497694410e9"}
Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.994656 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vd9gt" event={"ID":"757462be-80d7-44c2-a193-48f78ac5b80e","Type":"ContainerDied","Data":"bf05d4f7d0ccdc3c6a204bdf693d7526c24fd6929d81ee9728df586ffd030e1a"}
Dec 05 20:17:44 crc kubenswrapper[4904]: I1205 20:17:44.994728 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vd9gt"
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.016311 4904 scope.go:117] "RemoveContainer" containerID="fed9a1323a8094d19e3e59a2e428566a305240b1eb24668ff20a54d0a02860ec"
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.016313 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6aa0efa0-bd09-4388-b42c-11550e28712e-marketplace-operator-metrics\") pod \"6aa0efa0-bd09-4388-b42c-11550e28712e\" (UID: \"6aa0efa0-bd09-4388-b42c-11550e28712e\") "
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.016516 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/757462be-80d7-44c2-a193-48f78ac5b80e-utilities\") pod \"757462be-80d7-44c2-a193-48f78ac5b80e\" (UID: \"757462be-80d7-44c2-a193-48f78ac5b80e\") "
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.016583 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cff387b-9124-46db-8d06-5cb5839e0a12-catalog-content\") pod \"9cff387b-9124-46db-8d06-5cb5839e0a12\" (UID: \"9cff387b-9124-46db-8d06-5cb5839e0a12\") "
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.016631 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/757462be-80d7-44c2-a193-48f78ac5b80e-catalog-content\") pod \"757462be-80d7-44c2-a193-48f78ac5b80e\" (UID: \"757462be-80d7-44c2-a193-48f78ac5b80e\") "
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.016696 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pwcg\" (UniqueName: \"kubernetes.io/projected/9cff387b-9124-46db-8d06-5cb5839e0a12-kube-api-access-2pwcg\") pod \"9cff387b-9124-46db-8d06-5cb5839e0a12\" (UID: \"9cff387b-9124-46db-8d06-5cb5839e0a12\") "
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.017145 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5752678-2a85-49a8-b6d4-63b4adb96277-utilities\") pod \"d5752678-2a85-49a8-b6d4-63b4adb96277\" (UID: \"d5752678-2a85-49a8-b6d4-63b4adb96277\") "
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.017236 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8zpj\" (UniqueName: \"kubernetes.io/projected/d5752678-2a85-49a8-b6d4-63b4adb96277-kube-api-access-g8zpj\") pod \"d5752678-2a85-49a8-b6d4-63b4adb96277\" (UID: \"d5752678-2a85-49a8-b6d4-63b4adb96277\") "
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.017269 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsvkd\" (UniqueName: \"kubernetes.io/projected/6aa0efa0-bd09-4388-b42c-11550e28712e-kube-api-access-vsvkd\") pod \"6aa0efa0-bd09-4388-b42c-11550e28712e\" (UID: \"6aa0efa0-bd09-4388-b42c-11550e28712e\") "
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.017299 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnmw9\" (UniqueName: \"kubernetes.io/projected/757462be-80d7-44c2-a193-48f78ac5b80e-kube-api-access-wnmw9\") pod \"757462be-80d7-44c2-a193-48f78ac5b80e\" (UID: \"757462be-80d7-44c2-a193-48f78ac5b80e\") "
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.017325 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cff387b-9124-46db-8d06-5cb5839e0a12-utilities\") pod \"9cff387b-9124-46db-8d06-5cb5839e0a12\" (UID: \"9cff387b-9124-46db-8d06-5cb5839e0a12\") "
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.017363 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5752678-2a85-49a8-b6d4-63b4adb96277-catalog-content\") pod \"d5752678-2a85-49a8-b6d4-63b4adb96277\" (UID: \"d5752678-2a85-49a8-b6d4-63b4adb96277\") "
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.017385 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6aa0efa0-bd09-4388-b42c-11550e28712e-marketplace-trusted-ca\") pod \"6aa0efa0-bd09-4388-b42c-11550e28712e\" (UID: \"6aa0efa0-bd09-4388-b42c-11550e28712e\") "
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.017609 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/757462be-80d7-44c2-a193-48f78ac5b80e-utilities" (OuterVolumeSpecName: "utilities") pod "757462be-80d7-44c2-a193-48f78ac5b80e" (UID: "757462be-80d7-44c2-a193-48f78ac5b80e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.018159 4904 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/757462be-80d7-44c2-a193-48f78ac5b80e-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.018186 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9z6g\" (UniqueName: \"kubernetes.io/projected/59e498ac-1ef4-4984-956a-f1d7614d5dba-kube-api-access-p9z6g\") on node \"crc\" DevicePath \"\""
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.018202 4904 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59e498ac-1ef4-4984-956a-f1d7614d5dba-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.018309 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6aa0efa0-bd09-4388-b42c-11550e28712e-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "6aa0efa0-bd09-4388-b42c-11550e28712e" (UID: "6aa0efa0-bd09-4388-b42c-11550e28712e"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.018863 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cff387b-9124-46db-8d06-5cb5839e0a12-utilities" (OuterVolumeSpecName: "utilities") pod "9cff387b-9124-46db-8d06-5cb5839e0a12" (UID: "9cff387b-9124-46db-8d06-5cb5839e0a12"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.019510 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5752678-2a85-49a8-b6d4-63b4adb96277-utilities" (OuterVolumeSpecName: "utilities") pod "d5752678-2a85-49a8-b6d4-63b4adb96277" (UID: "d5752678-2a85-49a8-b6d4-63b4adb96277"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.020366 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cff387b-9124-46db-8d06-5cb5839e0a12-kube-api-access-2pwcg" (OuterVolumeSpecName: "kube-api-access-2pwcg") pod "9cff387b-9124-46db-8d06-5cb5839e0a12" (UID: "9cff387b-9124-46db-8d06-5cb5839e0a12"). InnerVolumeSpecName "kube-api-access-2pwcg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.022358 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5752678-2a85-49a8-b6d4-63b4adb96277-kube-api-access-g8zpj" (OuterVolumeSpecName: "kube-api-access-g8zpj") pod "d5752678-2a85-49a8-b6d4-63b4adb96277" (UID: "d5752678-2a85-49a8-b6d4-63b4adb96277"). InnerVolumeSpecName "kube-api-access-g8zpj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.022656 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6aa0efa0-bd09-4388-b42c-11550e28712e-kube-api-access-vsvkd" (OuterVolumeSpecName: "kube-api-access-vsvkd") pod "6aa0efa0-bd09-4388-b42c-11550e28712e" (UID: "6aa0efa0-bd09-4388-b42c-11550e28712e"). InnerVolumeSpecName "kube-api-access-vsvkd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.024804 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/757462be-80d7-44c2-a193-48f78ac5b80e-kube-api-access-wnmw9" (OuterVolumeSpecName: "kube-api-access-wnmw9") pod "757462be-80d7-44c2-a193-48f78ac5b80e" (UID: "757462be-80d7-44c2-a193-48f78ac5b80e"). InnerVolumeSpecName "kube-api-access-wnmw9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.031291 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6aa0efa0-bd09-4388-b42c-11550e28712e-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "6aa0efa0-bd09-4388-b42c-11550e28712e" (UID: "6aa0efa0-bd09-4388-b42c-11550e28712e"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.059933 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/757462be-80d7-44c2-a193-48f78ac5b80e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "757462be-80d7-44c2-a193-48f78ac5b80e" (UID: "757462be-80d7-44c2-a193-48f78ac5b80e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.089212 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59e498ac-1ef4-4984-956a-f1d7614d5dba-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "59e498ac-1ef4-4984-956a-f1d7614d5dba" (UID: "59e498ac-1ef4-4984-956a-f1d7614d5dba"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.092745 4904 scope.go:117] "RemoveContainer" containerID="3857454a603debc7ed441d64d6ea69c0a82233e06ca0962436b9b55840218031"
Dec 05 20:17:45 crc kubenswrapper[4904]: E1205 20:17:45.093520 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3857454a603debc7ed441d64d6ea69c0a82233e06ca0962436b9b55840218031\": container with ID starting with 3857454a603debc7ed441d64d6ea69c0a82233e06ca0962436b9b55840218031 not found: ID does not exist" containerID="3857454a603debc7ed441d64d6ea69c0a82233e06ca0962436b9b55840218031"
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.093581 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3857454a603debc7ed441d64d6ea69c0a82233e06ca0962436b9b55840218031"} err="failed to get container status \"3857454a603debc7ed441d64d6ea69c0a82233e06ca0962436b9b55840218031\": rpc error: code = NotFound desc = could not find container \"3857454a603debc7ed441d64d6ea69c0a82233e06ca0962436b9b55840218031\": container with ID starting with 3857454a603debc7ed441d64d6ea69c0a82233e06ca0962436b9b55840218031 not found: ID does not exist"
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.093633 4904 scope.go:117] "RemoveContainer" containerID="fa7b902a3f95d490bd72cc5c5a624342cd0c12293084bb3758bdb0fe8f222b37"
Dec 05 20:17:45 crc kubenswrapper[4904]: E1205 20:17:45.095258 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa7b902a3f95d490bd72cc5c5a624342cd0c12293084bb3758bdb0fe8f222b37\": container with ID starting with fa7b902a3f95d490bd72cc5c5a624342cd0c12293084bb3758bdb0fe8f222b37 not found: ID does not exist" containerID="fa7b902a3f95d490bd72cc5c5a624342cd0c12293084bb3758bdb0fe8f222b37"
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.095315 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa7b902a3f95d490bd72cc5c5a624342cd0c12293084bb3758bdb0fe8f222b37"} err="failed to get container status \"fa7b902a3f95d490bd72cc5c5a624342cd0c12293084bb3758bdb0fe8f222b37\": rpc error: code = NotFound desc = could not find container \"fa7b902a3f95d490bd72cc5c5a624342cd0c12293084bb3758bdb0fe8f222b37\": container with ID starting with fa7b902a3f95d490bd72cc5c5a624342cd0c12293084bb3758bdb0fe8f222b37 not found: ID does not exist"
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.095349 4904 scope.go:117] "RemoveContainer" containerID="fed9a1323a8094d19e3e59a2e428566a305240b1eb24668ff20a54d0a02860ec"
Dec 05 20:17:45 crc kubenswrapper[4904]: E1205 20:17:45.095662 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fed9a1323a8094d19e3e59a2e428566a305240b1eb24668ff20a54d0a02860ec\": container with ID starting with fed9a1323a8094d19e3e59a2e428566a305240b1eb24668ff20a54d0a02860ec not found: ID does not exist" containerID="fed9a1323a8094d19e3e59a2e428566a305240b1eb24668ff20a54d0a02860ec"
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.095689 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fed9a1323a8094d19e3e59a2e428566a305240b1eb24668ff20a54d0a02860ec"} err="failed to get container status \"fed9a1323a8094d19e3e59a2e428566a305240b1eb24668ff20a54d0a02860ec\": rpc error: code = NotFound desc = could not find container \"fed9a1323a8094d19e3e59a2e428566a305240b1eb24668ff20a54d0a02860ec\": container with ID starting with fed9a1323a8094d19e3e59a2e428566a305240b1eb24668ff20a54d0a02860ec not found: ID does not exist"
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.095703 4904 scope.go:117] "RemoveContainer" containerID="1c74bddebe6047618f860c473887d26acb21c3f134d3f98bb84140dd9169acde"
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.115046 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cff387b-9124-46db-8d06-5cb5839e0a12-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9cff387b-9124-46db-8d06-5cb5839e0a12" (UID: "9cff387b-9124-46db-8d06-5cb5839e0a12"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.115932 4904 scope.go:117] "RemoveContainer" containerID="8a28c10c2ce3fae8fdf507eda181000a380eafffe5debb71086e40ddaeedfb0b"
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.118668 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8zpj\" (UniqueName: \"kubernetes.io/projected/d5752678-2a85-49a8-b6d4-63b4adb96277-kube-api-access-g8zpj\") on node \"crc\" DevicePath \"\""
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.118724 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsvkd\" (UniqueName: \"kubernetes.io/projected/6aa0efa0-bd09-4388-b42c-11550e28712e-kube-api-access-vsvkd\") on node \"crc\" DevicePath \"\""
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.118738 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnmw9\" (UniqueName: \"kubernetes.io/projected/757462be-80d7-44c2-a193-48f78ac5b80e-kube-api-access-wnmw9\") on node \"crc\" DevicePath \"\""
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.118752 4904 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cff387b-9124-46db-8d06-5cb5839e0a12-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.118763 4904 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6aa0efa0-bd09-4388-b42c-11550e28712e-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.118776 4904 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6aa0efa0-bd09-4388-b42c-11550e28712e-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.118789 4904 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59e498ac-1ef4-4984-956a-f1d7614d5dba-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.118801 4904 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cff387b-9124-46db-8d06-5cb5839e0a12-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.118814 4904 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/757462be-80d7-44c2-a193-48f78ac5b80e-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.118826 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pwcg\" (UniqueName: \"kubernetes.io/projected/9cff387b-9124-46db-8d06-5cb5839e0a12-kube-api-access-2pwcg\") on node \"crc\" DevicePath \"\""
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.118838 4904 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5752678-2a85-49a8-b6d4-63b4adb96277-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.120453 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5752678-2a85-49a8-b6d4-63b4adb96277-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d5752678-2a85-49a8-b6d4-63b4adb96277" (UID: "d5752678-2a85-49a8-b6d4-63b4adb96277"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.133821 4904 scope.go:117] "RemoveContainer" containerID="1c74bddebe6047618f860c473887d26acb21c3f134d3f98bb84140dd9169acde"
Dec 05 20:17:45 crc kubenswrapper[4904]: E1205 20:17:45.136301 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c74bddebe6047618f860c473887d26acb21c3f134d3f98bb84140dd9169acde\": container with ID starting with 1c74bddebe6047618f860c473887d26acb21c3f134d3f98bb84140dd9169acde not found: ID does not exist" containerID="1c74bddebe6047618f860c473887d26acb21c3f134d3f98bb84140dd9169acde"
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.136333 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c74bddebe6047618f860c473887d26acb21c3f134d3f98bb84140dd9169acde"} err="failed to get container status \"1c74bddebe6047618f860c473887d26acb21c3f134d3f98bb84140dd9169acde\": rpc error: code = NotFound desc = could not find container \"1c74bddebe6047618f860c473887d26acb21c3f134d3f98bb84140dd9169acde\": container with ID starting with 1c74bddebe6047618f860c473887d26acb21c3f134d3f98bb84140dd9169acde not found: ID does not exist"
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.136356 4904 scope.go:117] "RemoveContainer" containerID="8a28c10c2ce3fae8fdf507eda181000a380eafffe5debb71086e40ddaeedfb0b"
Dec 05 20:17:45 crc kubenswrapper[4904]: E1205 20:17:45.136931 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a28c10c2ce3fae8fdf507eda181000a380eafffe5debb71086e40ddaeedfb0b\": container with ID starting with 8a28c10c2ce3fae8fdf507eda181000a380eafffe5debb71086e40ddaeedfb0b not found: ID does not exist" containerID="8a28c10c2ce3fae8fdf507eda181000a380eafffe5debb71086e40ddaeedfb0b"
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.136975 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a28c10c2ce3fae8fdf507eda181000a380eafffe5debb71086e40ddaeedfb0b"} err="failed to get container status \"8a28c10c2ce3fae8fdf507eda181000a380eafffe5debb71086e40ddaeedfb0b\": rpc error: code = NotFound desc = could not find container \"8a28c10c2ce3fae8fdf507eda181000a380eafffe5debb71086e40ddaeedfb0b\": container with ID starting with 8a28c10c2ce3fae8fdf507eda181000a380eafffe5debb71086e40ddaeedfb0b not found: ID does not exist"
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.137013 4904 scope.go:117] "RemoveContainer" containerID="12f899828ea0dba6e2cc29abd341150ca72d34bf99f980d1679501bb2886a658"
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.151420 4904 scope.go:117] "RemoveContainer" containerID="8698adf4ac48650baf91fc4a63c4a667581bde2696ad6717185d30d7252cf2e8"
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.168660 4904 scope.go:117] "RemoveContainer" containerID="67c70e8bb9909c5560c46b3094e0e51c6e886610ea01ecb42ec95e2f2f17a81f"
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.189992 4904 scope.go:117] "RemoveContainer" containerID="12f899828ea0dba6e2cc29abd341150ca72d34bf99f980d1679501bb2886a658"
Dec 05 20:17:45 crc kubenswrapper[4904]: E1205 20:17:45.190570 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12f899828ea0dba6e2cc29abd341150ca72d34bf99f980d1679501bb2886a658\": container with ID starting with 12f899828ea0dba6e2cc29abd341150ca72d34bf99f980d1679501bb2886a658 not found: ID does not exist" containerID="12f899828ea0dba6e2cc29abd341150ca72d34bf99f980d1679501bb2886a658"
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.190610 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12f899828ea0dba6e2cc29abd341150ca72d34bf99f980d1679501bb2886a658"} err="failed to get container status \"12f899828ea0dba6e2cc29abd341150ca72d34bf99f980d1679501bb2886a658\": rpc error: code = NotFound desc = could not find container \"12f899828ea0dba6e2cc29abd341150ca72d34bf99f980d1679501bb2886a658\": container with ID starting with 12f899828ea0dba6e2cc29abd341150ca72d34bf99f980d1679501bb2886a658 not found: ID does not exist"
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.190638 4904 scope.go:117] "RemoveContainer" containerID="8698adf4ac48650baf91fc4a63c4a667581bde2696ad6717185d30d7252cf2e8"
Dec 05 20:17:45 crc kubenswrapper[4904]: E1205 20:17:45.194389 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8698adf4ac48650baf91fc4a63c4a667581bde2696ad6717185d30d7252cf2e8\": container with ID starting with 8698adf4ac48650baf91fc4a63c4a667581bde2696ad6717185d30d7252cf2e8 not found: ID does not exist" containerID="8698adf4ac48650baf91fc4a63c4a667581bde2696ad6717185d30d7252cf2e8"
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.194448 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8698adf4ac48650baf91fc4a63c4a667581bde2696ad6717185d30d7252cf2e8"} err="failed to get container status \"8698adf4ac48650baf91fc4a63c4a667581bde2696ad6717185d30d7252cf2e8\": rpc error: code = NotFound desc = could not find container \"8698adf4ac48650baf91fc4a63c4a667581bde2696ad6717185d30d7252cf2e8\": container with ID starting with 8698adf4ac48650baf91fc4a63c4a667581bde2696ad6717185d30d7252cf2e8 not found: ID does not exist"
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.194487 4904 scope.go:117] "RemoveContainer" containerID="67c70e8bb9909c5560c46b3094e0e51c6e886610ea01ecb42ec95e2f2f17a81f"
Dec 05 20:17:45 crc kubenswrapper[4904]: E1205 20:17:45.195274 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67c70e8bb9909c5560c46b3094e0e51c6e886610ea01ecb42ec95e2f2f17a81f\": container with ID starting with 67c70e8bb9909c5560c46b3094e0e51c6e886610ea01ecb42ec95e2f2f17a81f not found: ID does not exist" containerID="67c70e8bb9909c5560c46b3094e0e51c6e886610ea01ecb42ec95e2f2f17a81f"
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.195303 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67c70e8bb9909c5560c46b3094e0e51c6e886610ea01ecb42ec95e2f2f17a81f"} err="failed to get container status \"67c70e8bb9909c5560c46b3094e0e51c6e886610ea01ecb42ec95e2f2f17a81f\": rpc error: code = NotFound desc = could not find container \"67c70e8bb9909c5560c46b3094e0e51c6e886610ea01ecb42ec95e2f2f17a81f\": container with ID starting with 67c70e8bb9909c5560c46b3094e0e51c6e886610ea01ecb42ec95e2f2f17a81f not found: ID does not exist"
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.195546 4904 scope.go:117] "RemoveContainer" containerID="f36add3a4b076caaf31ac77eb1777155323138e98b1c9556f976fdac3b900e59"
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.212707 4904 scope.go:117] "RemoveContainer" containerID="8332572732ecea8e6f6d80aaf0125377b310c24871765e16a690097631a1586f"
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.219636 4904 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5752678-2a85-49a8-b6d4-63b4adb96277-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.234187 4904 scope.go:117] "RemoveContainer" containerID="b3f9a3f22636dcf8aee32418755d108a94d8ae2730e32beabfe1a484168679bd"
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.248157 4904 scope.go:117] "RemoveContainer" containerID="f36add3a4b076caaf31ac77eb1777155323138e98b1c9556f976fdac3b900e59"
Dec 05 20:17:45 crc kubenswrapper[4904]: E1205 20:17:45.248741 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f36add3a4b076caaf31ac77eb1777155323138e98b1c9556f976fdac3b900e59\": container with ID starting with f36add3a4b076caaf31ac77eb1777155323138e98b1c9556f976fdac3b900e59 not found: ID does not exist" containerID="f36add3a4b076caaf31ac77eb1777155323138e98b1c9556f976fdac3b900e59"
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.248791 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f36add3a4b076caaf31ac77eb1777155323138e98b1c9556f976fdac3b900e59"} err="failed to get container status \"f36add3a4b076caaf31ac77eb1777155323138e98b1c9556f976fdac3b900e59\": rpc error: code = NotFound desc = could not find container \"f36add3a4b076caaf31ac77eb1777155323138e98b1c9556f976fdac3b900e59\": container with ID starting with f36add3a4b076caaf31ac77eb1777155323138e98b1c9556f976fdac3b900e59 not found: ID does not exist"
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.248827 4904 scope.go:117] "RemoveContainer" containerID="8332572732ecea8e6f6d80aaf0125377b310c24871765e16a690097631a1586f"
Dec 05 20:17:45 crc kubenswrapper[4904]: E1205 20:17:45.249240 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8332572732ecea8e6f6d80aaf0125377b310c24871765e16a690097631a1586f\": container with ID starting with 8332572732ecea8e6f6d80aaf0125377b310c24871765e16a690097631a1586f not found: ID does not exist" containerID="8332572732ecea8e6f6d80aaf0125377b310c24871765e16a690097631a1586f"
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.249271 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8332572732ecea8e6f6d80aaf0125377b310c24871765e16a690097631a1586f"} err="failed to get container status \"8332572732ecea8e6f6d80aaf0125377b310c24871765e16a690097631a1586f\": rpc error: code = NotFound desc = could not find container \"8332572732ecea8e6f6d80aaf0125377b310c24871765e16a690097631a1586f\": container with ID starting with 8332572732ecea8e6f6d80aaf0125377b310c24871765e16a690097631a1586f not found: ID does not exist"
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.249296 4904 scope.go:117] "RemoveContainer" containerID="b3f9a3f22636dcf8aee32418755d108a94d8ae2730e32beabfe1a484168679bd"
Dec 05 20:17:45 crc kubenswrapper[4904]: E1205 20:17:45.249583 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3f9a3f22636dcf8aee32418755d108a94d8ae2730e32beabfe1a484168679bd\": container with ID starting with b3f9a3f22636dcf8aee32418755d108a94d8ae2730e32beabfe1a484168679bd not found: ID does not exist" containerID="b3f9a3f22636dcf8aee32418755d108a94d8ae2730e32beabfe1a484168679bd"
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.249611 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3f9a3f22636dcf8aee32418755d108a94d8ae2730e32beabfe1a484168679bd"} err="failed to get container status \"b3f9a3f22636dcf8aee32418755d108a94d8ae2730e32beabfe1a484168679bd\": rpc error: code = NotFound desc = could not find container \"b3f9a3f22636dcf8aee32418755d108a94d8ae2730e32beabfe1a484168679bd\": container with ID starting with b3f9a3f22636dcf8aee32418755d108a94d8ae2730e32beabfe1a484168679bd not found: ID does not exist"
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.249630 4904 scope.go:117] "RemoveContainer" containerID="187ddb95b4aed8be8728e5b1505ceccd045b34a03add4a4b80305497694410e9"
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.264907 4904 scope.go:117] "RemoveContainer" containerID="909e6d29ffcf7dea47082b5e36524c01b3a9b2228469f62cff1c9bbdc08c289e"
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.295620 4904 scope.go:117] "RemoveContainer" containerID="532faad960c4d43db743bace3a04e015fb7889426644934d7c28ae1e46064d97"
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.295817 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vjj47"]
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.305170 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vjj47"]
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.314088 4904 scope.go:117] "RemoveContainer" containerID="187ddb95b4aed8be8728e5b1505ceccd045b34a03add4a4b80305497694410e9"
Dec 05 20:17:45 crc kubenswrapper[4904]: E1205 20:17:45.314697 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"187ddb95b4aed8be8728e5b1505ceccd045b34a03add4a4b80305497694410e9\": container with ID starting with 187ddb95b4aed8be8728e5b1505ceccd045b34a03add4a4b80305497694410e9 not found: ID does not exist" containerID="187ddb95b4aed8be8728e5b1505ceccd045b34a03add4a4b80305497694410e9"
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.314747 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"187ddb95b4aed8be8728e5b1505ceccd045b34a03add4a4b80305497694410e9"} err="failed to get container status \"187ddb95b4aed8be8728e5b1505ceccd045b34a03add4a4b80305497694410e9\": rpc error: code = NotFound desc = could not find container \"187ddb95b4aed8be8728e5b1505ceccd045b34a03add4a4b80305497694410e9\": container with ID starting with 187ddb95b4aed8be8728e5b1505ceccd045b34a03add4a4b80305497694410e9 not found: ID does not exist"
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.314782 4904 scope.go:117] "RemoveContainer" containerID="909e6d29ffcf7dea47082b5e36524c01b3a9b2228469f62cff1c9bbdc08c289e"
Dec 05 20:17:45 crc kubenswrapper[4904]: E1205 20:17:45.315095 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"909e6d29ffcf7dea47082b5e36524c01b3a9b2228469f62cff1c9bbdc08c289e\": container with ID starting with 909e6d29ffcf7dea47082b5e36524c01b3a9b2228469f62cff1c9bbdc08c289e not found: ID does not exist" containerID="909e6d29ffcf7dea47082b5e36524c01b3a9b2228469f62cff1c9bbdc08c289e"
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.315114 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"909e6d29ffcf7dea47082b5e36524c01b3a9b2228469f62cff1c9bbdc08c289e"} err="failed to get container status \"909e6d29ffcf7dea47082b5e36524c01b3a9b2228469f62cff1c9bbdc08c289e\": rpc error: code = NotFound desc = could not find container \"909e6d29ffcf7dea47082b5e36524c01b3a9b2228469f62cff1c9bbdc08c289e\": container with ID starting with 909e6d29ffcf7dea47082b5e36524c01b3a9b2228469f62cff1c9bbdc08c289e not found: ID does not exist"
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.315126 4904 scope.go:117] "RemoveContainer" containerID="532faad960c4d43db743bace3a04e015fb7889426644934d7c28ae1e46064d97"
Dec 05 20:17:45 crc kubenswrapper[4904]: E1205 20:17:45.315526 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"532faad960c4d43db743bace3a04e015fb7889426644934d7c28ae1e46064d97\": container with ID starting with 532faad960c4d43db743bace3a04e015fb7889426644934d7c28ae1e46064d97 not found: ID does not exist" containerID="532faad960c4d43db743bace3a04e015fb7889426644934d7c28ae1e46064d97"
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.315546 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"532faad960c4d43db743bace3a04e015fb7889426644934d7c28ae1e46064d97"} err="failed to get container status \"532faad960c4d43db743bace3a04e015fb7889426644934d7c28ae1e46064d97\": rpc error: code = NotFound desc = could not find container \"532faad960c4d43db743bace3a04e015fb7889426644934d7c28ae1e46064d97\": container with ID starting with 532faad960c4d43db743bace3a04e015fb7889426644934d7c28ae1e46064d97 not found: ID does not exist"
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.315580 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gpt4n"]
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.323117 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gpt4n"]
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.332439 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5srbq"]
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.335400 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5srbq"]
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.340096 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jrg9c"]
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.344172 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jrg9c"]
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.348490 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vd9gt"]
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.352572 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vd9gt"]
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.691097 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59e498ac-1ef4-4984-956a-f1d7614d5dba" path="/var/lib/kubelet/pods/59e498ac-1ef4-4984-956a-f1d7614d5dba/volumes"
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.692967 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6aa0efa0-bd09-4388-b42c-11550e28712e" path="/var/lib/kubelet/pods/6aa0efa0-bd09-4388-b42c-11550e28712e/volumes"
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.694788 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="757462be-80d7-44c2-a193-48f78ac5b80e" path="/var/lib/kubelet/pods/757462be-80d7-44c2-a193-48f78ac5b80e/volumes"
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.696945 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cff387b-9124-46db-8d06-5cb5839e0a12" path="/var/lib/kubelet/pods/9cff387b-9124-46db-8d06-5cb5839e0a12/volumes"
Dec 05 20:17:45 crc kubenswrapper[4904]: I1205 20:17:45.697835 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5752678-2a85-49a8-b6d4-63b4adb96277" path="/var/lib/kubelet/pods/d5752678-2a85-49a8-b6d4-63b4adb96277/volumes"
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.025089 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-7wzh2"
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.104238 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ffsws"]
Dec 05 20:17:46 crc kubenswrapper[4904]: E1205 20:17:46.104417 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cff387b-9124-46db-8d06-5cb5839e0a12" containerName="extract-utilities"
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.104428 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cff387b-9124-46db-8d06-5cb5839e0a12" containerName="extract-utilities"
Dec 05 20:17:46 crc kubenswrapper[4904]: E1205 20:17:46.104439 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="757462be-80d7-44c2-a193-48f78ac5b80e" containerName="extract-utilities"
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.104445 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="757462be-80d7-44c2-a193-48f78ac5b80e" containerName="extract-utilities"
Dec 05 20:17:46 crc kubenswrapper[4904]: E1205 20:17:46.104451 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aa0efa0-bd09-4388-b42c-11550e28712e" containerName="marketplace-operator"
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.104457 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aa0efa0-bd09-4388-b42c-11550e28712e" containerName="marketplace-operator"
Dec 05 20:17:46 crc kubenswrapper[4904]: E1205 20:17:46.104465 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cff387b-9124-46db-8d06-5cb5839e0a12" containerName="extract-content"
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.104471 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cff387b-9124-46db-8d06-5cb5839e0a12" containerName="extract-content"
Dec 05 20:17:46 crc kubenswrapper[4904]: E1205 20:17:46.104482 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59e498ac-1ef4-4984-956a-f1d7614d5dba" containerName="registry-server"
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.104488 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="59e498ac-1ef4-4984-956a-f1d7614d5dba" containerName="registry-server"
Dec 05 20:17:46 crc kubenswrapper[4904]: E1205 20:17:46.104496 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aa0efa0-bd09-4388-b42c-11550e28712e" containerName="marketplace-operator"
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.104501 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aa0efa0-bd09-4388-b42c-11550e28712e" containerName="marketplace-operator"
Dec 05 20:17:46 crc kubenswrapper[4904]: E1205 20:17:46.104507 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="757462be-80d7-44c2-a193-48f78ac5b80e" containerName="extract-content"
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.104512 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="757462be-80d7-44c2-a193-48f78ac5b80e" containerName="extract-content"
Dec 05 20:17:46 crc kubenswrapper[4904]: E1205 20:17:46.104520 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59e498ac-1ef4-4984-956a-f1d7614d5dba" containerName="extract-content"
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.104525 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="59e498ac-1ef4-4984-956a-f1d7614d5dba" containerName="extract-content"
Dec 05 20:17:46 crc kubenswrapper[4904]: E1205 20:17:46.104532 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cff387b-9124-46db-8d06-5cb5839e0a12" containerName="registry-server"
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.104537 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cff387b-9124-46db-8d06-5cb5839e0a12" containerName="registry-server"
Dec 05 20:17:46 crc kubenswrapper[4904]: E1205 20:17:46.104546 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59e498ac-1ef4-4984-956a-f1d7614d5dba" containerName="extract-utilities"
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.104551 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="59e498ac-1ef4-4984-956a-f1d7614d5dba" containerName="extract-utilities"
Dec 05 20:17:46 crc kubenswrapper[4904]: E1205 20:17:46.104560 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5752678-2a85-49a8-b6d4-63b4adb96277" containerName="extract-content"
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.104566 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5752678-2a85-49a8-b6d4-63b4adb96277" containerName="extract-content"
Dec 05 20:17:46 crc kubenswrapper[4904]: E1205 20:17:46.104575 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="757462be-80d7-44c2-a193-48f78ac5b80e" containerName="registry-server"
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.104580 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="757462be-80d7-44c2-a193-48f78ac5b80e" containerName="registry-server"
Dec 05 20:17:46 crc kubenswrapper[4904]: E1205 20:17:46.104587 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5752678-2a85-49a8-b6d4-63b4adb96277" containerName="extract-utilities"
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.104593 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5752678-2a85-49a8-b6d4-63b4adb96277" containerName="extract-utilities"
Dec 05 20:17:46 crc kubenswrapper[4904]: E1205 20:17:46.104602 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5752678-2a85-49a8-b6d4-63b4adb96277" containerName="registry-server"
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.104609 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5752678-2a85-49a8-b6d4-63b4adb96277" containerName="registry-server"
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.104685 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cff387b-9124-46db-8d06-5cb5839e0a12" containerName="registry-server"
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.104699 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="757462be-80d7-44c2-a193-48f78ac5b80e" containerName="registry-server"
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.104706 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="59e498ac-1ef4-4984-956a-f1d7614d5dba" containerName="registry-server"
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.104713 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="6aa0efa0-bd09-4388-b42c-11550e28712e" containerName="marketplace-operator"
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.104720 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="6aa0efa0-bd09-4388-b42c-11550e28712e" containerName="marketplace-operator"
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.104729 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5752678-2a85-49a8-b6d4-63b4adb96277" containerName="registry-server"
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.105450 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ffsws"
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.107986 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.119941 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ffsws"]
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.233681 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3124ecfc-d81f-468a-89e6-1b76cdf4e61e-catalog-content\") pod \"redhat-marketplace-ffsws\" (UID: \"3124ecfc-d81f-468a-89e6-1b76cdf4e61e\") " pod="openshift-marketplace/redhat-marketplace-ffsws"
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.233836 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3124ecfc-d81f-468a-89e6-1b76cdf4e61e-utilities\") pod \"redhat-marketplace-ffsws\" (UID: \"3124ecfc-d81f-468a-89e6-1b76cdf4e61e\") " pod="openshift-marketplace/redhat-marketplace-ffsws"
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.233933 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4rg8\" (UniqueName: \"kubernetes.io/projected/3124ecfc-d81f-468a-89e6-1b76cdf4e61e-kube-api-access-s4rg8\") pod \"redhat-marketplace-ffsws\" (UID: \"3124ecfc-d81f-468a-89e6-1b76cdf4e61e\") " pod="openshift-marketplace/redhat-marketplace-ffsws"
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.302444 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-957zc"]
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.304015 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-957zc"
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.308713 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.312274 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-957zc"]
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.334999 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4rg8\" (UniqueName: \"kubernetes.io/projected/3124ecfc-d81f-468a-89e6-1b76cdf4e61e-kube-api-access-s4rg8\") pod \"redhat-marketplace-ffsws\" (UID: \"3124ecfc-d81f-468a-89e6-1b76cdf4e61e\") " pod="openshift-marketplace/redhat-marketplace-ffsws"
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.335086 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3124ecfc-d81f-468a-89e6-1b76cdf4e61e-catalog-content\") pod \"redhat-marketplace-ffsws\" (UID: \"3124ecfc-d81f-468a-89e6-1b76cdf4e61e\") " pod="openshift-marketplace/redhat-marketplace-ffsws"
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.335160 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3124ecfc-d81f-468a-89e6-1b76cdf4e61e-utilities\") pod \"redhat-marketplace-ffsws\" (UID: \"3124ecfc-d81f-468a-89e6-1b76cdf4e61e\") " pod="openshift-marketplace/redhat-marketplace-ffsws"
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.336164 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3124ecfc-d81f-468a-89e6-1b76cdf4e61e-utilities\") pod \"redhat-marketplace-ffsws\" (UID: \"3124ecfc-d81f-468a-89e6-1b76cdf4e61e\") " pod="openshift-marketplace/redhat-marketplace-ffsws"
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.336280 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3124ecfc-d81f-468a-89e6-1b76cdf4e61e-catalog-content\") pod \"redhat-marketplace-ffsws\" (UID: \"3124ecfc-d81f-468a-89e6-1b76cdf4e61e\") " pod="openshift-marketplace/redhat-marketplace-ffsws"
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.356002 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4rg8\" (UniqueName: \"kubernetes.io/projected/3124ecfc-d81f-468a-89e6-1b76cdf4e61e-kube-api-access-s4rg8\") pod \"redhat-marketplace-ffsws\" (UID: \"3124ecfc-d81f-468a-89e6-1b76cdf4e61e\") " pod="openshift-marketplace/redhat-marketplace-ffsws"
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.424281 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ffsws"
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.436488 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1995f9a-4194-4318-b9ab-b30b6c01ac51-catalog-content\") pod \"redhat-operators-957zc\" (UID: \"f1995f9a-4194-4318-b9ab-b30b6c01ac51\") " pod="openshift-marketplace/redhat-operators-957zc"
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.436532 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1995f9a-4194-4318-b9ab-b30b6c01ac51-utilities\") pod \"redhat-operators-957zc\" (UID: \"f1995f9a-4194-4318-b9ab-b30b6c01ac51\") " pod="openshift-marketplace/redhat-operators-957zc"
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.436572 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5ghn\" (UniqueName: \"kubernetes.io/projected/f1995f9a-4194-4318-b9ab-b30b6c01ac51-kube-api-access-f5ghn\") pod \"redhat-operators-957zc\" (UID: \"f1995f9a-4194-4318-b9ab-b30b6c01ac51\") " pod="openshift-marketplace/redhat-operators-957zc"
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.538221 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1995f9a-4194-4318-b9ab-b30b6c01ac51-utilities\") pod \"redhat-operators-957zc\" (UID: \"f1995f9a-4194-4318-b9ab-b30b6c01ac51\") " pod="openshift-marketplace/redhat-operators-957zc"
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.538594 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5ghn\" (UniqueName: \"kubernetes.io/projected/f1995f9a-4194-4318-b9ab-b30b6c01ac51-kube-api-access-f5ghn\") pod \"redhat-operators-957zc\" (UID: \"f1995f9a-4194-4318-b9ab-b30b6c01ac51\") " pod="openshift-marketplace/redhat-operators-957zc"
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.538666 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1995f9a-4194-4318-b9ab-b30b6c01ac51-catalog-content\") pod \"redhat-operators-957zc\" (UID: \"f1995f9a-4194-4318-b9ab-b30b6c01ac51\") " pod="openshift-marketplace/redhat-operators-957zc"
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.538846 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1995f9a-4194-4318-b9ab-b30b6c01ac51-utilities\") pod \"redhat-operators-957zc\" (UID: \"f1995f9a-4194-4318-b9ab-b30b6c01ac51\") " pod="openshift-marketplace/redhat-operators-957zc"
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.539099 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1995f9a-4194-4318-b9ab-b30b6c01ac51-catalog-content\") pod \"redhat-operators-957zc\" (UID: \"f1995f9a-4194-4318-b9ab-b30b6c01ac51\") " pod="openshift-marketplace/redhat-operators-957zc"
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.556146 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5ghn\" (UniqueName: \"kubernetes.io/projected/f1995f9a-4194-4318-b9ab-b30b6c01ac51-kube-api-access-f5ghn\") pod \"redhat-operators-957zc\" (UID: \"f1995f9a-4194-4318-b9ab-b30b6c01ac51\") " pod="openshift-marketplace/redhat-operators-957zc"
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.617867 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ffsws"]
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.623312 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-957zc"
Dec 05 20:17:46 crc kubenswrapper[4904]: I1205 20:17:46.814510 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-957zc"]
Dec 05 20:17:46 crc kubenswrapper[4904]: W1205 20:17:46.853901 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1995f9a_4194_4318_b9ab_b30b6c01ac51.slice/crio-cda2c038616e9546d9570941c2e1352c826ed2a0c60461106e1243540eccb1a0 WatchSource:0}: Error finding container cda2c038616e9546d9570941c2e1352c826ed2a0c60461106e1243540eccb1a0: Status 404 returned error can't find the container with id cda2c038616e9546d9570941c2e1352c826ed2a0c60461106e1243540eccb1a0
Dec 05 20:17:47 crc kubenswrapper[4904]: I1205 20:17:47.026441 4904 generic.go:334] "Generic (PLEG): container finished" podID="f1995f9a-4194-4318-b9ab-b30b6c01ac51" containerID="3d1b0dca60fd4185d1ab9116f01eb4ec65bf7679f9001c919c6455c0797e1d9c" exitCode=0
Dec 05 20:17:47 crc kubenswrapper[4904]: I1205 20:17:47.026493 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-957zc" event={"ID":"f1995f9a-4194-4318-b9ab-b30b6c01ac51","Type":"ContainerDied","Data":"3d1b0dca60fd4185d1ab9116f01eb4ec65bf7679f9001c919c6455c0797e1d9c"}
Dec 05 20:17:47 crc kubenswrapper[4904]: I1205 20:17:47.026552 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-957zc" event={"ID":"f1995f9a-4194-4318-b9ab-b30b6c01ac51","Type":"ContainerStarted","Data":"cda2c038616e9546d9570941c2e1352c826ed2a0c60461106e1243540eccb1a0"}
Dec 05 20:17:47 crc kubenswrapper[4904]: I1205 20:17:47.028560 4904 generic.go:334] "Generic (PLEG): container finished" podID="3124ecfc-d81f-468a-89e6-1b76cdf4e61e" containerID="abc6683c7fe27fe5406777be4052a5e457e0c6cd3c215aba5d7f8f6cab63da81" exitCode=0
Dec 05 20:17:47 crc kubenswrapper[4904]: I1205 20:17:47.029665 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ffsws" event={"ID":"3124ecfc-d81f-468a-89e6-1b76cdf4e61e","Type":"ContainerDied","Data":"abc6683c7fe27fe5406777be4052a5e457e0c6cd3c215aba5d7f8f6cab63da81"}
Dec 05 20:17:47 crc kubenswrapper[4904]: I1205 20:17:47.029692 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ffsws" event={"ID":"3124ecfc-d81f-468a-89e6-1b76cdf4e61e","Type":"ContainerStarted","Data":"5272a5b78f1fc447dc5e3c35de81520a8e31e8856cac7820e37a43cafb3166bc"}
Dec 05 20:17:48 crc kubenswrapper[4904]: I1205 20:17:48.034851 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ffsws" event={"ID":"3124ecfc-d81f-468a-89e6-1b76cdf4e61e","Type":"ContainerStarted","Data":"ac974085ff2ad917b0c2c509c54883de52aa3455bccb4606149e707e603bb40d"}
Dec 05 20:17:48 crc kubenswrapper[4904]: I1205 20:17:48.039225 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-957zc" event={"ID":"f1995f9a-4194-4318-b9ab-b30b6c01ac51","Type":"ContainerStarted","Data":"cbaaacf9f1e695c23411345ab2a472cead26ebc5df0128f3ca68049fdf37e5ac"}
Dec 05 20:17:48 crc kubenswrapper[4904]: I1205 20:17:48.504541 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rqt9x"]
Dec 05 20:17:48 crc kubenswrapper[4904]: I1205 20:17:48.506146 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rqt9x"
Dec 05 20:17:48 crc kubenswrapper[4904]: I1205 20:17:48.508195 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Dec 05 20:17:48 crc kubenswrapper[4904]: I1205 20:17:48.513147 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rqt9x"]
Dec 05 20:17:48 crc kubenswrapper[4904]: I1205 20:17:48.676793 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzgs2\" (UniqueName: \"kubernetes.io/projected/e9e46cf7-6d81-4d8f-8fa8-d56a6555272f-kube-api-access-kzgs2\") pod \"certified-operators-rqt9x\" (UID: \"e9e46cf7-6d81-4d8f-8fa8-d56a6555272f\") " pod="openshift-marketplace/certified-operators-rqt9x"
Dec 05 20:17:48 crc kubenswrapper[4904]: I1205 20:17:48.676855 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9e46cf7-6d81-4d8f-8fa8-d56a6555272f-utilities\") pod \"certified-operators-rqt9x\" (UID: \"e9e46cf7-6d81-4d8f-8fa8-d56a6555272f\") " pod="openshift-marketplace/certified-operators-rqt9x"
Dec 05 20:17:48 crc kubenswrapper[4904]: I1205 20:17:48.676982 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9e46cf7-6d81-4d8f-8fa8-d56a6555272f-catalog-content\") pod \"certified-operators-rqt9x\" (UID: \"e9e46cf7-6d81-4d8f-8fa8-d56a6555272f\") " pod="openshift-marketplace/certified-operators-rqt9x"
Dec 05 20:17:48 crc kubenswrapper[4904]: I1205 20:17:48.702459 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fhhfn"]
Dec 05 20:17:48 crc kubenswrapper[4904]: I1205 20:17:48.703491 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fhhfn" Dec 05 20:17:48 crc kubenswrapper[4904]: I1205 20:17:48.706718 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 05 20:17:48 crc kubenswrapper[4904]: I1205 20:17:48.710862 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fhhfn"] Dec 05 20:17:48 crc kubenswrapper[4904]: I1205 20:17:48.778687 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9e46cf7-6d81-4d8f-8fa8-d56a6555272f-catalog-content\") pod \"certified-operators-rqt9x\" (UID: \"e9e46cf7-6d81-4d8f-8fa8-d56a6555272f\") " pod="openshift-marketplace/certified-operators-rqt9x" Dec 05 20:17:48 crc kubenswrapper[4904]: I1205 20:17:48.778739 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzgs2\" (UniqueName: \"kubernetes.io/projected/e9e46cf7-6d81-4d8f-8fa8-d56a6555272f-kube-api-access-kzgs2\") pod \"certified-operators-rqt9x\" (UID: \"e9e46cf7-6d81-4d8f-8fa8-d56a6555272f\") " pod="openshift-marketplace/certified-operators-rqt9x" Dec 05 20:17:48 crc kubenswrapper[4904]: I1205 20:17:48.778784 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9e46cf7-6d81-4d8f-8fa8-d56a6555272f-utilities\") pod \"certified-operators-rqt9x\" (UID: \"e9e46cf7-6d81-4d8f-8fa8-d56a6555272f\") " pod="openshift-marketplace/certified-operators-rqt9x" Dec 05 20:17:48 crc kubenswrapper[4904]: I1205 20:17:48.779280 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9e46cf7-6d81-4d8f-8fa8-d56a6555272f-utilities\") pod \"certified-operators-rqt9x\" (UID: \"e9e46cf7-6d81-4d8f-8fa8-d56a6555272f\") " pod="openshift-marketplace/certified-operators-rqt9x" Dec 05 20:17:48 crc kubenswrapper[4904]: I1205 20:17:48.779305 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9e46cf7-6d81-4d8f-8fa8-d56a6555272f-catalog-content\") pod \"certified-operators-rqt9x\" (UID: \"e9e46cf7-6d81-4d8f-8fa8-d56a6555272f\") " pod="openshift-marketplace/certified-operators-rqt9x" Dec 05 20:17:48 crc kubenswrapper[4904]: I1205 20:17:48.805788 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzgs2\" (UniqueName: \"kubernetes.io/projected/e9e46cf7-6d81-4d8f-8fa8-d56a6555272f-kube-api-access-kzgs2\") pod \"certified-operators-rqt9x\" (UID: \"e9e46cf7-6d81-4d8f-8fa8-d56a6555272f\") " pod="openshift-marketplace/certified-operators-rqt9x" Dec 05 20:17:48 crc kubenswrapper[4904]: I1205 20:17:48.838812 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rqt9x" Dec 05 20:17:48 crc kubenswrapper[4904]: I1205 20:17:48.879595 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81188cfd-eef2-4e0b-b04a-fd189da456d2-catalog-content\") pod \"community-operators-fhhfn\" (UID: \"81188cfd-eef2-4e0b-b04a-fd189da456d2\") " pod="openshift-marketplace/community-operators-fhhfn" Dec 05 20:17:48 crc kubenswrapper[4904]: I1205 20:17:48.879701 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb9lx\" (UniqueName: \"kubernetes.io/projected/81188cfd-eef2-4e0b-b04a-fd189da456d2-kube-api-access-sb9lx\") pod \"community-operators-fhhfn\" (UID: \"81188cfd-eef2-4e0b-b04a-fd189da456d2\") " pod="openshift-marketplace/community-operators-fhhfn" Dec 05 20:17:48 crc kubenswrapper[4904]: I1205 20:17:48.879752 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81188cfd-eef2-4e0b-b04a-fd189da456d2-utilities\") pod \"community-operators-fhhfn\" (UID: \"81188cfd-eef2-4e0b-b04a-fd189da456d2\") " pod="openshift-marketplace/community-operators-fhhfn" Dec 05 20:17:48 crc kubenswrapper[4904]: I1205 20:17:48.981030 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81188cfd-eef2-4e0b-b04a-fd189da456d2-catalog-content\") pod \"community-operators-fhhfn\" (UID: \"81188cfd-eef2-4e0b-b04a-fd189da456d2\") " pod="openshift-marketplace/community-operators-fhhfn" Dec 05 20:17:48 crc kubenswrapper[4904]: I1205 20:17:48.981374 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sb9lx\" (UniqueName: \"kubernetes.io/projected/81188cfd-eef2-4e0b-b04a-fd189da456d2-kube-api-access-sb9lx\") pod \"community-operators-fhhfn\" (UID: \"81188cfd-eef2-4e0b-b04a-fd189da456d2\") " pod="openshift-marketplace/community-operators-fhhfn" Dec 05 20:17:48 crc kubenswrapper[4904]: I1205 20:17:48.981414 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81188cfd-eef2-4e0b-b04a-fd189da456d2-utilities\") pod \"community-operators-fhhfn\" (UID: \"81188cfd-eef2-4e0b-b04a-fd189da456d2\") " pod="openshift-marketplace/community-operators-fhhfn" Dec 05 20:17:48 crc kubenswrapper[4904]: I1205 20:17:48.981603 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81188cfd-eef2-4e0b-b04a-fd189da456d2-catalog-content\") pod \"community-operators-fhhfn\" (UID: \"81188cfd-eef2-4e0b-b04a-fd189da456d2\") " pod="openshift-marketplace/community-operators-fhhfn" Dec 05 20:17:48 crc kubenswrapper[4904]: I1205 20:17:48.981735 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81188cfd-eef2-4e0b-b04a-fd189da456d2-utilities\") pod \"community-operators-fhhfn\" (UID: \"81188cfd-eef2-4e0b-b04a-fd189da456d2\") " pod="openshift-marketplace/community-operators-fhhfn" Dec 05 20:17:49 crc kubenswrapper[4904]: I1205 20:17:49.000021 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb9lx\" (UniqueName: \"kubernetes.io/projected/81188cfd-eef2-4e0b-b04a-fd189da456d2-kube-api-access-sb9lx\") pod 
\"community-operators-fhhfn\" (UID: \"81188cfd-eef2-4e0b-b04a-fd189da456d2\") " pod="openshift-marketplace/community-operators-fhhfn" Dec 05 20:17:49 crc kubenswrapper[4904]: I1205 20:17:49.014948 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rqt9x"] Dec 05 20:17:49 crc kubenswrapper[4904]: I1205 20:17:49.020837 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fhhfn" Dec 05 20:17:49 crc kubenswrapper[4904]: I1205 20:17:49.047220 4904 generic.go:334] "Generic (PLEG): container finished" podID="f1995f9a-4194-4318-b9ab-b30b6c01ac51" containerID="cbaaacf9f1e695c23411345ab2a472cead26ebc5df0128f3ca68049fdf37e5ac" exitCode=0 Dec 05 20:17:49 crc kubenswrapper[4904]: I1205 20:17:49.047329 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-957zc" event={"ID":"f1995f9a-4194-4318-b9ab-b30b6c01ac51","Type":"ContainerDied","Data":"cbaaacf9f1e695c23411345ab2a472cead26ebc5df0128f3ca68049fdf37e5ac"} Dec 05 20:17:49 crc kubenswrapper[4904]: I1205 20:17:49.048563 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqt9x" event={"ID":"e9e46cf7-6d81-4d8f-8fa8-d56a6555272f","Type":"ContainerStarted","Data":"54b543d9da79a617f13a72e47319845f5956637d2eb90fd8f01b4fe8b7ca78c3"} Dec 05 20:17:49 crc kubenswrapper[4904]: I1205 20:17:49.052426 4904 generic.go:334] "Generic (PLEG): container finished" podID="3124ecfc-d81f-468a-89e6-1b76cdf4e61e" containerID="ac974085ff2ad917b0c2c509c54883de52aa3455bccb4606149e707e603bb40d" exitCode=0 Dec 05 20:17:49 crc kubenswrapper[4904]: I1205 20:17:49.052489 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ffsws" event={"ID":"3124ecfc-d81f-468a-89e6-1b76cdf4e61e","Type":"ContainerDied","Data":"ac974085ff2ad917b0c2c509c54883de52aa3455bccb4606149e707e603bb40d"} Dec 05 20:17:49 crc kubenswrapper[4904]: I1205 20:17:49.231663 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fhhfn"] Dec 05 20:17:49 crc kubenswrapper[4904]: W1205 20:17:49.236188 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81188cfd_eef2_4e0b_b04a_fd189da456d2.slice/crio-f6c84b3589429bf2a0c677c700bcb6ea850c5e88c69307712f96ca83379600dd WatchSource:0}: Error finding container f6c84b3589429bf2a0c677c700bcb6ea850c5e88c69307712f96ca83379600dd: Status 404 returned error can't find the container with id f6c84b3589429bf2a0c677c700bcb6ea850c5e88c69307712f96ca83379600dd Dec 05 20:17:50 crc kubenswrapper[4904]: I1205 20:17:50.059882 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ffsws" event={"ID":"3124ecfc-d81f-468a-89e6-1b76cdf4e61e","Type":"ContainerStarted","Data":"9de7ef966869cd464f4268dac329169a16b92f9d91372844364fc8ce8c34d7af"} Dec 05 20:17:50 crc kubenswrapper[4904]: I1205 20:17:50.062568 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-957zc" event={"ID":"f1995f9a-4194-4318-b9ab-b30b6c01ac51","Type":"ContainerStarted","Data":"9d580293ba75360afa63b5ed25db35542a2b0277bffc4d5d61d770234a5b6ae0"} Dec 05 20:17:50 crc kubenswrapper[4904]: I1205 20:17:50.064567 4904 generic.go:334] "Generic (PLEG): container finished" podID="81188cfd-eef2-4e0b-b04a-fd189da456d2" 
containerID="f7126acd14e2587bc4cbfd41ca0cb227790373073822ed045164d4d00ab30df4" exitCode=0 Dec 05 20:17:50 crc kubenswrapper[4904]: I1205 20:17:50.064613 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhhfn" event={"ID":"81188cfd-eef2-4e0b-b04a-fd189da456d2","Type":"ContainerDied","Data":"f7126acd14e2587bc4cbfd41ca0cb227790373073822ed045164d4d00ab30df4"} Dec 05 20:17:50 crc kubenswrapper[4904]: I1205 20:17:50.064629 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhhfn" event={"ID":"81188cfd-eef2-4e0b-b04a-fd189da456d2","Type":"ContainerStarted","Data":"f6c84b3589429bf2a0c677c700bcb6ea850c5e88c69307712f96ca83379600dd"} Dec 05 20:17:50 crc kubenswrapper[4904]: I1205 20:17:50.066864 4904 generic.go:334] "Generic (PLEG): container finished" podID="e9e46cf7-6d81-4d8f-8fa8-d56a6555272f" containerID="afdd49f1e531d44b338fab66b0dce97ee2deae7160fb65606c98aead9171d6f8" exitCode=0 Dec 05 20:17:50 crc kubenswrapper[4904]: I1205 20:17:50.066896 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqt9x" event={"ID":"e9e46cf7-6d81-4d8f-8fa8-d56a6555272f","Type":"ContainerDied","Data":"afdd49f1e531d44b338fab66b0dce97ee2deae7160fb65606c98aead9171d6f8"} Dec 05 20:17:50 crc kubenswrapper[4904]: I1205 20:17:50.076186 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ffsws" podStartSLOduration=1.626725327 podStartE2EDuration="4.076156121s" podCreationTimestamp="2025-12-05 20:17:46 +0000 UTC" firstStartedPulling="2025-12-05 20:17:47.030371531 +0000 UTC m=+365.841587640" lastFinishedPulling="2025-12-05 20:17:49.479802325 +0000 UTC m=+368.291018434" observedRunningTime="2025-12-05 20:17:50.075191414 +0000 UTC m=+368.886407543" watchObservedRunningTime="2025-12-05 20:17:50.076156121 +0000 UTC m=+368.887372230" Dec 05 20:17:50 crc kubenswrapper[4904]: I1205 20:17:50.137763 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-957zc" podStartSLOduration=1.7497154190000002 podStartE2EDuration="4.137738045s" podCreationTimestamp="2025-12-05 20:17:46 +0000 UTC" firstStartedPulling="2025-12-05 20:17:47.027743618 +0000 UTC m=+365.838959727" lastFinishedPulling="2025-12-05 20:17:49.415766234 +0000 UTC m=+368.226982353" observedRunningTime="2025-12-05 20:17:50.133022634 +0000 UTC m=+368.944238743" watchObservedRunningTime="2025-12-05 20:17:50.137738045 +0000 UTC m=+368.948954154" Dec 05 20:17:51 crc kubenswrapper[4904]: I1205 20:17:51.084450 4904 generic.go:334] "Generic (PLEG): container finished" podID="e9e46cf7-6d81-4d8f-8fa8-d56a6555272f" containerID="44a54eadb952c71b1b3f2cab8cbd8cd95f2c49b11d4e46810c3a82f0ae6707a0" exitCode=0 Dec 05 20:17:51 crc kubenswrapper[4904]: I1205 20:17:51.084615 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqt9x" event={"ID":"e9e46cf7-6d81-4d8f-8fa8-d56a6555272f","Type":"ContainerDied","Data":"44a54eadb952c71b1b3f2cab8cbd8cd95f2c49b11d4e46810c3a82f0ae6707a0"} Dec 05 20:17:51 crc kubenswrapper[4904]: I1205 20:17:51.089872 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhhfn" event={"ID":"81188cfd-eef2-4e0b-b04a-fd189da456d2","Type":"ContainerStarted","Data":"6b3075e8111b1207ef59b3251a718450cbbba71ee988f00e8eb3478d5cf33947"} Dec 05 20:17:52 crc kubenswrapper[4904]: I1205 20:17:52.105381 4904 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqt9x" event={"ID":"e9e46cf7-6d81-4d8f-8fa8-d56a6555272f","Type":"ContainerStarted","Data":"5bd32f9bfa163e888187fe00aa1835fd8bdfeceb814bc551033da444839bd1a7"} Dec 05 20:17:52 crc kubenswrapper[4904]: I1205 20:17:52.108177 4904 generic.go:334] "Generic (PLEG): container finished" podID="81188cfd-eef2-4e0b-b04a-fd189da456d2" containerID="6b3075e8111b1207ef59b3251a718450cbbba71ee988f00e8eb3478d5cf33947" exitCode=0 Dec 05 20:17:52 crc kubenswrapper[4904]: I1205 20:17:52.108221 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhhfn" event={"ID":"81188cfd-eef2-4e0b-b04a-fd189da456d2","Type":"ContainerDied","Data":"6b3075e8111b1207ef59b3251a718450cbbba71ee988f00e8eb3478d5cf33947"} Dec 05 20:17:52 crc kubenswrapper[4904]: I1205 20:17:52.158594 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rqt9x" podStartSLOduration=2.770726314 podStartE2EDuration="4.158574463s" podCreationTimestamp="2025-12-05 20:17:48 +0000 UTC" firstStartedPulling="2025-12-05 20:17:50.067997485 +0000 UTC m=+368.879213594" lastFinishedPulling="2025-12-05 20:17:51.455845634 +0000 UTC m=+370.267061743" observedRunningTime="2025-12-05 20:17:52.134138787 +0000 UTC m=+370.945354906" watchObservedRunningTime="2025-12-05 20:17:52.158574463 +0000 UTC m=+370.969790572" Dec 05 20:17:55 crc kubenswrapper[4904]: I1205 20:17:55.128199 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhhfn" event={"ID":"81188cfd-eef2-4e0b-b04a-fd189da456d2","Type":"ContainerStarted","Data":"bd4ba2c9675a0a93332237556882a481568f603b14f521636fef7a00dd714b0e"} Dec 05 20:17:55 crc kubenswrapper[4904]: I1205 20:17:55.149030 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fhhfn" podStartSLOduration=4.2809341960000005 podStartE2EDuration="7.149013721s" podCreationTimestamp="2025-12-05 20:17:48 +0000 UTC" firstStartedPulling="2025-12-05 20:17:50.066162774 +0000 UTC m=+368.877378883" lastFinishedPulling="2025-12-05 20:17:52.934242299 +0000 UTC m=+371.745458408" observedRunningTime="2025-12-05 20:17:55.146931674 +0000 UTC m=+373.958147793" watchObservedRunningTime="2025-12-05 20:17:55.149013721 +0000 UTC m=+373.960229830" Dec 05 20:17:55 crc kubenswrapper[4904]: I1205 20:17:55.546433 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" podUID="5b1664bf-b83a-4582-8018-ec55e02a4068" containerName="registry" containerID="cri-o://ef1306e7a14639af4066725bcb1ce4b29ac856b2f8eed4f6836c058ab1a31612" gracePeriod=30 Dec 05 20:17:55 crc kubenswrapper[4904]: I1205 20:17:55.863676 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:17:55 crc kubenswrapper[4904]: I1205 20:17:55.971266 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5b1664bf-b83a-4582-8018-ec55e02a4068-registry-certificates\") pod \"5b1664bf-b83a-4582-8018-ec55e02a4068\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " Dec 05 20:17:55 crc kubenswrapper[4904]: I1205 20:17:55.971322 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5b1664bf-b83a-4582-8018-ec55e02a4068-ca-trust-extracted\") pod \"5b1664bf-b83a-4582-8018-ec55e02a4068\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " Dec 05 20:17:55 crc kubenswrapper[4904]: I1205 20:17:55.971492 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"5b1664bf-b83a-4582-8018-ec55e02a4068\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " Dec 05 20:17:55 crc kubenswrapper[4904]: I1205 20:17:55.971635 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5b1664bf-b83a-4582-8018-ec55e02a4068-installation-pull-secrets\") pod \"5b1664bf-b83a-4582-8018-ec55e02a4068\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " Dec 05 20:17:55 crc kubenswrapper[4904]: I1205 20:17:55.971672 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5b1664bf-b83a-4582-8018-ec55e02a4068-trusted-ca\") pod \"5b1664bf-b83a-4582-8018-ec55e02a4068\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " Dec 05 20:17:55 crc kubenswrapper[4904]: I1205 20:17:55.971693 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5b1664bf-b83a-4582-8018-ec55e02a4068-bound-sa-token\") pod \"5b1664bf-b83a-4582-8018-ec55e02a4068\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " Dec 05 20:17:55 crc kubenswrapper[4904]: I1205 20:17:55.971721 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7wqc\" (UniqueName: \"kubernetes.io/projected/5b1664bf-b83a-4582-8018-ec55e02a4068-kube-api-access-s7wqc\") pod \"5b1664bf-b83a-4582-8018-ec55e02a4068\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " Dec 05 20:17:55 crc kubenswrapper[4904]: I1205 20:17:55.971756 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5b1664bf-b83a-4582-8018-ec55e02a4068-registry-tls\") pod \"5b1664bf-b83a-4582-8018-ec55e02a4068\" (UID: \"5b1664bf-b83a-4582-8018-ec55e02a4068\") " Dec 05 20:17:55 crc kubenswrapper[4904]: I1205 20:17:55.972442 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b1664bf-b83a-4582-8018-ec55e02a4068-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "5b1664bf-b83a-4582-8018-ec55e02a4068" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:17:55 crc kubenswrapper[4904]: I1205 20:17:55.973085 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b1664bf-b83a-4582-8018-ec55e02a4068-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "5b1664bf-b83a-4582-8018-ec55e02a4068" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:17:55 crc kubenswrapper[4904]: I1205 20:17:55.978131 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b1664bf-b83a-4582-8018-ec55e02a4068-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "5b1664bf-b83a-4582-8018-ec55e02a4068" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:17:55 crc kubenswrapper[4904]: I1205 20:17:55.978355 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b1664bf-b83a-4582-8018-ec55e02a4068-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "5b1664bf-b83a-4582-8018-ec55e02a4068" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:17:55 crc kubenswrapper[4904]: I1205 20:17:55.978469 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b1664bf-b83a-4582-8018-ec55e02a4068-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "5b1664bf-b83a-4582-8018-ec55e02a4068" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:17:55 crc kubenswrapper[4904]: I1205 20:17:55.978889 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b1664bf-b83a-4582-8018-ec55e02a4068-kube-api-access-s7wqc" (OuterVolumeSpecName: "kube-api-access-s7wqc") pod "5b1664bf-b83a-4582-8018-ec55e02a4068" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068"). InnerVolumeSpecName "kube-api-access-s7wqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:17:55 crc kubenswrapper[4904]: I1205 20:17:55.987368 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b1664bf-b83a-4582-8018-ec55e02a4068-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "5b1664bf-b83a-4582-8018-ec55e02a4068" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:17:55 crc kubenswrapper[4904]: I1205 20:17:55.988297 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "5b1664bf-b83a-4582-8018-ec55e02a4068" (UID: "5b1664bf-b83a-4582-8018-ec55e02a4068"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 05 20:17:56 crc kubenswrapper[4904]: I1205 20:17:56.073333 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7wqc\" (UniqueName: \"kubernetes.io/projected/5b1664bf-b83a-4582-8018-ec55e02a4068-kube-api-access-s7wqc\") on node \"crc\" DevicePath \"\"" Dec 05 20:17:56 crc kubenswrapper[4904]: I1205 20:17:56.073400 4904 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5b1664bf-b83a-4582-8018-ec55e02a4068-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:17:56 crc kubenswrapper[4904]: I1205 20:17:56.073416 4904 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5b1664bf-b83a-4582-8018-ec55e02a4068-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 05 20:17:56 crc kubenswrapper[4904]: I1205 20:17:56.073429 4904 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5b1664bf-b83a-4582-8018-ec55e02a4068-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 05 20:17:56 crc kubenswrapper[4904]: I1205 20:17:56.073439 4904 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5b1664bf-b83a-4582-8018-ec55e02a4068-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 05 20:17:56 crc kubenswrapper[4904]: I1205 20:17:56.073449 4904 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5b1664bf-b83a-4582-8018-ec55e02a4068-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:17:56 crc kubenswrapper[4904]: I1205 20:17:56.073458 4904 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5b1664bf-b83a-4582-8018-ec55e02a4068-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 05 20:17:56 crc kubenswrapper[4904]: I1205 20:17:56.142657 4904 generic.go:334] "Generic (PLEG): container finished" podID="5b1664bf-b83a-4582-8018-ec55e02a4068" containerID="ef1306e7a14639af4066725bcb1ce4b29ac856b2f8eed4f6836c058ab1a31612" exitCode=0 Dec 05 20:17:56 crc kubenswrapper[4904]: I1205 20:17:56.142739 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" event={"ID":"5b1664bf-b83a-4582-8018-ec55e02a4068","Type":"ContainerDied","Data":"ef1306e7a14639af4066725bcb1ce4b29ac856b2f8eed4f6836c058ab1a31612"} Dec 05 20:17:56 crc kubenswrapper[4904]: I1205 20:17:56.142764 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" Dec 05 20:17:56 crc kubenswrapper[4904]: I1205 20:17:56.142784 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-hdk86" event={"ID":"5b1664bf-b83a-4582-8018-ec55e02a4068","Type":"ContainerDied","Data":"36bf9e89dcf50cea05451a5e50f6b0416e5b74037ca069b26237b7aa023a5e7b"} Dec 05 20:17:56 crc kubenswrapper[4904]: I1205 20:17:56.142801 4904 scope.go:117] "RemoveContainer" containerID="ef1306e7a14639af4066725bcb1ce4b29ac856b2f8eed4f6836c058ab1a31612" Dec 05 20:17:56 crc kubenswrapper[4904]: I1205 20:17:56.161973 4904 scope.go:117] "RemoveContainer" containerID="ef1306e7a14639af4066725bcb1ce4b29ac856b2f8eed4f6836c058ab1a31612" Dec 05 20:17:56 crc kubenswrapper[4904]: E1205 20:17:56.162450 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef1306e7a14639af4066725bcb1ce4b29ac856b2f8eed4f6836c058ab1a31612\": container with ID starting with ef1306e7a14639af4066725bcb1ce4b29ac856b2f8eed4f6836c058ab1a31612 not found: ID does not exist" containerID="ef1306e7a14639af4066725bcb1ce4b29ac856b2f8eed4f6836c058ab1a31612" Dec 05 20:17:56 crc kubenswrapper[4904]: I1205 20:17:56.162474 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef1306e7a14639af4066725bcb1ce4b29ac856b2f8eed4f6836c058ab1a31612"} err="failed to get container status \"ef1306e7a14639af4066725bcb1ce4b29ac856b2f8eed4f6836c058ab1a31612\": rpc error: code = NotFound desc = could not find container \"ef1306e7a14639af4066725bcb1ce4b29ac856b2f8eed4f6836c058ab1a31612\": container with ID starting with ef1306e7a14639af4066725bcb1ce4b29ac856b2f8eed4f6836c058ab1a31612 not found: ID does not exist" Dec 05 20:17:56 crc kubenswrapper[4904]: I1205 20:17:56.172092 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hdk86"] Dec 05 20:17:56 crc kubenswrapper[4904]: I1205 20:17:56.175679 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hdk86"] Dec 05 20:17:56 crc kubenswrapper[4904]: I1205 20:17:56.424420 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ffsws" Dec 05 20:17:56 crc kubenswrapper[4904]: I1205 20:17:56.424464 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ffsws" Dec 05 20:17:56 crc kubenswrapper[4904]: I1205 20:17:56.461627 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ffsws" Dec 05 20:17:56 crc kubenswrapper[4904]: I1205 20:17:56.624777 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-957zc" Dec 05 20:17:56 crc kubenswrapper[4904]: I1205 20:17:56.624823 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-957zc" Dec 05 20:17:56 crc kubenswrapper[4904]: I1205 20:17:56.679686 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-957zc" Dec 05 20:17:57 crc kubenswrapper[4904]: I1205 20:17:57.185457 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-957zc" Dec 05 20:17:57 crc kubenswrapper[4904]: 
I1205 20:17:57.185880 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ffsws" Dec 05 20:17:57 crc kubenswrapper[4904]: I1205 20:17:57.688154 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b1664bf-b83a-4582-8018-ec55e02a4068" path="/var/lib/kubelet/pods/5b1664bf-b83a-4582-8018-ec55e02a4068/volumes" Dec 05 20:17:58 crc kubenswrapper[4904]: I1205 20:17:58.839177 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rqt9x" Dec 05 20:17:58 crc kubenswrapper[4904]: I1205 20:17:58.839235 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rqt9x" Dec 05 20:17:58 crc kubenswrapper[4904]: I1205 20:17:58.887883 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rqt9x" Dec 05 20:17:59 crc kubenswrapper[4904]: I1205 20:17:59.022254 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fhhfn" Dec 05 20:17:59 crc kubenswrapper[4904]: I1205 20:17:59.022303 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fhhfn" Dec 05 20:17:59 crc kubenswrapper[4904]: I1205 20:17:59.076346 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fhhfn" Dec 05 20:17:59 crc kubenswrapper[4904]: I1205 20:17:59.198613 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fhhfn" Dec 05 20:17:59 crc kubenswrapper[4904]: I1205 20:17:59.205622 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rqt9x" Dec 05 20:17:59 crc kubenswrapper[4904]: I1205 20:17:59.956250 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:17:59 crc kubenswrapper[4904]: I1205 20:17:59.956329 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:18:05 crc kubenswrapper[4904]: E1205 20:18:05.952732 4904 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/rpm-ostreed.service\": RecentStats: unable to find data in memory cache]" Dec 05 20:18:29 crc kubenswrapper[4904]: I1205 20:18:29.955506 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:18:29 crc kubenswrapper[4904]: I1205 20:18:29.956133 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:18:59 crc kubenswrapper[4904]: I1205 20:18:59.956122 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:18:59 crc kubenswrapper[4904]: I1205 20:18:59.956686 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:18:59 crc kubenswrapper[4904]: I1205 20:18:59.956735 4904 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" Dec 05 20:18:59 crc kubenswrapper[4904]: I1205 20:18:59.957364 4904 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"124e7b18b562ea9691adfe9edd2ebe0554085fb149cd56e77b617510a754d1be"} pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 20:18:59 crc kubenswrapper[4904]: I1205 20:18:59.957417 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" containerID="cri-o://124e7b18b562ea9691adfe9edd2ebe0554085fb149cd56e77b617510a754d1be" gracePeriod=600 Dec 05 20:19:00 crc kubenswrapper[4904]: I1205 20:19:00.520732 4904 generic.go:334] "Generic (PLEG): container finished" podID="1cc24b64-e25f-4b55-9123-295388685e7a" containerID="124e7b18b562ea9691adfe9edd2ebe0554085fb149cd56e77b617510a754d1be" exitCode=0 Dec 05 20:19:00 crc kubenswrapper[4904]: I1205 20:19:00.520793 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" event={"ID":"1cc24b64-e25f-4b55-9123-295388685e7a","Type":"ContainerDied","Data":"124e7b18b562ea9691adfe9edd2ebe0554085fb149cd56e77b617510a754d1be"} Dec 05 20:19:00 crc kubenswrapper[4904]: I1205 20:19:00.521137 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" event={"ID":"1cc24b64-e25f-4b55-9123-295388685e7a","Type":"ContainerStarted","Data":"4acc0855bb68cb8c598e7e35234dd87f785f666d2488ee58dffc84313152e984"} Dec 05 20:19:00 crc kubenswrapper[4904]: I1205 20:19:00.521180 4904 scope.go:117] "RemoveContainer" containerID="08368507f1a5c9ee1b93c9a2b111939cbe36a99f87758d0de78b146309871438" Dec 05 20:21:29 crc kubenswrapper[4904]: I1205 20:21:29.955487 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:21:29 crc kubenswrapper[4904]: I1205 20:21:29.956091 4904 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:21:59 crc kubenswrapper[4904]: I1205 20:21:59.956706 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:21:59 crc kubenswrapper[4904]: I1205 20:21:59.957334 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:22:29 crc kubenswrapper[4904]: I1205 20:22:29.955975 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:22:29 crc kubenswrapper[4904]: I1205 20:22:29.956535 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:22:29 crc kubenswrapper[4904]: I1205 20:22:29.956584 4904 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" Dec 05 20:22:29 crc kubenswrapper[4904]: I1205 20:22:29.957119 4904 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4acc0855bb68cb8c598e7e35234dd87f785f666d2488ee58dffc84313152e984"} pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 20:22:29 crc kubenswrapper[4904]: I1205 20:22:29.957163 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" containerID="cri-o://4acc0855bb68cb8c598e7e35234dd87f785f666d2488ee58dffc84313152e984" gracePeriod=600 Dec 05 20:22:30 crc kubenswrapper[4904]: I1205 20:22:30.844789 4904 generic.go:334] "Generic (PLEG): container finished" podID="1cc24b64-e25f-4b55-9123-295388685e7a" containerID="4acc0855bb68cb8c598e7e35234dd87f785f666d2488ee58dffc84313152e984" exitCode=0 Dec 05 20:22:30 crc kubenswrapper[4904]: I1205 20:22:30.844835 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" event={"ID":"1cc24b64-e25f-4b55-9123-295388685e7a","Type":"ContainerDied","Data":"4acc0855bb68cb8c598e7e35234dd87f785f666d2488ee58dffc84313152e984"} Dec 05 20:22:30 crc kubenswrapper[4904]: I1205 20:22:30.845241 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" event={"ID":"1cc24b64-e25f-4b55-9123-295388685e7a","Type":"ContainerStarted","Data":"8714f5a82d22d92591b2585f5d959b3d2c6d45de1c2f0d4263e0b6b75477d050"} Dec 05 20:22:30 crc kubenswrapper[4904]: I1205 20:22:30.845285 4904 scope.go:117] "RemoveContainer" containerID="124e7b18b562ea9691adfe9edd2ebe0554085fb149cd56e77b617510a754d1be" Dec 05 20:23:45 crc kubenswrapper[4904]: I1205 20:23:45.653196 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-tfxc5"] Dec 05 20:23:45 crc kubenswrapper[4904]: E1205 20:23:45.653967 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b1664bf-b83a-4582-8018-ec55e02a4068" containerName="registry" Dec 05 20:23:45 crc kubenswrapper[4904]: I1205 20:23:45.653983 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b1664bf-b83a-4582-8018-ec55e02a4068" containerName="registry" Dec 05 20:23:45 crc kubenswrapper[4904]: I1205 20:23:45.654133 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b1664bf-b83a-4582-8018-ec55e02a4068" containerName="registry" Dec 05 20:23:45 crc kubenswrapper[4904]: I1205 20:23:45.654479 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-tfxc5" Dec 05 20:23:45 crc kubenswrapper[4904]: I1205 20:23:45.656866 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 05 20:23:45 crc kubenswrapper[4904]: I1205 20:23:45.660552 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 05 20:23:45 crc kubenswrapper[4904]: I1205 20:23:45.660605 4904 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-hmrvc" Dec 05 20:23:45 crc kubenswrapper[4904]: I1205 20:23:45.662030 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-5gj92"] Dec 05 20:23:45 crc kubenswrapper[4904]: I1205 20:23:45.662810 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-5gj92" Dec 05 20:23:45 crc kubenswrapper[4904]: I1205 20:23:45.667360 4904 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-8tpcs" Dec 05 20:23:45 crc kubenswrapper[4904]: I1205 20:23:45.675166 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-tfxc5"] Dec 05 20:23:45 crc kubenswrapper[4904]: I1205 20:23:45.715512 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-2d8zg"] Dec 05 20:23:45 crc kubenswrapper[4904]: I1205 20:23:45.716367 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-2d8zg" Dec 05 20:23:45 crc kubenswrapper[4904]: I1205 20:23:45.718473 4904 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-p62cm" Dec 05 20:23:45 crc kubenswrapper[4904]: I1205 20:23:45.720954 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-5gj92"] Dec 05 20:23:45 crc kubenswrapper[4904]: I1205 20:23:45.729413 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-2d8zg"] Dec 05 20:23:45 crc kubenswrapper[4904]: I1205 20:23:45.759466 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk89m\" (UniqueName: \"kubernetes.io/projected/45ace5ad-d0b7-469f-b03c-62e935ba67dd-kube-api-access-qk89m\") pod \"cert-manager-cainjector-7f985d654d-tfxc5\" (UID: \"45ace5ad-d0b7-469f-b03c-62e935ba67dd\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-tfxc5" Dec 05 20:23:45 crc kubenswrapper[4904]: I1205 20:23:45.759727 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvf98\" (UniqueName: \"kubernetes.io/projected/c33ef1b8-c426-4845-ace3-476e7e7c842e-kube-api-access-qvf98\") pod \"cert-manager-5b446d88c5-5gj92\" (UID: \"c33ef1b8-c426-4845-ace3-476e7e7c842e\") " pod="cert-manager/cert-manager-5b446d88c5-5gj92" Dec 05 20:23:45 crc kubenswrapper[4904]: I1205 20:23:45.860695 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9swn5\" (UniqueName: \"kubernetes.io/projected/cdc9c672-2a99-47b6-8457-fd6ec79db49b-kube-api-access-9swn5\") pod \"cert-manager-webhook-5655c58dd6-2d8zg\" (UID: \"cdc9c672-2a99-47b6-8457-fd6ec79db49b\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-2d8zg" Dec 05 20:23:45 crc kubenswrapper[4904]: I1205 20:23:45.860770 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvf98\" (UniqueName: \"kubernetes.io/projected/c33ef1b8-c426-4845-ace3-476e7e7c842e-kube-api-access-qvf98\") pod \"cert-manager-5b446d88c5-5gj92\" (UID: \"c33ef1b8-c426-4845-ace3-476e7e7c842e\") " pod="cert-manager/cert-manager-5b446d88c5-5gj92" Dec 05 20:23:45 crc kubenswrapper[4904]: I1205 20:23:45.860805 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qk89m\" (UniqueName: \"kubernetes.io/projected/45ace5ad-d0b7-469f-b03c-62e935ba67dd-kube-api-access-qk89m\") pod \"cert-manager-cainjector-7f985d654d-tfxc5\" (UID: \"45ace5ad-d0b7-469f-b03c-62e935ba67dd\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-tfxc5" Dec 05 20:23:45 crc kubenswrapper[4904]: I1205 20:23:45.882444 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk89m\" (UniqueName: \"kubernetes.io/projected/45ace5ad-d0b7-469f-b03c-62e935ba67dd-kube-api-access-qk89m\") pod \"cert-manager-cainjector-7f985d654d-tfxc5\" (UID: \"45ace5ad-d0b7-469f-b03c-62e935ba67dd\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-tfxc5" Dec 05 20:23:45 crc kubenswrapper[4904]: I1205 20:23:45.883021 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvf98\" (UniqueName: \"kubernetes.io/projected/c33ef1b8-c426-4845-ace3-476e7e7c842e-kube-api-access-qvf98\") pod \"cert-manager-5b446d88c5-5gj92\" (UID: \"c33ef1b8-c426-4845-ace3-476e7e7c842e\") " 
pod="cert-manager/cert-manager-5b446d88c5-5gj92" Dec 05 20:23:45 crc kubenswrapper[4904]: I1205 20:23:45.962018 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9swn5\" (UniqueName: \"kubernetes.io/projected/cdc9c672-2a99-47b6-8457-fd6ec79db49b-kube-api-access-9swn5\") pod \"cert-manager-webhook-5655c58dd6-2d8zg\" (UID: \"cdc9c672-2a99-47b6-8457-fd6ec79db49b\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-2d8zg" Dec 05 20:23:45 crc kubenswrapper[4904]: I1205 20:23:45.969689 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-tfxc5" Dec 05 20:23:45 crc kubenswrapper[4904]: I1205 20:23:45.984049 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9swn5\" (UniqueName: \"kubernetes.io/projected/cdc9c672-2a99-47b6-8457-fd6ec79db49b-kube-api-access-9swn5\") pod \"cert-manager-webhook-5655c58dd6-2d8zg\" (UID: \"cdc9c672-2a99-47b6-8457-fd6ec79db49b\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-2d8zg" Dec 05 20:23:46 crc kubenswrapper[4904]: I1205 20:23:46.004654 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-5gj92" Dec 05 20:23:46 crc kubenswrapper[4904]: I1205 20:23:46.032497 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-2d8zg" Dec 05 20:23:46 crc kubenswrapper[4904]: I1205 20:23:46.236589 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-tfxc5"] Dec 05 20:23:46 crc kubenswrapper[4904]: I1205 20:23:46.243973 4904 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 20:23:46 crc kubenswrapper[4904]: I1205 20:23:46.317857 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-tfxc5" event={"ID":"45ace5ad-d0b7-469f-b03c-62e935ba67dd","Type":"ContainerStarted","Data":"a6418dcbd0f932dbaa9a96a129326690e30e7ae6d4aa9dc0f4d5d107122aded4"} Dec 05 20:23:46 crc kubenswrapper[4904]: I1205 20:23:46.497164 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-5gj92"] Dec 05 20:23:46 crc kubenswrapper[4904]: W1205 20:23:46.506387 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc33ef1b8_c426_4845_ace3_476e7e7c842e.slice/crio-2b3373bcd472be64c34a4856d344e2b30ce9e4cf9fc1ec0ad886570f3ed4640d WatchSource:0}: Error finding container 2b3373bcd472be64c34a4856d344e2b30ce9e4cf9fc1ec0ad886570f3ed4640d: Status 404 returned error can't find the container with id 2b3373bcd472be64c34a4856d344e2b30ce9e4cf9fc1ec0ad886570f3ed4640d Dec 05 20:23:46 crc kubenswrapper[4904]: I1205 20:23:46.506428 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-2d8zg"] Dec 05 20:23:46 crc kubenswrapper[4904]: W1205 20:23:46.507562 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcdc9c672_2a99_47b6_8457_fd6ec79db49b.slice/crio-46ea0beadb457bf3b9a64eb07fcd47957974cb0a7ff7580f9247e84518e135e8 WatchSource:0}: Error finding container 46ea0beadb457bf3b9a64eb07fcd47957974cb0a7ff7580f9247e84518e135e8: Status 404 returned error can't find the container with id 
46ea0beadb457bf3b9a64eb07fcd47957974cb0a7ff7580f9247e84518e135e8 Dec 05 20:23:47 crc kubenswrapper[4904]: I1205 20:23:47.323610 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-2d8zg" event={"ID":"cdc9c672-2a99-47b6-8457-fd6ec79db49b","Type":"ContainerStarted","Data":"46ea0beadb457bf3b9a64eb07fcd47957974cb0a7ff7580f9247e84518e135e8"} Dec 05 20:23:47 crc kubenswrapper[4904]: I1205 20:23:47.324485 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-5gj92" event={"ID":"c33ef1b8-c426-4845-ace3-476e7e7c842e","Type":"ContainerStarted","Data":"2b3373bcd472be64c34a4856d344e2b30ce9e4cf9fc1ec0ad886570f3ed4640d"} Dec 05 20:23:50 crc kubenswrapper[4904]: I1205 20:23:50.347177 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-5gj92" event={"ID":"c33ef1b8-c426-4845-ace3-476e7e7c842e","Type":"ContainerStarted","Data":"5ccd6ad98035f318813b53c3a883f11ac1bbe7582e2973837ec4527894ddc4c5"} Dec 05 20:23:50 crc kubenswrapper[4904]: I1205 20:23:50.348814 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-tfxc5" event={"ID":"45ace5ad-d0b7-469f-b03c-62e935ba67dd","Type":"ContainerStarted","Data":"b62ff0ed9f6502586fa21a0dfdfc71d14764f2df30a992447d1a49233b7b81d2"} Dec 05 20:23:50 crc kubenswrapper[4904]: I1205 20:23:50.350547 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-2d8zg" event={"ID":"cdc9c672-2a99-47b6-8457-fd6ec79db49b","Type":"ContainerStarted","Data":"11eaef7a46c3110fea5bf8a08fe4b4411be68852d8674e23c5e49edc7a8147e9"} Dec 05 20:23:50 crc kubenswrapper[4904]: I1205 20:23:50.350701 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-2d8zg" Dec 05 20:23:50 crc kubenswrapper[4904]: I1205 20:23:50.370254 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-5gj92" podStartSLOduration=2.582175878 podStartE2EDuration="5.370228465s" podCreationTimestamp="2025-12-05 20:23:45 +0000 UTC" firstStartedPulling="2025-12-05 20:23:46.509177561 +0000 UTC m=+725.320393670" lastFinishedPulling="2025-12-05 20:23:49.297230118 +0000 UTC m=+728.108446257" observedRunningTime="2025-12-05 20:23:50.364113843 +0000 UTC m=+729.175329972" watchObservedRunningTime="2025-12-05 20:23:50.370228465 +0000 UTC m=+729.181444594" Dec 05 20:23:50 crc kubenswrapper[4904]: I1205 20:23:50.389921 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-2d8zg" podStartSLOduration=2.546037428 podStartE2EDuration="5.389905141s" podCreationTimestamp="2025-12-05 20:23:45 +0000 UTC" firstStartedPulling="2025-12-05 20:23:46.51088795 +0000 UTC m=+725.322104059" lastFinishedPulling="2025-12-05 20:23:49.354755643 +0000 UTC m=+728.165971772" observedRunningTime="2025-12-05 20:23:50.388345057 +0000 UTC m=+729.199561206" watchObservedRunningTime="2025-12-05 20:23:50.389905141 +0000 UTC m=+729.201121250" Dec 05 20:23:50 crc kubenswrapper[4904]: I1205 20:23:50.405223 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-tfxc5" podStartSLOduration=2.351642291 podStartE2EDuration="5.405205673s" podCreationTimestamp="2025-12-05 20:23:45 +0000 UTC" firstStartedPulling="2025-12-05 20:23:46.243568094 +0000 UTC m=+725.054784203" 
lastFinishedPulling="2025-12-05 20:23:49.297131476 +0000 UTC m=+728.108347585" observedRunningTime="2025-12-05 20:23:50.402913998 +0000 UTC m=+729.214130117" watchObservedRunningTime="2025-12-05 20:23:50.405205673 +0000 UTC m=+729.216421782" Dec 05 20:23:55 crc kubenswrapper[4904]: I1205 20:23:55.648231 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-dsvd6"] Dec 05 20:23:55 crc kubenswrapper[4904]: I1205 20:23:55.649637 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" containerName="ovn-controller" containerID="cri-o://2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e" gracePeriod=30 Dec 05 20:23:55 crc kubenswrapper[4904]: I1205 20:23:55.649714 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51" gracePeriod=30 Dec 05 20:23:55 crc kubenswrapper[4904]: I1205 20:23:55.649770 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" containerName="ovn-acl-logging" containerID="cri-o://a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0" gracePeriod=30 Dec 05 20:23:55 crc kubenswrapper[4904]: I1205 20:23:55.649646 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" containerName="nbdb" containerID="cri-o://42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93" gracePeriod=30 Dec 05 20:23:55 crc kubenswrapper[4904]: I1205 20:23:55.649693 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" containerName="kube-rbac-proxy-node" containerID="cri-o://d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf" gracePeriod=30 Dec 05 20:23:55 crc kubenswrapper[4904]: I1205 20:23:55.649666 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" containerName="sbdb" containerID="cri-o://3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6" gracePeriod=30 Dec 05 20:23:55 crc kubenswrapper[4904]: I1205 20:23:55.649679 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" containerName="northd" containerID="cri-o://a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02" gracePeriod=30 Dec 05 20:23:55 crc kubenswrapper[4904]: I1205 20:23:55.683770 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" containerName="ovnkube-controller" containerID="cri-o://d8695f6e6d6a985b8d6141ad8b8c6bf4378c06581847d74a6e10b310cfd13f85" gracePeriod=30 Dec 05 20:23:55 crc kubenswrapper[4904]: I1205 20:23:55.943868 4904 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dsvd6_55fbdf03-712c-4abc-9847-225fe63052e3/ovnkube-controller/3.log" Dec 05 20:23:55 crc kubenswrapper[4904]: I1205 20:23:55.946079 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dsvd6_55fbdf03-712c-4abc-9847-225fe63052e3/ovn-acl-logging/0.log" Dec 05 20:23:55 crc kubenswrapper[4904]: I1205 20:23:55.946544 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dsvd6_55fbdf03-712c-4abc-9847-225fe63052e3/ovn-controller/0.log" Dec 05 20:23:55 crc kubenswrapper[4904]: I1205 20:23:55.946926 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.005975 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-2xrv7"] Dec 05 20:23:56 crc kubenswrapper[4904]: E1205 20:23:56.006394 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" containerName="northd" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.006460 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" containerName="northd" Dec 05 20:23:56 crc kubenswrapper[4904]: E1205 20:23:56.006512 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" containerName="ovnkube-controller" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.006563 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" containerName="ovnkube-controller" Dec 05 20:23:56 crc kubenswrapper[4904]: E1205 20:23:56.006613 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" containerName="ovnkube-controller" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.006667 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" containerName="ovnkube-controller" Dec 05 20:23:56 crc kubenswrapper[4904]: E1205 20:23:56.006719 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" containerName="ovn-acl-logging" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.006768 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" containerName="ovn-acl-logging" Dec 05 20:23:56 crc kubenswrapper[4904]: E1205 20:23:56.006822 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" containerName="kube-rbac-proxy-node" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.006872 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" containerName="kube-rbac-proxy-node" Dec 05 20:23:56 crc kubenswrapper[4904]: E1205 20:23:56.006929 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" containerName="nbdb" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.006981 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" containerName="nbdb" Dec 05 20:23:56 crc kubenswrapper[4904]: E1205 20:23:56.007028 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" containerName="ovnkube-controller" Dec 05 20:23:56 crc 
kubenswrapper[4904]: I1205 20:23:56.007095 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" containerName="ovnkube-controller" Dec 05 20:23:56 crc kubenswrapper[4904]: E1205 20:23:56.007157 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" containerName="kube-rbac-proxy-ovn-metrics" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.007207 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" containerName="kube-rbac-proxy-ovn-metrics" Dec 05 20:23:56 crc kubenswrapper[4904]: E1205 20:23:56.007259 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" containerName="ovn-controller" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.007316 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" containerName="ovn-controller" Dec 05 20:23:56 crc kubenswrapper[4904]: E1205 20:23:56.007370 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" containerName="kubecfg-setup" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.007415 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" containerName="kubecfg-setup" Dec 05 20:23:56 crc kubenswrapper[4904]: E1205 20:23:56.007462 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" containerName="sbdb" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.007512 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" containerName="sbdb" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.007639 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" containerName="ovnkube-controller" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.007695 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" containerName="northd" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.007748 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" containerName="nbdb" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.007799 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" containerName="ovn-acl-logging" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.008189 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" containerName="ovn-controller" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.008247 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" containerName="kube-rbac-proxy-node" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.008306 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" containerName="kube-rbac-proxy-ovn-metrics" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.008355 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" containerName="ovnkube-controller" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.008406 4904 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="55fbdf03-712c-4abc-9847-225fe63052e3" containerName="sbdb" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.008463 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" containerName="ovnkube-controller" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.008535 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" containerName="ovnkube-controller" Dec 05 20:23:56 crc kubenswrapper[4904]: E1205 20:23:56.008730 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" containerName="ovnkube-controller" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.008810 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" containerName="ovnkube-controller" Dec 05 20:23:56 crc kubenswrapper[4904]: E1205 20:23:56.008883 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" containerName="ovnkube-controller" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.008955 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" containerName="ovnkube-controller" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.009156 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" containerName="ovnkube-controller" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.010750 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.034977 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-2d8zg" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.089329 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"55fbdf03-712c-4abc-9847-225fe63052e3\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.089400 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-run-openvswitch\") pod \"55fbdf03-712c-4abc-9847-225fe63052e3\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.089455 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-host-run-ovn-kubernetes\") pod \"55fbdf03-712c-4abc-9847-225fe63052e3\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.089477 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-run-systemd\") pod \"55fbdf03-712c-4abc-9847-225fe63052e3\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.089495 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "55fbdf03-712c-4abc-9847-225fe63052e3" (UID: "55fbdf03-712c-4abc-9847-225fe63052e3"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.089513 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/55fbdf03-712c-4abc-9847-225fe63052e3-ovnkube-script-lib\") pod \"55fbdf03-712c-4abc-9847-225fe63052e3\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.089580 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-host-run-netns\") pod \"55fbdf03-712c-4abc-9847-225fe63052e3\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.089600 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-host-kubelet\") pod \"55fbdf03-712c-4abc-9847-225fe63052e3\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.089616 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-var-lib-openvswitch\") pod \"55fbdf03-712c-4abc-9847-225fe63052e3\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.089642 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/55fbdf03-712c-4abc-9847-225fe63052e3-env-overrides\") pod \"55fbdf03-712c-4abc-9847-225fe63052e3\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.089663 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-host-cni-bin\") pod \"55fbdf03-712c-4abc-9847-225fe63052e3\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.089684 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d7kb\" (UniqueName: \"kubernetes.io/projected/55fbdf03-712c-4abc-9847-225fe63052e3-kube-api-access-2d7kb\") pod \"55fbdf03-712c-4abc-9847-225fe63052e3\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.089708 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-host-slash\") pod \"55fbdf03-712c-4abc-9847-225fe63052e3\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.089729 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/55fbdf03-712c-4abc-9847-225fe63052e3-ovn-node-metrics-cert\") pod \"55fbdf03-712c-4abc-9847-225fe63052e3\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " Dec 05 20:23:56 crc 
kubenswrapper[4904]: I1205 20:23:56.089747 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-log-socket\") pod \"55fbdf03-712c-4abc-9847-225fe63052e3\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.089760 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-run-ovn\") pod \"55fbdf03-712c-4abc-9847-225fe63052e3\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.089779 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-etc-openvswitch\") pod \"55fbdf03-712c-4abc-9847-225fe63052e3\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.089800 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-node-log\") pod \"55fbdf03-712c-4abc-9847-225fe63052e3\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.089831 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-systemd-units\") pod \"55fbdf03-712c-4abc-9847-225fe63052e3\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.089865 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-host-cni-netd\") pod \"55fbdf03-712c-4abc-9847-225fe63052e3\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.089883 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/55fbdf03-712c-4abc-9847-225fe63052e3-ovnkube-config\") pod \"55fbdf03-712c-4abc-9847-225fe63052e3\" (UID: \"55fbdf03-712c-4abc-9847-225fe63052e3\") " Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.089944 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55fbdf03-712c-4abc-9847-225fe63052e3-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "55fbdf03-712c-4abc-9847-225fe63052e3" (UID: "55fbdf03-712c-4abc-9847-225fe63052e3"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.089988 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "55fbdf03-712c-4abc-9847-225fe63052e3" (UID: "55fbdf03-712c-4abc-9847-225fe63052e3"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.090046 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3941c0c3-6198-42bb-86a9-129aad436e63-host-kubelet\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.090105 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3941c0c3-6198-42bb-86a9-129aad436e63-host-slash\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.090148 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3941c0c3-6198-42bb-86a9-129aad436e63-ovn-node-metrics-cert\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.090188 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3941c0c3-6198-42bb-86a9-129aad436e63-host-cni-bin\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.090209 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3941c0c3-6198-42bb-86a9-129aad436e63-run-systemd\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.090224 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3941c0c3-6198-42bb-86a9-129aad436e63-log-socket\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.090251 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3941c0c3-6198-42bb-86a9-129aad436e63-host-cni-netd\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.090266 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3941c0c3-6198-42bb-86a9-129aad436e63-systemd-units\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.090283 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl6qb\" (UniqueName: \"kubernetes.io/projected/3941c0c3-6198-42bb-86a9-129aad436e63-kube-api-access-cl6qb\") pod 
\"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.090301 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3941c0c3-6198-42bb-86a9-129aad436e63-run-ovn\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.090318 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3941c0c3-6198-42bb-86a9-129aad436e63-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.090342 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3941c0c3-6198-42bb-86a9-129aad436e63-ovnkube-script-lib\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.090375 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3941c0c3-6198-42bb-86a9-129aad436e63-node-log\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.090400 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3941c0c3-6198-42bb-86a9-129aad436e63-ovnkube-config\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.090429 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3941c0c3-6198-42bb-86a9-129aad436e63-var-lib-openvswitch\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.090470 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3941c0c3-6198-42bb-86a9-129aad436e63-env-overrides\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.090493 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3941c0c3-6198-42bb-86a9-129aad436e63-etc-openvswitch\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.090513 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3941c0c3-6198-42bb-86a9-129aad436e63-host-run-ovn-kubernetes\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.090543 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3941c0c3-6198-42bb-86a9-129aad436e63-host-run-netns\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.090581 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3941c0c3-6198-42bb-86a9-129aad436e63-run-openvswitch\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.090622 4904 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.090636 4904 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.090648 4904 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/55fbdf03-712c-4abc-9847-225fe63052e3-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.090688 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "55fbdf03-712c-4abc-9847-225fe63052e3" (UID: "55fbdf03-712c-4abc-9847-225fe63052e3"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.090752 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "55fbdf03-712c-4abc-9847-225fe63052e3" (UID: "55fbdf03-712c-4abc-9847-225fe63052e3"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.090768 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "55fbdf03-712c-4abc-9847-225fe63052e3" (UID: "55fbdf03-712c-4abc-9847-225fe63052e3"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.090771 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "55fbdf03-712c-4abc-9847-225fe63052e3" (UID: "55fbdf03-712c-4abc-9847-225fe63052e3"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.090815 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-node-log" (OuterVolumeSpecName: "node-log") pod "55fbdf03-712c-4abc-9847-225fe63052e3" (UID: "55fbdf03-712c-4abc-9847-225fe63052e3"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.090833 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "55fbdf03-712c-4abc-9847-225fe63052e3" (UID: "55fbdf03-712c-4abc-9847-225fe63052e3"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.090850 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "55fbdf03-712c-4abc-9847-225fe63052e3" (UID: "55fbdf03-712c-4abc-9847-225fe63052e3"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.091131 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55fbdf03-712c-4abc-9847-225fe63052e3-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "55fbdf03-712c-4abc-9847-225fe63052e3" (UID: "55fbdf03-712c-4abc-9847-225fe63052e3"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.091156 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "55fbdf03-712c-4abc-9847-225fe63052e3" (UID: "55fbdf03-712c-4abc-9847-225fe63052e3"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.091254 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55fbdf03-712c-4abc-9847-225fe63052e3-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "55fbdf03-712c-4abc-9847-225fe63052e3" (UID: "55fbdf03-712c-4abc-9847-225fe63052e3"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.091518 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "55fbdf03-712c-4abc-9847-225fe63052e3" (UID: "55fbdf03-712c-4abc-9847-225fe63052e3"). 
InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.091549 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-host-slash" (OuterVolumeSpecName: "host-slash") pod "55fbdf03-712c-4abc-9847-225fe63052e3" (UID: "55fbdf03-712c-4abc-9847-225fe63052e3"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.091570 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-log-socket" (OuterVolumeSpecName: "log-socket") pod "55fbdf03-712c-4abc-9847-225fe63052e3" (UID: "55fbdf03-712c-4abc-9847-225fe63052e3"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.091591 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "55fbdf03-712c-4abc-9847-225fe63052e3" (UID: "55fbdf03-712c-4abc-9847-225fe63052e3"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.096210 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55fbdf03-712c-4abc-9847-225fe63052e3-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "55fbdf03-712c-4abc-9847-225fe63052e3" (UID: "55fbdf03-712c-4abc-9847-225fe63052e3"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.096423 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55fbdf03-712c-4abc-9847-225fe63052e3-kube-api-access-2d7kb" (OuterVolumeSpecName: "kube-api-access-2d7kb") pod "55fbdf03-712c-4abc-9847-225fe63052e3" (UID: "55fbdf03-712c-4abc-9847-225fe63052e3"). InnerVolumeSpecName "kube-api-access-2d7kb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.122309 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "55fbdf03-712c-4abc-9847-225fe63052e3" (UID: "55fbdf03-712c-4abc-9847-225fe63052e3"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.192240 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3941c0c3-6198-42bb-86a9-129aad436e63-run-openvswitch\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.192299 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3941c0c3-6198-42bb-86a9-129aad436e63-host-kubelet\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.192327 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3941c0c3-6198-42bb-86a9-129aad436e63-host-slash\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.192343 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3941c0c3-6198-42bb-86a9-129aad436e63-run-openvswitch\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.192408 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3941c0c3-6198-42bb-86a9-129aad436e63-host-kubelet\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.192356 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3941c0c3-6198-42bb-86a9-129aad436e63-ovn-node-metrics-cert\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.192934 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3941c0c3-6198-42bb-86a9-129aad436e63-host-slash\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.192981 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3941c0c3-6198-42bb-86a9-129aad436e63-host-cni-bin\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.193011 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3941c0c3-6198-42bb-86a9-129aad436e63-run-systemd\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.193031 4904 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3941c0c3-6198-42bb-86a9-129aad436e63-log-socket\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.193099 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3941c0c3-6198-42bb-86a9-129aad436e63-host-cni-netd\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.193113 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3941c0c3-6198-42bb-86a9-129aad436e63-host-cni-bin\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.193131 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3941c0c3-6198-42bb-86a9-129aad436e63-run-systemd\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.193118 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3941c0c3-6198-42bb-86a9-129aad436e63-systemd-units\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.193163 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3941c0c3-6198-42bb-86a9-129aad436e63-host-cni-netd\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.193163 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3941c0c3-6198-42bb-86a9-129aad436e63-log-socket\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.193140 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3941c0c3-6198-42bb-86a9-129aad436e63-systemd-units\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.193200 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl6qb\" (UniqueName: \"kubernetes.io/projected/3941c0c3-6198-42bb-86a9-129aad436e63-kube-api-access-cl6qb\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.193231 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/3941c0c3-6198-42bb-86a9-129aad436e63-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.193261 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3941c0c3-6198-42bb-86a9-129aad436e63-run-ovn\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.193292 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3941c0c3-6198-42bb-86a9-129aad436e63-ovnkube-script-lib\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.193336 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3941c0c3-6198-42bb-86a9-129aad436e63-node-log\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.193340 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3941c0c3-6198-42bb-86a9-129aad436e63-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.193372 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3941c0c3-6198-42bb-86a9-129aad436e63-ovnkube-config\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.193334 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3941c0c3-6198-42bb-86a9-129aad436e63-run-ovn\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.193467 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3941c0c3-6198-42bb-86a9-129aad436e63-var-lib-openvswitch\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.193489 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3941c0c3-6198-42bb-86a9-129aad436e63-node-log\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.193511 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3941c0c3-6198-42bb-86a9-129aad436e63-env-overrides\") 
pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.193589 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3941c0c3-6198-42bb-86a9-129aad436e63-var-lib-openvswitch\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.193595 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3941c0c3-6198-42bb-86a9-129aad436e63-etc-openvswitch\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.193628 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3941c0c3-6198-42bb-86a9-129aad436e63-host-run-ovn-kubernetes\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.193671 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3941c0c3-6198-42bb-86a9-129aad436e63-host-run-netns\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.193728 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3941c0c3-6198-42bb-86a9-129aad436e63-host-run-ovn-kubernetes\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.193808 4904 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.193823 4904 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/55fbdf03-712c-4abc-9847-225fe63052e3-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.193838 4904 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.193844 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3941c0c3-6198-42bb-86a9-129aad436e63-host-run-netns\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.193852 4904 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-run-systemd\") on node \"crc\" 
DevicePath \"\"" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.193887 4904 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.193901 4904 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.193915 4904 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.193927 4904 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/55fbdf03-712c-4abc-9847-225fe63052e3-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.193939 4904 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.193953 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d7kb\" (UniqueName: \"kubernetes.io/projected/55fbdf03-712c-4abc-9847-225fe63052e3-kube-api-access-2d7kb\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.193964 4904 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-host-slash\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.193976 4904 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/55fbdf03-712c-4abc-9847-225fe63052e3-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.193987 4904 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-log-socket\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.193998 4904 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.194007 4904 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.194017 4904 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-node-log\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.194027 4904 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/55fbdf03-712c-4abc-9847-225fe63052e3-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.194166 4904 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3941c0c3-6198-42bb-86a9-129aad436e63-ovnkube-script-lib\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.194218 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3941c0c3-6198-42bb-86a9-129aad436e63-env-overrides\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.194259 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3941c0c3-6198-42bb-86a9-129aad436e63-etc-openvswitch\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.194410 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3941c0c3-6198-42bb-86a9-129aad436e63-ovnkube-config\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.197265 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3941c0c3-6198-42bb-86a9-129aad436e63-ovn-node-metrics-cert\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.216190 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl6qb\" (UniqueName: \"kubernetes.io/projected/3941c0c3-6198-42bb-86a9-129aad436e63-kube-api-access-cl6qb\") pod \"ovnkube-node-2xrv7\" (UID: \"3941c0c3-6198-42bb-86a9-129aad436e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.331483 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.397697 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gfzvv_5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea/kube-multus/2.log" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.398651 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gfzvv_5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea/kube-multus/1.log" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.398754 4904 generic.go:334] "Generic (PLEG): container finished" podID="5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea" containerID="a724f454d77b9b67af0b65f96e13eb70ffb479606ba3d7aa571916c68b1e2f03" exitCode=2 Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.398923 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gfzvv" event={"ID":"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea","Type":"ContainerDied","Data":"a724f454d77b9b67af0b65f96e13eb70ffb479606ba3d7aa571916c68b1e2f03"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.398996 4904 scope.go:117] "RemoveContainer" containerID="7010629bd83e0fb435b59dd47808f31c7694e158d77048ed22a54668a8b2712d" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.399563 4904 scope.go:117] "RemoveContainer" containerID="a724f454d77b9b67af0b65f96e13eb70ffb479606ba3d7aa571916c68b1e2f03" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.402319 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" event={"ID":"3941c0c3-6198-42bb-86a9-129aad436e63","Type":"ContainerStarted","Data":"b0887a039091ad08ff0ef74d369ab5501bed3259a604b7cd1f3a6a5d826d53aa"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.405167 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dsvd6_55fbdf03-712c-4abc-9847-225fe63052e3/ovnkube-controller/3.log" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.409980 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dsvd6_55fbdf03-712c-4abc-9847-225fe63052e3/ovn-acl-logging/0.log" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.412209 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dsvd6_55fbdf03-712c-4abc-9847-225fe63052e3/ovn-controller/0.log" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.412964 4904 generic.go:334] "Generic (PLEG): container finished" podID="55fbdf03-712c-4abc-9847-225fe63052e3" containerID="d8695f6e6d6a985b8d6141ad8b8c6bf4378c06581847d74a6e10b310cfd13f85" exitCode=0 Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413001 4904 generic.go:334] "Generic (PLEG): container finished" podID="55fbdf03-712c-4abc-9847-225fe63052e3" containerID="3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6" exitCode=0 Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413014 4904 generic.go:334] "Generic (PLEG): container finished" podID="55fbdf03-712c-4abc-9847-225fe63052e3" containerID="42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93" exitCode=0 Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413005 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" event={"ID":"55fbdf03-712c-4abc-9847-225fe63052e3","Type":"ContainerDied","Data":"d8695f6e6d6a985b8d6141ad8b8c6bf4378c06581847d74a6e10b310cfd13f85"} Dec 05 20:23:56 crc 
kubenswrapper[4904]: I1205 20:23:56.413109 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413158 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" event={"ID":"55fbdf03-712c-4abc-9847-225fe63052e3","Type":"ContainerDied","Data":"3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413185 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" event={"ID":"55fbdf03-712c-4abc-9847-225fe63052e3","Type":"ContainerDied","Data":"42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413024 4904 generic.go:334] "Generic (PLEG): container finished" podID="55fbdf03-712c-4abc-9847-225fe63052e3" containerID="a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02" exitCode=0 Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413199 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" event={"ID":"55fbdf03-712c-4abc-9847-225fe63052e3","Type":"ContainerDied","Data":"a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413233 4904 generic.go:334] "Generic (PLEG): container finished" podID="55fbdf03-712c-4abc-9847-225fe63052e3" containerID="d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51" exitCode=0 Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413243 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" event={"ID":"55fbdf03-712c-4abc-9847-225fe63052e3","Type":"ContainerDied","Data":"d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413263 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" event={"ID":"55fbdf03-712c-4abc-9847-225fe63052e3","Type":"ContainerDied","Data":"d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413278 4904 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d8695f6e6d6a985b8d6141ad8b8c6bf4378c06581847d74a6e10b310cfd13f85"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413319 4904 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0056a74367d0e3cfa5e2d8715316dcb9d0f1fe4f0a9be8f8c29671005bba59d4"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413330 4904 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413337 4904 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413344 4904 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02"} Dec 05 20:23:56 crc 
kubenswrapper[4904]: I1205 20:23:56.413350 4904 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413358 4904 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413365 4904 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413398 4904 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413406 4904 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413246 4904 generic.go:334] "Generic (PLEG): container finished" podID="55fbdf03-712c-4abc-9847-225fe63052e3" containerID="d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf" exitCode=0 Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413432 4904 generic.go:334] "Generic (PLEG): container finished" podID="55fbdf03-712c-4abc-9847-225fe63052e3" containerID="a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0" exitCode=143 Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413447 4904 generic.go:334] "Generic (PLEG): container finished" podID="55fbdf03-712c-4abc-9847-225fe63052e3" containerID="2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e" exitCode=143 Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413465 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" event={"ID":"55fbdf03-712c-4abc-9847-225fe63052e3","Type":"ContainerDied","Data":"a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413479 4904 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d8695f6e6d6a985b8d6141ad8b8c6bf4378c06581847d74a6e10b310cfd13f85"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413486 4904 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0056a74367d0e3cfa5e2d8715316dcb9d0f1fe4f0a9be8f8c29671005bba59d4"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413493 4904 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413500 4904 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413509 4904 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413517 4904 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413523 4904 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413532 4904 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413539 4904 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413546 4904 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413555 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" event={"ID":"55fbdf03-712c-4abc-9847-225fe63052e3","Type":"ContainerDied","Data":"2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413566 4904 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d8695f6e6d6a985b8d6141ad8b8c6bf4378c06581847d74a6e10b310cfd13f85"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413575 4904 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0056a74367d0e3cfa5e2d8715316dcb9d0f1fe4f0a9be8f8c29671005bba59d4"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413582 4904 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413588 4904 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413594 4904 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413601 4904 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413607 4904 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413613 4904 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413619 4904 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413626 4904 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413637 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsvd6" event={"ID":"55fbdf03-712c-4abc-9847-225fe63052e3","Type":"ContainerDied","Data":"3c9a83d11af8ca2c49877b8fd19d374d3069afd010660d8308d1f3279747c330"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413647 4904 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d8695f6e6d6a985b8d6141ad8b8c6bf4378c06581847d74a6e10b310cfd13f85"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413655 4904 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0056a74367d0e3cfa5e2d8715316dcb9d0f1fe4f0a9be8f8c29671005bba59d4"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413662 4904 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413670 4904 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413677 4904 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413685 4904 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413692 4904 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413699 4904 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413705 4904 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.413712 4904 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d"} Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.451422 4904 scope.go:117] "RemoveContainer" 
containerID="d8695f6e6d6a985b8d6141ad8b8c6bf4378c06581847d74a6e10b310cfd13f85" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.477735 4904 scope.go:117] "RemoveContainer" containerID="0056a74367d0e3cfa5e2d8715316dcb9d0f1fe4f0a9be8f8c29671005bba59d4" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.477978 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-dsvd6"] Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.482512 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-dsvd6"] Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.505697 4904 scope.go:117] "RemoveContainer" containerID="3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.566038 4904 scope.go:117] "RemoveContainer" containerID="42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.581823 4904 scope.go:117] "RemoveContainer" containerID="a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.596034 4904 scope.go:117] "RemoveContainer" containerID="d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.613845 4904 scope.go:117] "RemoveContainer" containerID="d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.644244 4904 scope.go:117] "RemoveContainer" containerID="a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.661192 4904 scope.go:117] "RemoveContainer" containerID="2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.682649 4904 scope.go:117] "RemoveContainer" containerID="f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.712945 4904 scope.go:117] "RemoveContainer" containerID="d8695f6e6d6a985b8d6141ad8b8c6bf4378c06581847d74a6e10b310cfd13f85" Dec 05 20:23:56 crc kubenswrapper[4904]: E1205 20:23:56.713597 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8695f6e6d6a985b8d6141ad8b8c6bf4378c06581847d74a6e10b310cfd13f85\": container with ID starting with d8695f6e6d6a985b8d6141ad8b8c6bf4378c06581847d74a6e10b310cfd13f85 not found: ID does not exist" containerID="d8695f6e6d6a985b8d6141ad8b8c6bf4378c06581847d74a6e10b310cfd13f85" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.713635 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8695f6e6d6a985b8d6141ad8b8c6bf4378c06581847d74a6e10b310cfd13f85"} err="failed to get container status \"d8695f6e6d6a985b8d6141ad8b8c6bf4378c06581847d74a6e10b310cfd13f85\": rpc error: code = NotFound desc = could not find container \"d8695f6e6d6a985b8d6141ad8b8c6bf4378c06581847d74a6e10b310cfd13f85\": container with ID starting with d8695f6e6d6a985b8d6141ad8b8c6bf4378c06581847d74a6e10b310cfd13f85 not found: ID does not exist" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.713661 4904 scope.go:117] "RemoveContainer" containerID="0056a74367d0e3cfa5e2d8715316dcb9d0f1fe4f0a9be8f8c29671005bba59d4" Dec 05 20:23:56 crc kubenswrapper[4904]: E1205 20:23:56.713928 4904 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"0056a74367d0e3cfa5e2d8715316dcb9d0f1fe4f0a9be8f8c29671005bba59d4\": container with ID starting with 0056a74367d0e3cfa5e2d8715316dcb9d0f1fe4f0a9be8f8c29671005bba59d4 not found: ID does not exist" containerID="0056a74367d0e3cfa5e2d8715316dcb9d0f1fe4f0a9be8f8c29671005bba59d4" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.713976 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0056a74367d0e3cfa5e2d8715316dcb9d0f1fe4f0a9be8f8c29671005bba59d4"} err="failed to get container status \"0056a74367d0e3cfa5e2d8715316dcb9d0f1fe4f0a9be8f8c29671005bba59d4\": rpc error: code = NotFound desc = could not find container \"0056a74367d0e3cfa5e2d8715316dcb9d0f1fe4f0a9be8f8c29671005bba59d4\": container with ID starting with 0056a74367d0e3cfa5e2d8715316dcb9d0f1fe4f0a9be8f8c29671005bba59d4 not found: ID does not exist" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.714008 4904 scope.go:117] "RemoveContainer" containerID="3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6" Dec 05 20:23:56 crc kubenswrapper[4904]: E1205 20:23:56.714409 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6\": container with ID starting with 3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6 not found: ID does not exist" containerID="3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.714445 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6"} err="failed to get container status \"3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6\": rpc error: code = NotFound desc = could not find container \"3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6\": container with ID starting with 3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6 not found: ID does not exist" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.714472 4904 scope.go:117] "RemoveContainer" containerID="42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93" Dec 05 20:23:56 crc kubenswrapper[4904]: E1205 20:23:56.714744 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93\": container with ID starting with 42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93 not found: ID does not exist" containerID="42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.714771 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93"} err="failed to get container status \"42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93\": rpc error: code = NotFound desc = could not find container \"42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93\": container with ID starting with 42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93 not found: ID does not exist" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.714792 4904 scope.go:117] "RemoveContainer" 
containerID="a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02" Dec 05 20:23:56 crc kubenswrapper[4904]: E1205 20:23:56.715004 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02\": container with ID starting with a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02 not found: ID does not exist" containerID="a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.715026 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02"} err="failed to get container status \"a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02\": rpc error: code = NotFound desc = could not find container \"a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02\": container with ID starting with a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02 not found: ID does not exist" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.715041 4904 scope.go:117] "RemoveContainer" containerID="d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51" Dec 05 20:23:56 crc kubenswrapper[4904]: E1205 20:23:56.715274 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51\": container with ID starting with d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51 not found: ID does not exist" containerID="d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.715301 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51"} err="failed to get container status \"d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51\": rpc error: code = NotFound desc = could not find container \"d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51\": container with ID starting with d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51 not found: ID does not exist" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.715320 4904 scope.go:117] "RemoveContainer" containerID="d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf" Dec 05 20:23:56 crc kubenswrapper[4904]: E1205 20:23:56.715543 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf\": container with ID starting with d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf not found: ID does not exist" containerID="d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.715569 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf"} err="failed to get container status \"d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf\": rpc error: code = NotFound desc = could not find container \"d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf\": container with ID starting with 
d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf not found: ID does not exist" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.715588 4904 scope.go:117] "RemoveContainer" containerID="a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0" Dec 05 20:23:56 crc kubenswrapper[4904]: E1205 20:23:56.715880 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0\": container with ID starting with a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0 not found: ID does not exist" containerID="a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.715909 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0"} err="failed to get container status \"a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0\": rpc error: code = NotFound desc = could not find container \"a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0\": container with ID starting with a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0 not found: ID does not exist" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.715929 4904 scope.go:117] "RemoveContainer" containerID="2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e" Dec 05 20:23:56 crc kubenswrapper[4904]: E1205 20:23:56.716270 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e\": container with ID starting with 2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e not found: ID does not exist" containerID="2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.716300 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e"} err="failed to get container status \"2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e\": rpc error: code = NotFound desc = could not find container \"2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e\": container with ID starting with 2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e not found: ID does not exist" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.716318 4904 scope.go:117] "RemoveContainer" containerID="f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d" Dec 05 20:23:56 crc kubenswrapper[4904]: E1205 20:23:56.716523 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\": container with ID starting with f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d not found: ID does not exist" containerID="f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.716544 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d"} err="failed to get container status \"f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\": rpc 
error: code = NotFound desc = could not find container \"f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\": container with ID starting with f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d not found: ID does not exist" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.716561 4904 scope.go:117] "RemoveContainer" containerID="d8695f6e6d6a985b8d6141ad8b8c6bf4378c06581847d74a6e10b310cfd13f85" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.716807 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8695f6e6d6a985b8d6141ad8b8c6bf4378c06581847d74a6e10b310cfd13f85"} err="failed to get container status \"d8695f6e6d6a985b8d6141ad8b8c6bf4378c06581847d74a6e10b310cfd13f85\": rpc error: code = NotFound desc = could not find container \"d8695f6e6d6a985b8d6141ad8b8c6bf4378c06581847d74a6e10b310cfd13f85\": container with ID starting with d8695f6e6d6a985b8d6141ad8b8c6bf4378c06581847d74a6e10b310cfd13f85 not found: ID does not exist" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.716834 4904 scope.go:117] "RemoveContainer" containerID="0056a74367d0e3cfa5e2d8715316dcb9d0f1fe4f0a9be8f8c29671005bba59d4" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.717121 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0056a74367d0e3cfa5e2d8715316dcb9d0f1fe4f0a9be8f8c29671005bba59d4"} err="failed to get container status \"0056a74367d0e3cfa5e2d8715316dcb9d0f1fe4f0a9be8f8c29671005bba59d4\": rpc error: code = NotFound desc = could not find container \"0056a74367d0e3cfa5e2d8715316dcb9d0f1fe4f0a9be8f8c29671005bba59d4\": container with ID starting with 0056a74367d0e3cfa5e2d8715316dcb9d0f1fe4f0a9be8f8c29671005bba59d4 not found: ID does not exist" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.717148 4904 scope.go:117] "RemoveContainer" containerID="3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.717378 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6"} err="failed to get container status \"3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6\": rpc error: code = NotFound desc = could not find container \"3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6\": container with ID starting with 3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6 not found: ID does not exist" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.717398 4904 scope.go:117] "RemoveContainer" containerID="42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.717629 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93"} err="failed to get container status \"42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93\": rpc error: code = NotFound desc = could not find container \"42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93\": container with ID starting with 42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93 not found: ID does not exist" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.717651 4904 scope.go:117] "RemoveContainer" containerID="a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02" Dec 05 20:23:56 crc 
kubenswrapper[4904]: I1205 20:23:56.717886 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02"} err="failed to get container status \"a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02\": rpc error: code = NotFound desc = could not find container \"a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02\": container with ID starting with a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02 not found: ID does not exist" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.717911 4904 scope.go:117] "RemoveContainer" containerID="d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.718181 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51"} err="failed to get container status \"d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51\": rpc error: code = NotFound desc = could not find container \"d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51\": container with ID starting with d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51 not found: ID does not exist" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.718208 4904 scope.go:117] "RemoveContainer" containerID="d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.718437 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf"} err="failed to get container status \"d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf\": rpc error: code = NotFound desc = could not find container \"d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf\": container with ID starting with d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf not found: ID does not exist" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.718456 4904 scope.go:117] "RemoveContainer" containerID="a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.718743 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0"} err="failed to get container status \"a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0\": rpc error: code = NotFound desc = could not find container \"a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0\": container with ID starting with a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0 not found: ID does not exist" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.718766 4904 scope.go:117] "RemoveContainer" containerID="2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.719118 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e"} err="failed to get container status \"2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e\": rpc error: code = NotFound desc = could not find container \"2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e\": container with ID 
starting with 2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e not found: ID does not exist" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.719142 4904 scope.go:117] "RemoveContainer" containerID="f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.719399 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d"} err="failed to get container status \"f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\": rpc error: code = NotFound desc = could not find container \"f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\": container with ID starting with f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d not found: ID does not exist" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.719423 4904 scope.go:117] "RemoveContainer" containerID="d8695f6e6d6a985b8d6141ad8b8c6bf4378c06581847d74a6e10b310cfd13f85" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.719675 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8695f6e6d6a985b8d6141ad8b8c6bf4378c06581847d74a6e10b310cfd13f85"} err="failed to get container status \"d8695f6e6d6a985b8d6141ad8b8c6bf4378c06581847d74a6e10b310cfd13f85\": rpc error: code = NotFound desc = could not find container \"d8695f6e6d6a985b8d6141ad8b8c6bf4378c06581847d74a6e10b310cfd13f85\": container with ID starting with d8695f6e6d6a985b8d6141ad8b8c6bf4378c06581847d74a6e10b310cfd13f85 not found: ID does not exist" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.719697 4904 scope.go:117] "RemoveContainer" containerID="0056a74367d0e3cfa5e2d8715316dcb9d0f1fe4f0a9be8f8c29671005bba59d4" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.719936 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0056a74367d0e3cfa5e2d8715316dcb9d0f1fe4f0a9be8f8c29671005bba59d4"} err="failed to get container status \"0056a74367d0e3cfa5e2d8715316dcb9d0f1fe4f0a9be8f8c29671005bba59d4\": rpc error: code = NotFound desc = could not find container \"0056a74367d0e3cfa5e2d8715316dcb9d0f1fe4f0a9be8f8c29671005bba59d4\": container with ID starting with 0056a74367d0e3cfa5e2d8715316dcb9d0f1fe4f0a9be8f8c29671005bba59d4 not found: ID does not exist" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.719958 4904 scope.go:117] "RemoveContainer" containerID="3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.720323 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6"} err="failed to get container status \"3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6\": rpc error: code = NotFound desc = could not find container \"3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6\": container with ID starting with 3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6 not found: ID does not exist" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.720361 4904 scope.go:117] "RemoveContainer" containerID="42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.720703 4904 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93"} err="failed to get container status \"42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93\": rpc error: code = NotFound desc = could not find container \"42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93\": container with ID starting with 42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93 not found: ID does not exist" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.720722 4904 scope.go:117] "RemoveContainer" containerID="a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.720993 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02"} err="failed to get container status \"a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02\": rpc error: code = NotFound desc = could not find container \"a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02\": container with ID starting with a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02 not found: ID does not exist" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.721017 4904 scope.go:117] "RemoveContainer" containerID="d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.721316 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51"} err="failed to get container status \"d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51\": rpc error: code = NotFound desc = could not find container \"d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51\": container with ID starting with d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51 not found: ID does not exist" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.721335 4904 scope.go:117] "RemoveContainer" containerID="d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.721507 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf"} err="failed to get container status \"d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf\": rpc error: code = NotFound desc = could not find container \"d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf\": container with ID starting with d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf not found: ID does not exist" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.721525 4904 scope.go:117] "RemoveContainer" containerID="a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.721775 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0"} err="failed to get container status \"a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0\": rpc error: code = NotFound desc = could not find container \"a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0\": container with ID starting with a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0 not found: ID does not exist" Dec 
05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.721794 4904 scope.go:117] "RemoveContainer" containerID="2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.722013 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e"} err="failed to get container status \"2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e\": rpc error: code = NotFound desc = could not find container \"2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e\": container with ID starting with 2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e not found: ID does not exist" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.722038 4904 scope.go:117] "RemoveContainer" containerID="f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.722286 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d"} err="failed to get container status \"f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\": rpc error: code = NotFound desc = could not find container \"f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\": container with ID starting with f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d not found: ID does not exist" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.722310 4904 scope.go:117] "RemoveContainer" containerID="d8695f6e6d6a985b8d6141ad8b8c6bf4378c06581847d74a6e10b310cfd13f85" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.722923 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8695f6e6d6a985b8d6141ad8b8c6bf4378c06581847d74a6e10b310cfd13f85"} err="failed to get container status \"d8695f6e6d6a985b8d6141ad8b8c6bf4378c06581847d74a6e10b310cfd13f85\": rpc error: code = NotFound desc = could not find container \"d8695f6e6d6a985b8d6141ad8b8c6bf4378c06581847d74a6e10b310cfd13f85\": container with ID starting with d8695f6e6d6a985b8d6141ad8b8c6bf4378c06581847d74a6e10b310cfd13f85 not found: ID does not exist" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.722962 4904 scope.go:117] "RemoveContainer" containerID="0056a74367d0e3cfa5e2d8715316dcb9d0f1fe4f0a9be8f8c29671005bba59d4" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.723246 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0056a74367d0e3cfa5e2d8715316dcb9d0f1fe4f0a9be8f8c29671005bba59d4"} err="failed to get container status \"0056a74367d0e3cfa5e2d8715316dcb9d0f1fe4f0a9be8f8c29671005bba59d4\": rpc error: code = NotFound desc = could not find container \"0056a74367d0e3cfa5e2d8715316dcb9d0f1fe4f0a9be8f8c29671005bba59d4\": container with ID starting with 0056a74367d0e3cfa5e2d8715316dcb9d0f1fe4f0a9be8f8c29671005bba59d4 not found: ID does not exist" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.723264 4904 scope.go:117] "RemoveContainer" containerID="3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.723515 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6"} err="failed to get container status 
\"3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6\": rpc error: code = NotFound desc = could not find container \"3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6\": container with ID starting with 3f9faaf7692d5f538051487e3f1cbd098843d9b3cf94a85f07a89f83290ecbb6 not found: ID does not exist" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.723536 4904 scope.go:117] "RemoveContainer" containerID="42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.723784 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93"} err="failed to get container status \"42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93\": rpc error: code = NotFound desc = could not find container \"42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93\": container with ID starting with 42621955a68115e4beef3521988384c4cceec83af473c391c4b692f85d724f93 not found: ID does not exist" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.723803 4904 scope.go:117] "RemoveContainer" containerID="a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.724018 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02"} err="failed to get container status \"a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02\": rpc error: code = NotFound desc = could not find container \"a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02\": container with ID starting with a82f03047c6e82430b06290af2d46bec1b9d46a009a08b798bda094e763f9e02 not found: ID does not exist" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.724046 4904 scope.go:117] "RemoveContainer" containerID="d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.724366 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51"} err="failed to get container status \"d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51\": rpc error: code = NotFound desc = could not find container \"d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51\": container with ID starting with d34fab567578ac09675c9b354bb8548808f2cdf810a06611fa1c99f5b0a49d51 not found: ID does not exist" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.724392 4904 scope.go:117] "RemoveContainer" containerID="d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.724619 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf"} err="failed to get container status \"d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf\": rpc error: code = NotFound desc = could not find container \"d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf\": container with ID starting with d29c15e00dac8096e418bfdd6a5e8baa0df9a7ca3bb03ff345a30606f92aecbf not found: ID does not exist" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.724642 4904 scope.go:117] "RemoveContainer" 
containerID="a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.724948 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0"} err="failed to get container status \"a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0\": rpc error: code = NotFound desc = could not find container \"a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0\": container with ID starting with a7d00b763f4fd74fdc82e1875fe4a67de39a87f5b6299c97a2cc27e5f22e18f0 not found: ID does not exist" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.724972 4904 scope.go:117] "RemoveContainer" containerID="2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.725278 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e"} err="failed to get container status \"2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e\": rpc error: code = NotFound desc = could not find container \"2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e\": container with ID starting with 2d5420f7a9423ca4736258f5284f47d750e8908fa0cac31d0df35c3ff9d59c3e not found: ID does not exist" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.725295 4904 scope.go:117] "RemoveContainer" containerID="f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d" Dec 05 20:23:56 crc kubenswrapper[4904]: I1205 20:23:56.725519 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d"} err="failed to get container status \"f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\": rpc error: code = NotFound desc = could not find container \"f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d\": container with ID starting with f8c1c31b2212dcd6799e8ecafd084824ca549db878e5eb83b5e2ef036d03998d not found: ID does not exist" Dec 05 20:23:57 crc kubenswrapper[4904]: I1205 20:23:57.432085 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gfzvv_5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea/kube-multus/2.log" Dec 05 20:23:57 crc kubenswrapper[4904]: I1205 20:23:57.432454 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gfzvv" event={"ID":"5fcfb250-f7e5-4ae1-9c49-43a68e8de9ea","Type":"ContainerStarted","Data":"3a7d211ba53eae53c1bfa761e3f6c931ce41c1f76751ab4b45dd19be5863f5e2"} Dec 05 20:23:57 crc kubenswrapper[4904]: I1205 20:23:57.434415 4904 generic.go:334] "Generic (PLEG): container finished" podID="3941c0c3-6198-42bb-86a9-129aad436e63" containerID="cf54f39eae3e0318de4f8216c8119fc890bbde918fef53dcda5c148de975c8f7" exitCode=0 Dec 05 20:23:57 crc kubenswrapper[4904]: I1205 20:23:57.434467 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" event={"ID":"3941c0c3-6198-42bb-86a9-129aad436e63","Type":"ContainerDied","Data":"cf54f39eae3e0318de4f8216c8119fc890bbde918fef53dcda5c148de975c8f7"} Dec 05 20:23:57 crc kubenswrapper[4904]: I1205 20:23:57.699579 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55fbdf03-712c-4abc-9847-225fe63052e3" 
path="/var/lib/kubelet/pods/55fbdf03-712c-4abc-9847-225fe63052e3/volumes" Dec 05 20:23:58 crc kubenswrapper[4904]: I1205 20:23:58.446174 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" event={"ID":"3941c0c3-6198-42bb-86a9-129aad436e63","Type":"ContainerStarted","Data":"6cbd4bd3d9a8a1ca2aea3720d68a564f6dee2b351936f7936f6a504c5f792536"} Dec 05 20:23:58 crc kubenswrapper[4904]: I1205 20:23:58.446487 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" event={"ID":"3941c0c3-6198-42bb-86a9-129aad436e63","Type":"ContainerStarted","Data":"f84c3ffa738cab6e44d2715cc7a070d3ec566c9d5b130a23c21e1a6fc40c8931"} Dec 05 20:23:58 crc kubenswrapper[4904]: I1205 20:23:58.446507 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" event={"ID":"3941c0c3-6198-42bb-86a9-129aad436e63","Type":"ContainerStarted","Data":"4b54c8d263cc97fda6d81aa545c682d0b2aecb04c2b1d7553c31501de6a8e54f"} Dec 05 20:23:58 crc kubenswrapper[4904]: I1205 20:23:58.446521 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" event={"ID":"3941c0c3-6198-42bb-86a9-129aad436e63","Type":"ContainerStarted","Data":"50d2b1af4d2e2402b34b0297e0f2ab219117e886064e4c05f78f321298233da8"} Dec 05 20:23:58 crc kubenswrapper[4904]: I1205 20:23:58.446537 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" event={"ID":"3941c0c3-6198-42bb-86a9-129aad436e63","Type":"ContainerStarted","Data":"ac0d4acd3c74a8c906abdbdeab9efb265728e584edfc4e335a2031640a998558"} Dec 05 20:23:58 crc kubenswrapper[4904]: I1205 20:23:58.446547 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" event={"ID":"3941c0c3-6198-42bb-86a9-129aad436e63","Type":"ContainerStarted","Data":"c45e03fd40620faa9201146e53ee59cc68b010da7d2b45045779dae19a535235"} Dec 05 20:24:01 crc kubenswrapper[4904]: I1205 20:24:01.475652 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" event={"ID":"3941c0c3-6198-42bb-86a9-129aad436e63","Type":"ContainerStarted","Data":"6279ab2bc86a7a7b4e5a95aff0f115bbe3b3726068e6afc18f1fa1d568f85f5d"} Dec 05 20:24:03 crc kubenswrapper[4904]: I1205 20:24:03.496149 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" event={"ID":"3941c0c3-6198-42bb-86a9-129aad436e63","Type":"ContainerStarted","Data":"4d3d451f0fc2c7ab6ae2d6130fb9b5a0e0b641e59badb80b7a246966f744d52b"} Dec 05 20:24:04 crc kubenswrapper[4904]: I1205 20:24:04.502253 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:24:04 crc kubenswrapper[4904]: I1205 20:24:04.502386 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:24:04 crc kubenswrapper[4904]: I1205 20:24:04.502473 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:24:04 crc kubenswrapper[4904]: I1205 20:24:04.536839 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" podStartSLOduration=9.536820968 podStartE2EDuration="9.536820968s" podCreationTimestamp="2025-12-05 20:23:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:24:04.534717299 +0000 UTC m=+743.345933438" watchObservedRunningTime="2025-12-05 20:24:04.536820968 +0000 UTC m=+743.348037077" Dec 05 20:24:04 crc kubenswrapper[4904]: I1205 20:24:04.545046 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:24:04 crc kubenswrapper[4904]: I1205 20:24:04.551523 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:24:14 crc kubenswrapper[4904]: I1205 20:24:14.004046 4904 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 05 20:24:25 crc kubenswrapper[4904]: I1205 20:24:25.862334 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d"] Dec 05 20:24:25 crc kubenswrapper[4904]: I1205 20:24:25.864532 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d" Dec 05 20:24:25 crc kubenswrapper[4904]: I1205 20:24:25.867023 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 05 20:24:25 crc kubenswrapper[4904]: I1205 20:24:25.895308 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d"] Dec 05 20:24:25 crc kubenswrapper[4904]: I1205 20:24:25.977321 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5b90a283-7232-45a4-9326-e96a19d446fa-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d\" (UID: \"5b90a283-7232-45a4-9326-e96a19d446fa\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d" Dec 05 20:24:25 crc kubenswrapper[4904]: I1205 20:24:25.977380 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5b90a283-7232-45a4-9326-e96a19d446fa-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d\" (UID: \"5b90a283-7232-45a4-9326-e96a19d446fa\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d" Dec 05 20:24:25 crc kubenswrapper[4904]: I1205 20:24:25.977502 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77xp8\" (UniqueName: \"kubernetes.io/projected/5b90a283-7232-45a4-9326-e96a19d446fa-kube-api-access-77xp8\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d\" (UID: \"5b90a283-7232-45a4-9326-e96a19d446fa\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d" Dec 05 20:24:26 crc kubenswrapper[4904]: I1205 20:24:26.078708 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5b90a283-7232-45a4-9326-e96a19d446fa-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d\" (UID: \"5b90a283-7232-45a4-9326-e96a19d446fa\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d" Dec 05 20:24:26 crc kubenswrapper[4904]: 
Dec 05 20:24:25 crc kubenswrapper[4904]: I1205 20:24:25.862334 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d"]
Dec 05 20:24:25 crc kubenswrapper[4904]: I1205 20:24:25.864532 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d"
Dec 05 20:24:25 crc kubenswrapper[4904]: I1205 20:24:25.867023 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Dec 05 20:24:25 crc kubenswrapper[4904]: I1205 20:24:25.895308 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d"]
Dec 05 20:24:25 crc kubenswrapper[4904]: I1205 20:24:25.977321 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5b90a283-7232-45a4-9326-e96a19d446fa-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d\" (UID: \"5b90a283-7232-45a4-9326-e96a19d446fa\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d"
Dec 05 20:24:25 crc kubenswrapper[4904]: I1205 20:24:25.977380 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5b90a283-7232-45a4-9326-e96a19d446fa-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d\" (UID: \"5b90a283-7232-45a4-9326-e96a19d446fa\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d"
Dec 05 20:24:25 crc kubenswrapper[4904]: I1205 20:24:25.977502 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77xp8\" (UniqueName: \"kubernetes.io/projected/5b90a283-7232-45a4-9326-e96a19d446fa-kube-api-access-77xp8\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d\" (UID: \"5b90a283-7232-45a4-9326-e96a19d446fa\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d"
Dec 05 20:24:26 crc kubenswrapper[4904]: I1205 20:24:26.078708 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5b90a283-7232-45a4-9326-e96a19d446fa-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d\" (UID: \"5b90a283-7232-45a4-9326-e96a19d446fa\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d"
Dec 05 20:24:26 crc kubenswrapper[4904]: I1205 20:24:26.078782 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5b90a283-7232-45a4-9326-e96a19d446fa-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d\" (UID: \"5b90a283-7232-45a4-9326-e96a19d446fa\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d"
Dec 05 20:24:26 crc kubenswrapper[4904]: I1205 20:24:26.078823 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77xp8\" (UniqueName: \"kubernetes.io/projected/5b90a283-7232-45a4-9326-e96a19d446fa-kube-api-access-77xp8\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d\" (UID: \"5b90a283-7232-45a4-9326-e96a19d446fa\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d"
Dec 05 20:24:26 crc kubenswrapper[4904]: I1205 20:24:26.079600 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5b90a283-7232-45a4-9326-e96a19d446fa-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d\" (UID: \"5b90a283-7232-45a4-9326-e96a19d446fa\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d"
Dec 05 20:24:26 crc kubenswrapper[4904]: I1205 20:24:26.079638 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5b90a283-7232-45a4-9326-e96a19d446fa-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d\" (UID: \"5b90a283-7232-45a4-9326-e96a19d446fa\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d"
Dec 05 20:24:26 crc kubenswrapper[4904]: I1205 20:24:26.098475 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77xp8\" (UniqueName: \"kubernetes.io/projected/5b90a283-7232-45a4-9326-e96a19d446fa-kube-api-access-77xp8\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d\" (UID: \"5b90a283-7232-45a4-9326-e96a19d446fa\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d"
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d" Dec 05 20:24:26 crc kubenswrapper[4904]: I1205 20:24:26.370844 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2xrv7" Dec 05 20:24:26 crc kubenswrapper[4904]: I1205 20:24:26.698999 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d"] Dec 05 20:24:27 crc kubenswrapper[4904]: I1205 20:24:27.634780 4904 generic.go:334] "Generic (PLEG): container finished" podID="5b90a283-7232-45a4-9326-e96a19d446fa" containerID="1224b778a2ea7bdb9ede188f010497858fd9f54155f117073eac91d6274f3340" exitCode=0 Dec 05 20:24:27 crc kubenswrapper[4904]: I1205 20:24:27.634831 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d" event={"ID":"5b90a283-7232-45a4-9326-e96a19d446fa","Type":"ContainerDied","Data":"1224b778a2ea7bdb9ede188f010497858fd9f54155f117073eac91d6274f3340"} Dec 05 20:24:27 crc kubenswrapper[4904]: I1205 20:24:27.634865 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d" event={"ID":"5b90a283-7232-45a4-9326-e96a19d446fa","Type":"ContainerStarted","Data":"5488f8b19c8f4cdadda3ab74a495729c6fc97887507c6ed2d38800616ca86f43"} Dec 05 20:24:28 crc kubenswrapper[4904]: I1205 20:24:28.197375 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zvhpt"] Dec 05 20:24:28 crc kubenswrapper[4904]: I1205 20:24:28.200260 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zvhpt" Dec 05 20:24:28 crc kubenswrapper[4904]: I1205 20:24:28.210570 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zvhpt"] Dec 05 20:24:28 crc kubenswrapper[4904]: I1205 20:24:28.309815 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d14e4db0-7b71-4e7f-90dc-b8acbdfb7055-catalog-content\") pod \"redhat-operators-zvhpt\" (UID: \"d14e4db0-7b71-4e7f-90dc-b8acbdfb7055\") " pod="openshift-marketplace/redhat-operators-zvhpt" Dec 05 20:24:28 crc kubenswrapper[4904]: I1205 20:24:28.309908 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d14e4db0-7b71-4e7f-90dc-b8acbdfb7055-utilities\") pod \"redhat-operators-zvhpt\" (UID: \"d14e4db0-7b71-4e7f-90dc-b8acbdfb7055\") " pod="openshift-marketplace/redhat-operators-zvhpt" Dec 05 20:24:28 crc kubenswrapper[4904]: I1205 20:24:28.309953 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7np45\" (UniqueName: \"kubernetes.io/projected/d14e4db0-7b71-4e7f-90dc-b8acbdfb7055-kube-api-access-7np45\") pod \"redhat-operators-zvhpt\" (UID: \"d14e4db0-7b71-4e7f-90dc-b8acbdfb7055\") " pod="openshift-marketplace/redhat-operators-zvhpt" Dec 05 20:24:28 crc kubenswrapper[4904]: I1205 20:24:28.410563 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7np45\" (UniqueName: \"kubernetes.io/projected/d14e4db0-7b71-4e7f-90dc-b8acbdfb7055-kube-api-access-7np45\") pod \"redhat-operators-zvhpt\" 
(UID: \"d14e4db0-7b71-4e7f-90dc-b8acbdfb7055\") " pod="openshift-marketplace/redhat-operators-zvhpt" Dec 05 20:24:28 crc kubenswrapper[4904]: I1205 20:24:28.410609 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d14e4db0-7b71-4e7f-90dc-b8acbdfb7055-catalog-content\") pod \"redhat-operators-zvhpt\" (UID: \"d14e4db0-7b71-4e7f-90dc-b8acbdfb7055\") " pod="openshift-marketplace/redhat-operators-zvhpt" Dec 05 20:24:28 crc kubenswrapper[4904]: I1205 20:24:28.410661 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d14e4db0-7b71-4e7f-90dc-b8acbdfb7055-utilities\") pod \"redhat-operators-zvhpt\" (UID: \"d14e4db0-7b71-4e7f-90dc-b8acbdfb7055\") " pod="openshift-marketplace/redhat-operators-zvhpt" Dec 05 20:24:28 crc kubenswrapper[4904]: I1205 20:24:28.411264 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d14e4db0-7b71-4e7f-90dc-b8acbdfb7055-utilities\") pod \"redhat-operators-zvhpt\" (UID: \"d14e4db0-7b71-4e7f-90dc-b8acbdfb7055\") " pod="openshift-marketplace/redhat-operators-zvhpt" Dec 05 20:24:28 crc kubenswrapper[4904]: I1205 20:24:28.411699 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d14e4db0-7b71-4e7f-90dc-b8acbdfb7055-catalog-content\") pod \"redhat-operators-zvhpt\" (UID: \"d14e4db0-7b71-4e7f-90dc-b8acbdfb7055\") " pod="openshift-marketplace/redhat-operators-zvhpt" Dec 05 20:24:28 crc kubenswrapper[4904]: I1205 20:24:28.438169 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7np45\" (UniqueName: \"kubernetes.io/projected/d14e4db0-7b71-4e7f-90dc-b8acbdfb7055-kube-api-access-7np45\") pod \"redhat-operators-zvhpt\" (UID: \"d14e4db0-7b71-4e7f-90dc-b8acbdfb7055\") " pod="openshift-marketplace/redhat-operators-zvhpt" Dec 05 20:24:28 crc kubenswrapper[4904]: I1205 20:24:28.522295 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zvhpt" Dec 05 20:24:28 crc kubenswrapper[4904]: I1205 20:24:28.697887 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zvhpt"] Dec 05 20:24:29 crc kubenswrapper[4904]: I1205 20:24:29.652450 4904 generic.go:334] "Generic (PLEG): container finished" podID="5b90a283-7232-45a4-9326-e96a19d446fa" containerID="a324944edd89b0a9d5e149a6f5e65b0060fb0c2aa135aa0e855298372b0dea7a" exitCode=0 Dec 05 20:24:29 crc kubenswrapper[4904]: I1205 20:24:29.652545 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d" event={"ID":"5b90a283-7232-45a4-9326-e96a19d446fa","Type":"ContainerDied","Data":"a324944edd89b0a9d5e149a6f5e65b0060fb0c2aa135aa0e855298372b0dea7a"} Dec 05 20:24:29 crc kubenswrapper[4904]: I1205 20:24:29.655157 4904 generic.go:334] "Generic (PLEG): container finished" podID="d14e4db0-7b71-4e7f-90dc-b8acbdfb7055" containerID="bc9bd875a441d78bcb87aaf91163f4da9cdd295baa31787d22de29a9e3d8b8f9" exitCode=0 Dec 05 20:24:29 crc kubenswrapper[4904]: I1205 20:24:29.655200 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvhpt" event={"ID":"d14e4db0-7b71-4e7f-90dc-b8acbdfb7055","Type":"ContainerDied","Data":"bc9bd875a441d78bcb87aaf91163f4da9cdd295baa31787d22de29a9e3d8b8f9"} Dec 05 20:24:29 crc kubenswrapper[4904]: I1205 20:24:29.655235 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvhpt" event={"ID":"d14e4db0-7b71-4e7f-90dc-b8acbdfb7055","Type":"ContainerStarted","Data":"5a55c3a96dc6585f17025165884d46107a44d7957bb393654f9cbd5b2b9cd4f4"} Dec 05 20:24:30 crc kubenswrapper[4904]: I1205 20:24:30.664631 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvhpt" event={"ID":"d14e4db0-7b71-4e7f-90dc-b8acbdfb7055","Type":"ContainerStarted","Data":"b9a288b480ffb634f8e49d8f051fba4a38375ae29039fecad17ce49258182e6b"} Dec 05 20:24:30 crc kubenswrapper[4904]: I1205 20:24:30.667427 4904 generic.go:334] "Generic (PLEG): container finished" podID="5b90a283-7232-45a4-9326-e96a19d446fa" containerID="67185c1848a68e6d10c1b00d0f0bcd844f446af335bbd38924d016a40706899f" exitCode=0 Dec 05 20:24:30 crc kubenswrapper[4904]: I1205 20:24:30.667479 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d" event={"ID":"5b90a283-7232-45a4-9326-e96a19d446fa","Type":"ContainerDied","Data":"67185c1848a68e6d10c1b00d0f0bcd844f446af335bbd38924d016a40706899f"} Dec 05 20:24:32 crc kubenswrapper[4904]: I1205 20:24:32.183204 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d" Dec 05 20:24:32 crc kubenswrapper[4904]: I1205 20:24:32.262302 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5b90a283-7232-45a4-9326-e96a19d446fa-util\") pod \"5b90a283-7232-45a4-9326-e96a19d446fa\" (UID: \"5b90a283-7232-45a4-9326-e96a19d446fa\") " Dec 05 20:24:32 crc kubenswrapper[4904]: I1205 20:24:32.262377 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77xp8\" (UniqueName: \"kubernetes.io/projected/5b90a283-7232-45a4-9326-e96a19d446fa-kube-api-access-77xp8\") pod \"5b90a283-7232-45a4-9326-e96a19d446fa\" (UID: \"5b90a283-7232-45a4-9326-e96a19d446fa\") " Dec 05 20:24:32 crc kubenswrapper[4904]: I1205 20:24:32.262471 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5b90a283-7232-45a4-9326-e96a19d446fa-bundle\") pod \"5b90a283-7232-45a4-9326-e96a19d446fa\" (UID: \"5b90a283-7232-45a4-9326-e96a19d446fa\") " Dec 05 20:24:32 crc kubenswrapper[4904]: I1205 20:24:32.264950 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b90a283-7232-45a4-9326-e96a19d446fa-bundle" (OuterVolumeSpecName: "bundle") pod "5b90a283-7232-45a4-9326-e96a19d446fa" (UID: "5b90a283-7232-45a4-9326-e96a19d446fa"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:24:32 crc kubenswrapper[4904]: I1205 20:24:32.268935 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b90a283-7232-45a4-9326-e96a19d446fa-kube-api-access-77xp8" (OuterVolumeSpecName: "kube-api-access-77xp8") pod "5b90a283-7232-45a4-9326-e96a19d446fa" (UID: "5b90a283-7232-45a4-9326-e96a19d446fa"). InnerVolumeSpecName "kube-api-access-77xp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:24:32 crc kubenswrapper[4904]: I1205 20:24:32.298751 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b90a283-7232-45a4-9326-e96a19d446fa-util" (OuterVolumeSpecName: "util") pod "5b90a283-7232-45a4-9326-e96a19d446fa" (UID: "5b90a283-7232-45a4-9326-e96a19d446fa"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:24:32 crc kubenswrapper[4904]: I1205 20:24:32.363744 4904 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5b90a283-7232-45a4-9326-e96a19d446fa-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:32 crc kubenswrapper[4904]: I1205 20:24:32.363788 4904 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5b90a283-7232-45a4-9326-e96a19d446fa-util\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:32 crc kubenswrapper[4904]: I1205 20:24:32.363805 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77xp8\" (UniqueName: \"kubernetes.io/projected/5b90a283-7232-45a4-9326-e96a19d446fa-kube-api-access-77xp8\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:32 crc kubenswrapper[4904]: I1205 20:24:32.684901 4904 generic.go:334] "Generic (PLEG): container finished" podID="d14e4db0-7b71-4e7f-90dc-b8acbdfb7055" containerID="b9a288b480ffb634f8e49d8f051fba4a38375ae29039fecad17ce49258182e6b" exitCode=0 Dec 05 20:24:32 crc kubenswrapper[4904]: I1205 20:24:32.684995 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvhpt" event={"ID":"d14e4db0-7b71-4e7f-90dc-b8acbdfb7055","Type":"ContainerDied","Data":"b9a288b480ffb634f8e49d8f051fba4a38375ae29039fecad17ce49258182e6b"} Dec 05 20:24:32 crc kubenswrapper[4904]: I1205 20:24:32.689778 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d" event={"ID":"5b90a283-7232-45a4-9326-e96a19d446fa","Type":"ContainerDied","Data":"5488f8b19c8f4cdadda3ab74a495729c6fc97887507c6ed2d38800616ca86f43"} Dec 05 20:24:32 crc kubenswrapper[4904]: I1205 20:24:32.689841 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5488f8b19c8f4cdadda3ab74a495729c6fc97887507c6ed2d38800616ca86f43" Dec 05 20:24:32 crc kubenswrapper[4904]: I1205 20:24:32.689854 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d" Dec 05 20:24:33 crc kubenswrapper[4904]: I1205 20:24:33.696945 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvhpt" event={"ID":"d14e4db0-7b71-4e7f-90dc-b8acbdfb7055","Type":"ContainerStarted","Data":"c3766783663ca7404885410b66960da7d968bcead8a9e2820501a6a4333600d4"} Dec 05 20:24:38 crc kubenswrapper[4904]: I1205 20:24:38.523040 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zvhpt" Dec 05 20:24:38 crc kubenswrapper[4904]: I1205 20:24:38.523308 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zvhpt" Dec 05 20:24:39 crc kubenswrapper[4904]: I1205 20:24:39.637870 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zvhpt" podUID="d14e4db0-7b71-4e7f-90dc-b8acbdfb7055" containerName="registry-server" probeResult="failure" output=< Dec 05 20:24:39 crc kubenswrapper[4904]: timeout: failed to connect service ":50051" within 1s Dec 05 20:24:39 crc kubenswrapper[4904]: > Dec 05 20:24:41 crc kubenswrapper[4904]: I1205 20:24:41.633676 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zvhpt" podStartSLOduration=10.195451269 podStartE2EDuration="13.633657869s" podCreationTimestamp="2025-12-05 20:24:28 +0000 UTC" firstStartedPulling="2025-12-05 20:24:29.657261676 +0000 UTC m=+768.468477785" lastFinishedPulling="2025-12-05 20:24:33.095468276 +0000 UTC m=+771.906684385" observedRunningTime="2025-12-05 20:24:33.724018047 +0000 UTC m=+772.535234186" watchObservedRunningTime="2025-12-05 20:24:41.633657869 +0000 UTC m=+780.444873978" Dec 05 20:24:41 crc kubenswrapper[4904]: I1205 20:24:41.636378 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-52c6d"] Dec 05 20:24:41 crc kubenswrapper[4904]: E1205 20:24:41.636604 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b90a283-7232-45a4-9326-e96a19d446fa" containerName="extract" Dec 05 20:24:41 crc kubenswrapper[4904]: I1205 20:24:41.636624 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b90a283-7232-45a4-9326-e96a19d446fa" containerName="extract" Dec 05 20:24:41 crc kubenswrapper[4904]: E1205 20:24:41.636839 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b90a283-7232-45a4-9326-e96a19d446fa" containerName="util" Dec 05 20:24:41 crc kubenswrapper[4904]: I1205 20:24:41.636847 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b90a283-7232-45a4-9326-e96a19d446fa" containerName="util" Dec 05 20:24:41 crc kubenswrapper[4904]: E1205 20:24:41.636865 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b90a283-7232-45a4-9326-e96a19d446fa" containerName="pull" Dec 05 20:24:41 crc kubenswrapper[4904]: I1205 20:24:41.636873 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b90a283-7232-45a4-9326-e96a19d446fa" containerName="pull" Dec 05 20:24:41 crc kubenswrapper[4904]: I1205 20:24:41.636966 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b90a283-7232-45a4-9326-e96a19d446fa" containerName="extract" Dec 05 20:24:41 crc kubenswrapper[4904]: I1205 20:24:41.637371 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-52c6d" Dec 05 20:24:41 crc kubenswrapper[4904]: I1205 20:24:41.639769 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Dec 05 20:24:41 crc kubenswrapper[4904]: I1205 20:24:41.639949 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-rf6qx" Dec 05 20:24:41 crc kubenswrapper[4904]: I1205 20:24:41.640030 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Dec 05 20:24:41 crc kubenswrapper[4904]: I1205 20:24:41.661080 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-52c6d"] Dec 05 20:24:41 crc kubenswrapper[4904]: I1205 20:24:41.679258 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwdk6\" (UniqueName: \"kubernetes.io/projected/c290c96d-3acd-4374-91d7-20efcef53eda-kube-api-access-cwdk6\") pod \"obo-prometheus-operator-668cf9dfbb-52c6d\" (UID: \"c290c96d-3acd-4374-91d7-20efcef53eda\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-52c6d" Dec 05 20:24:41 crc kubenswrapper[4904]: I1205 20:24:41.778416 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6bdfbd597-b7v44"] Dec 05 20:24:41 crc kubenswrapper[4904]: I1205 20:24:41.779249 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bdfbd597-b7v44" Dec 05 20:24:41 crc kubenswrapper[4904]: I1205 20:24:41.780206 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwdk6\" (UniqueName: \"kubernetes.io/projected/c290c96d-3acd-4374-91d7-20efcef53eda-kube-api-access-cwdk6\") pod \"obo-prometheus-operator-668cf9dfbb-52c6d\" (UID: \"c290c96d-3acd-4374-91d7-20efcef53eda\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-52c6d" Dec 05 20:24:41 crc kubenswrapper[4904]: I1205 20:24:41.782411 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Dec 05 20:24:41 crc kubenswrapper[4904]: I1205 20:24:41.782744 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-n6mx8" Dec 05 20:24:41 crc kubenswrapper[4904]: I1205 20:24:41.791241 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6bdfbd597-qxbbm"] Dec 05 20:24:41 crc kubenswrapper[4904]: I1205 20:24:41.792090 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bdfbd597-qxbbm" Dec 05 20:24:41 crc kubenswrapper[4904]: I1205 20:24:41.809324 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6bdfbd597-b7v44"] Dec 05 20:24:41 crc kubenswrapper[4904]: I1205 20:24:41.812308 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwdk6\" (UniqueName: \"kubernetes.io/projected/c290c96d-3acd-4374-91d7-20efcef53eda-kube-api-access-cwdk6\") pod \"obo-prometheus-operator-668cf9dfbb-52c6d\" (UID: \"c290c96d-3acd-4374-91d7-20efcef53eda\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-52c6d" Dec 05 20:24:41 crc kubenswrapper[4904]: I1205 20:24:41.828690 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6bdfbd597-qxbbm"] Dec 05 20:24:41 crc kubenswrapper[4904]: I1205 20:24:41.880908 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6b774de0-c1b7-43c1-86b5-b444cc0275d4-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6bdfbd597-qxbbm\" (UID: \"6b774de0-c1b7-43c1-86b5-b444cc0275d4\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bdfbd597-qxbbm" Dec 05 20:24:41 crc kubenswrapper[4904]: I1205 20:24:41.880971 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6b774de0-c1b7-43c1-86b5-b444cc0275d4-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6bdfbd597-qxbbm\" (UID: \"6b774de0-c1b7-43c1-86b5-b444cc0275d4\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bdfbd597-qxbbm" Dec 05 20:24:41 crc kubenswrapper[4904]: I1205 20:24:41.881076 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a54452d0-1ba5-4b81-aab4-2e2f2293fa6b-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6bdfbd597-b7v44\" (UID: \"a54452d0-1ba5-4b81-aab4-2e2f2293fa6b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bdfbd597-b7v44" Dec 05 20:24:41 crc kubenswrapper[4904]: I1205 20:24:41.881113 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a54452d0-1ba5-4b81-aab4-2e2f2293fa6b-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6bdfbd597-b7v44\" (UID: \"a54452d0-1ba5-4b81-aab4-2e2f2293fa6b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bdfbd597-b7v44" Dec 05 20:24:41 crc kubenswrapper[4904]: I1205 20:24:41.959695 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-vg7c2"] Dec 05 20:24:41 crc kubenswrapper[4904]: I1205 20:24:41.960560 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-vg7c2" Dec 05 20:24:41 crc kubenswrapper[4904]: I1205 20:24:41.963681 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Dec 05 20:24:41 crc kubenswrapper[4904]: I1205 20:24:41.963867 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-gh9hb" Dec 05 20:24:41 crc kubenswrapper[4904]: I1205 20:24:41.974704 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-vg7c2"] Dec 05 20:24:41 crc kubenswrapper[4904]: I1205 20:24:41.982417 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a54452d0-1ba5-4b81-aab4-2e2f2293fa6b-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6bdfbd597-b7v44\" (UID: \"a54452d0-1ba5-4b81-aab4-2e2f2293fa6b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bdfbd597-b7v44" Dec 05 20:24:41 crc kubenswrapper[4904]: I1205 20:24:41.982496 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a54452d0-1ba5-4b81-aab4-2e2f2293fa6b-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6bdfbd597-b7v44\" (UID: \"a54452d0-1ba5-4b81-aab4-2e2f2293fa6b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bdfbd597-b7v44" Dec 05 20:24:41 crc kubenswrapper[4904]: I1205 20:24:41.982575 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/14ec1958-2889-4fef-90ee-e73296264291-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-vg7c2\" (UID: \"14ec1958-2889-4fef-90ee-e73296264291\") " pod="openshift-operators/observability-operator-d8bb48f5d-vg7c2" Dec 05 20:24:41 crc kubenswrapper[4904]: I1205 20:24:41.982606 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6b774de0-c1b7-43c1-86b5-b444cc0275d4-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6bdfbd597-qxbbm\" (UID: \"6b774de0-c1b7-43c1-86b5-b444cc0275d4\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bdfbd597-qxbbm" Dec 05 20:24:41 crc kubenswrapper[4904]: I1205 20:24:41.982638 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6b774de0-c1b7-43c1-86b5-b444cc0275d4-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6bdfbd597-qxbbm\" (UID: \"6b774de0-c1b7-43c1-86b5-b444cc0275d4\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bdfbd597-qxbbm" Dec 05 20:24:41 crc kubenswrapper[4904]: I1205 20:24:41.982665 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sts77\" (UniqueName: \"kubernetes.io/projected/14ec1958-2889-4fef-90ee-e73296264291-kube-api-access-sts77\") pod \"observability-operator-d8bb48f5d-vg7c2\" (UID: \"14ec1958-2889-4fef-90ee-e73296264291\") " pod="openshift-operators/observability-operator-d8bb48f5d-vg7c2" Dec 05 20:24:41 crc kubenswrapper[4904]: I1205 20:24:41.986489 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/a54452d0-1ba5-4b81-aab4-2e2f2293fa6b-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6bdfbd597-b7v44\" (UID: \"a54452d0-1ba5-4b81-aab4-2e2f2293fa6b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bdfbd597-b7v44" Dec 05 20:24:41 crc kubenswrapper[4904]: I1205 20:24:41.989149 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a54452d0-1ba5-4b81-aab4-2e2f2293fa6b-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6bdfbd597-b7v44\" (UID: \"a54452d0-1ba5-4b81-aab4-2e2f2293fa6b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bdfbd597-b7v44" Dec 05 20:24:41 crc kubenswrapper[4904]: I1205 20:24:41.989495 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6b774de0-c1b7-43c1-86b5-b444cc0275d4-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6bdfbd597-qxbbm\" (UID: \"6b774de0-c1b7-43c1-86b5-b444cc0275d4\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bdfbd597-qxbbm" Dec 05 20:24:42 crc kubenswrapper[4904]: I1205 20:24:42.001774 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6b774de0-c1b7-43c1-86b5-b444cc0275d4-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6bdfbd597-qxbbm\" (UID: \"6b774de0-c1b7-43c1-86b5-b444cc0275d4\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bdfbd597-qxbbm" Dec 05 20:24:42 crc kubenswrapper[4904]: I1205 20:24:42.011269 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-52c6d" Dec 05 20:24:42 crc kubenswrapper[4904]: I1205 20:24:42.072391 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5446b9c989-8pj8n"] Dec 05 20:24:42 crc kubenswrapper[4904]: I1205 20:24:42.072983 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-8pj8n" Dec 05 20:24:42 crc kubenswrapper[4904]: I1205 20:24:42.076399 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-f5qqk" Dec 05 20:24:42 crc kubenswrapper[4904]: I1205 20:24:42.084648 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/14ec1958-2889-4fef-90ee-e73296264291-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-vg7c2\" (UID: \"14ec1958-2889-4fef-90ee-e73296264291\") " pod="openshift-operators/observability-operator-d8bb48f5d-vg7c2" Dec 05 20:24:42 crc kubenswrapper[4904]: I1205 20:24:42.084692 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sts77\" (UniqueName: \"kubernetes.io/projected/14ec1958-2889-4fef-90ee-e73296264291-kube-api-access-sts77\") pod \"observability-operator-d8bb48f5d-vg7c2\" (UID: \"14ec1958-2889-4fef-90ee-e73296264291\") " pod="openshift-operators/observability-operator-d8bb48f5d-vg7c2" Dec 05 20:24:42 crc kubenswrapper[4904]: I1205 20:24:42.088353 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/14ec1958-2889-4fef-90ee-e73296264291-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-vg7c2\" (UID: \"14ec1958-2889-4fef-90ee-e73296264291\") " pod="openshift-operators/observability-operator-d8bb48f5d-vg7c2" Dec 05 20:24:42 crc kubenswrapper[4904]: I1205 20:24:42.091021 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-8pj8n"] Dec 05 20:24:42 crc kubenswrapper[4904]: I1205 20:24:42.103294 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bdfbd597-b7v44" Dec 05 20:24:42 crc kubenswrapper[4904]: I1205 20:24:42.109674 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sts77\" (UniqueName: \"kubernetes.io/projected/14ec1958-2889-4fef-90ee-e73296264291-kube-api-access-sts77\") pod \"observability-operator-d8bb48f5d-vg7c2\" (UID: \"14ec1958-2889-4fef-90ee-e73296264291\") " pod="openshift-operators/observability-operator-d8bb48f5d-vg7c2" Dec 05 20:24:42 crc kubenswrapper[4904]: I1205 20:24:42.142297 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bdfbd597-qxbbm" Dec 05 20:24:42 crc kubenswrapper[4904]: I1205 20:24:42.186339 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/7d6cdfc6-4b45-4cc3-98c1-943ee5e7cf3b-openshift-service-ca\") pod \"perses-operator-5446b9c989-8pj8n\" (UID: \"7d6cdfc6-4b45-4cc3-98c1-943ee5e7cf3b\") " pod="openshift-operators/perses-operator-5446b9c989-8pj8n" Dec 05 20:24:42 crc kubenswrapper[4904]: I1205 20:24:42.186414 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq5f6\" (UniqueName: \"kubernetes.io/projected/7d6cdfc6-4b45-4cc3-98c1-943ee5e7cf3b-kube-api-access-qq5f6\") pod \"perses-operator-5446b9c989-8pj8n\" (UID: \"7d6cdfc6-4b45-4cc3-98c1-943ee5e7cf3b\") " pod="openshift-operators/perses-operator-5446b9c989-8pj8n" Dec 05 20:24:42 crc kubenswrapper[4904]: I1205 20:24:42.279026 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-vg7c2" Dec 05 20:24:42 crc kubenswrapper[4904]: I1205 20:24:42.288713 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/7d6cdfc6-4b45-4cc3-98c1-943ee5e7cf3b-openshift-service-ca\") pod \"perses-operator-5446b9c989-8pj8n\" (UID: \"7d6cdfc6-4b45-4cc3-98c1-943ee5e7cf3b\") " pod="openshift-operators/perses-operator-5446b9c989-8pj8n" Dec 05 20:24:42 crc kubenswrapper[4904]: I1205 20:24:42.288779 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq5f6\" (UniqueName: \"kubernetes.io/projected/7d6cdfc6-4b45-4cc3-98c1-943ee5e7cf3b-kube-api-access-qq5f6\") pod \"perses-operator-5446b9c989-8pj8n\" (UID: \"7d6cdfc6-4b45-4cc3-98c1-943ee5e7cf3b\") " pod="openshift-operators/perses-operator-5446b9c989-8pj8n" Dec 05 20:24:42 crc kubenswrapper[4904]: I1205 20:24:42.290103 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/7d6cdfc6-4b45-4cc3-98c1-943ee5e7cf3b-openshift-service-ca\") pod \"perses-operator-5446b9c989-8pj8n\" (UID: \"7d6cdfc6-4b45-4cc3-98c1-943ee5e7cf3b\") " pod="openshift-operators/perses-operator-5446b9c989-8pj8n" Dec 05 20:24:42 crc kubenswrapper[4904]: I1205 20:24:42.318721 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq5f6\" (UniqueName: \"kubernetes.io/projected/7d6cdfc6-4b45-4cc3-98c1-943ee5e7cf3b-kube-api-access-qq5f6\") pod \"perses-operator-5446b9c989-8pj8n\" (UID: \"7d6cdfc6-4b45-4cc3-98c1-943ee5e7cf3b\") " pod="openshift-operators/perses-operator-5446b9c989-8pj8n" Dec 05 20:24:42 crc kubenswrapper[4904]: I1205 20:24:42.392200 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-8pj8n" Dec 05 20:24:42 crc kubenswrapper[4904]: I1205 20:24:42.404656 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-52c6d"] Dec 05 20:24:42 crc kubenswrapper[4904]: W1205 20:24:42.413143 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc290c96d_3acd_4374_91d7_20efcef53eda.slice/crio-be77c189f71f5d39e06f966cd3bfd0e1030450bb795fc1b926c715238562d4b0 WatchSource:0}: Error finding container be77c189f71f5d39e06f966cd3bfd0e1030450bb795fc1b926c715238562d4b0: Status 404 returned error can't find the container with id be77c189f71f5d39e06f966cd3bfd0e1030450bb795fc1b926c715238562d4b0 Dec 05 20:24:42 crc kubenswrapper[4904]: I1205 20:24:42.678882 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-vg7c2"] Dec 05 20:24:42 crc kubenswrapper[4904]: W1205 20:24:42.680878 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14ec1958_2889_4fef_90ee_e73296264291.slice/crio-93739674786e973be23c7468cfe162e914a018bb89b59559951a142d335f3e49 WatchSource:0}: Error finding container 93739674786e973be23c7468cfe162e914a018bb89b59559951a142d335f3e49: Status 404 returned error can't find the container with id 93739674786e973be23c7468cfe162e914a018bb89b59559951a142d335f3e49 Dec 05 20:24:42 crc kubenswrapper[4904]: I1205 20:24:42.735275 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6bdfbd597-qxbbm"] Dec 05 20:24:42 crc kubenswrapper[4904]: W1205 20:24:42.740912 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b774de0_c1b7_43c1_86b5_b444cc0275d4.slice/crio-99c5aec63e5209358bf15aed034661b4b92186853175c9b7b784c9f64882a66c WatchSource:0}: Error finding container 99c5aec63e5209358bf15aed034661b4b92186853175c9b7b784c9f64882a66c: Status 404 returned error can't find the container with id 99c5aec63e5209358bf15aed034661b4b92186853175c9b7b784c9f64882a66c Dec 05 20:24:42 crc kubenswrapper[4904]: I1205 20:24:42.745891 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-52c6d" event={"ID":"c290c96d-3acd-4374-91d7-20efcef53eda","Type":"ContainerStarted","Data":"be77c189f71f5d39e06f966cd3bfd0e1030450bb795fc1b926c715238562d4b0"} Dec 05 20:24:42 crc kubenswrapper[4904]: I1205 20:24:42.747981 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-vg7c2" event={"ID":"14ec1958-2889-4fef-90ee-e73296264291","Type":"ContainerStarted","Data":"93739674786e973be23c7468cfe162e914a018bb89b59559951a142d335f3e49"} Dec 05 20:24:42 crc kubenswrapper[4904]: I1205 20:24:42.804768 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6bdfbd597-b7v44"] Dec 05 20:24:42 crc kubenswrapper[4904]: W1205 20:24:42.812563 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda54452d0_1ba5_4b81_aab4_2e2f2293fa6b.slice/crio-ac9a36e586e2bef531aa2e8173c08bee7500197fff9ed9040ebf7ad924876072 WatchSource:0}: Error finding container 
ac9a36e586e2bef531aa2e8173c08bee7500197fff9ed9040ebf7ad924876072: Status 404 returned error can't find the container with id ac9a36e586e2bef531aa2e8173c08bee7500197fff9ed9040ebf7ad924876072 Dec 05 20:24:42 crc kubenswrapper[4904]: I1205 20:24:42.912596 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-8pj8n"] Dec 05 20:24:43 crc kubenswrapper[4904]: I1205 20:24:43.754936 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bdfbd597-b7v44" event={"ID":"a54452d0-1ba5-4b81-aab4-2e2f2293fa6b","Type":"ContainerStarted","Data":"ac9a36e586e2bef531aa2e8173c08bee7500197fff9ed9040ebf7ad924876072"} Dec 05 20:24:43 crc kubenswrapper[4904]: I1205 20:24:43.756105 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bdfbd597-qxbbm" event={"ID":"6b774de0-c1b7-43c1-86b5-b444cc0275d4","Type":"ContainerStarted","Data":"99c5aec63e5209358bf15aed034661b4b92186853175c9b7b784c9f64882a66c"} Dec 05 20:24:43 crc kubenswrapper[4904]: I1205 20:24:43.756928 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-8pj8n" event={"ID":"7d6cdfc6-4b45-4cc3-98c1-943ee5e7cf3b","Type":"ContainerStarted","Data":"682b013598df279bbe9d0c0c9dba1e59ff707bbcd771664760a1ae2b15b7d6ff"} Dec 05 20:24:48 crc kubenswrapper[4904]: I1205 20:24:48.609823 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zvhpt" Dec 05 20:24:48 crc kubenswrapper[4904]: I1205 20:24:48.659204 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zvhpt" Dec 05 20:24:48 crc kubenswrapper[4904]: I1205 20:24:48.842413 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zvhpt"] Dec 05 20:24:49 crc kubenswrapper[4904]: I1205 20:24:49.821957 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zvhpt" podUID="d14e4db0-7b71-4e7f-90dc-b8acbdfb7055" containerName="registry-server" containerID="cri-o://c3766783663ca7404885410b66960da7d968bcead8a9e2820501a6a4333600d4" gracePeriod=2 Dec 05 20:24:50 crc kubenswrapper[4904]: I1205 20:24:50.835123 4904 generic.go:334] "Generic (PLEG): container finished" podID="d14e4db0-7b71-4e7f-90dc-b8acbdfb7055" containerID="c3766783663ca7404885410b66960da7d968bcead8a9e2820501a6a4333600d4" exitCode=0 Dec 05 20:24:50 crc kubenswrapper[4904]: I1205 20:24:50.835175 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvhpt" event={"ID":"d14e4db0-7b71-4e7f-90dc-b8acbdfb7055","Type":"ContainerDied","Data":"c3766783663ca7404885410b66960da7d968bcead8a9e2820501a6a4333600d4"} Dec 05 20:24:58 crc kubenswrapper[4904]: E1205 20:24:58.812608 4904 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c3766783663ca7404885410b66960da7d968bcead8a9e2820501a6a4333600d4 is running failed: container process not found" containerID="c3766783663ca7404885410b66960da7d968bcead8a9e2820501a6a4333600d4" cmd=["grpc_health_probe","-addr=:50051"] Dec 05 20:24:58 crc kubenswrapper[4904]: E1205 20:24:58.813327 4904 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if 
PID of c3766783663ca7404885410b66960da7d968bcead8a9e2820501a6a4333600d4 is running failed: container process not found" containerID="c3766783663ca7404885410b66960da7d968bcead8a9e2820501a6a4333600d4" cmd=["grpc_health_probe","-addr=:50051"] Dec 05 20:24:58 crc kubenswrapper[4904]: E1205 20:24:58.813783 4904 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c3766783663ca7404885410b66960da7d968bcead8a9e2820501a6a4333600d4 is running failed: container process not found" containerID="c3766783663ca7404885410b66960da7d968bcead8a9e2820501a6a4333600d4" cmd=["grpc_health_probe","-addr=:50051"] Dec 05 20:24:58 crc kubenswrapper[4904]: E1205 20:24:58.813819 4904 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c3766783663ca7404885410b66960da7d968bcead8a9e2820501a6a4333600d4 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-zvhpt" podUID="d14e4db0-7b71-4e7f-90dc-b8acbdfb7055" containerName="registry-server" Dec 05 20:24:59 crc kubenswrapper[4904]: I1205 20:24:59.956174 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:24:59 crc kubenswrapper[4904]: I1205 20:24:59.956229 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:25:03 crc kubenswrapper[4904]: E1205 20:25:03.275562 4904 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:ce7d2904f7b238aa37dfe74a0b76bf73629e7a14fa52bf54b0ecf030ca36f1bb" Dec 05 20:25:03 crc kubenswrapper[4904]: E1205 20:25:03.276077 4904 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:ce7d2904f7b238aa37dfe74a0b76bf73629e7a14fa52bf54b0ecf030ca36f1bb,Command:[],Args:[--namespace=$(NAMESPACE) --images=perses=$(RELATED_IMAGE_PERSES) --images=alertmanager=$(RELATED_IMAGE_ALERTMANAGER) --images=prometheus=$(RELATED_IMAGE_PROMETHEUS) --images=thanos=$(RELATED_IMAGE_THANOS) --images=ui-dashboards=$(RELATED_IMAGE_CONSOLE_DASHBOARDS_PLUGIN) --images=ui-distributed-tracing=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN) --images=ui-distributed-tracing-pf5=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF5) --images=ui-distributed-tracing-pf4=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF4) --images=ui-logging=$(RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN) --images=ui-logging-pf4=$(RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN_PF4) --images=ui-troubleshooting-panel=$(RELATED_IMAGE_CONSOLE_TROUBLESHOOTING_PANEL_PLUGIN) --images=ui-monitoring=$(RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN) --images=ui-monitoring-pf5=$(RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN_PF5) 
--images=korrel8r=$(RELATED_IMAGE_KORREL8R) --images=health-analyzer=$(RELATED_IMAGE_CLUSTER_HEALTH_ANALYZER) --openshift.enabled=true],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:RELATED_IMAGE_ALERTMANAGER,Value:registry.redhat.io/cluster-observability-operator/alertmanager-rhel9@sha256:e718854a7d6ca8accf0fa72db0eb902e46c44d747ad51dc3f06bba0cefaa3c01,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PROMETHEUS,Value:registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:17ea20be390a94ab39f5cdd7f0cbc2498046eebcf77fe3dec9aa288d5c2cf46b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_THANOS,Value:registry.redhat.io/cluster-observability-operator/thanos-rhel9@sha256:d972f4faa5e9c121402d23ed85002f26af48ec36b1b71a7489d677b3913d08b4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PERSES,Value:registry.redhat.io/cluster-observability-operator/perses-rhel9@sha256:91531137fc1dcd740e277e0f65e120a0176a16f788c14c27925b61aa0b792ade,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DASHBOARDS_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/dashboards-console-plugin-rhel9@sha256:a69da8bbca8a28dd2925f864d51cc31cf761b10532c553095ba40b242ef701cb,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-rhel9@sha256:897e1bfad1187062725b54d87107bd0155972257a50d8335dd29e1999b828a4f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF5,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-pf5-rhel9@sha256:95fe5b5746ca8c07ac9217ce2d8ac8e6afad17af210f9d8e0074df1310b209a8,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF4,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-pf4-rhel9@sha256:e9d9a89e4d8126a62b1852055482258ee528cac6398dd5d43ebad75ace0f33c9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/logging-console-plugin-rhel9@sha256:ec684a0645ceb917b019af7ddba68c3533416e356ab0d0320a30e75ca7ebb31b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN_PF4,Value:registry.redhat.io/cluster-observability-operator/logging-console-plugin-pf4-rhel9@sha256:3b9693fcde9b3a9494fb04735b1f7cfd0426f10be820fdc3f024175c0d3df1c9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_TROUBLESHOOTING_PANEL_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/troubleshooting-panel-console-plugin-rhel9@sha256:580606f194180accc8abba099e17a26dca7522ec6d233fa2fdd40312771703e3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/monitoring-console-plugin-rhel9@sha256:e03777be39e71701935059cd877603874a13ac94daa73219d4e5e545599d78a9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN_PF5,Value:registry.redhat.io/cluster-observability-operator/monitoring-console-plugin-pf5-rhel9@sha256:aa47256193cfd2877853878e1ae97d2ab8b8e5deae62b387cbfad02b284d379c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KORREL8R,Value:registry.redhat.io/cluster-observability-operator/korrel8r-rhel9@sha256:c595ff56b2cb85514bf4784db6ddb82e4e657e3e708a7fb695fc4997379a94d4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLUSTER_HEALTH_ANALYZER,Value:re
gistry.redhat.io/cluster-observability-operator/cluster-health-analyzer-rhel9@sha256:45a4ec2a519bcec99e886aa91596d5356a2414a2bd103baaef9fa7838c672eb2,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{400 -3} {} 400m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:observability-operator-tls,ReadOnly:true,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sts77,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000350000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod observability-operator-d8bb48f5d-vg7c2_openshift-operators(14ec1958-2889-4fef-90ee-e73296264291): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 20:25:03 crc kubenswrapper[4904]: E1205 20:25:03.277735 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/observability-operator-d8bb48f5d-vg7c2" podUID="14ec1958-2889-4fef-90ee-e73296264291" Dec 05 20:25:03 crc kubenswrapper[4904]: I1205 20:25:03.328301 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zvhpt" Dec 05 20:25:03 crc kubenswrapper[4904]: I1205 20:25:03.374348 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d14e4db0-7b71-4e7f-90dc-b8acbdfb7055-catalog-content\") pod \"d14e4db0-7b71-4e7f-90dc-b8acbdfb7055\" (UID: \"d14e4db0-7b71-4e7f-90dc-b8acbdfb7055\") " Dec 05 20:25:03 crc kubenswrapper[4904]: I1205 20:25:03.374574 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7np45\" (UniqueName: \"kubernetes.io/projected/d14e4db0-7b71-4e7f-90dc-b8acbdfb7055-kube-api-access-7np45\") pod \"d14e4db0-7b71-4e7f-90dc-b8acbdfb7055\" (UID: \"d14e4db0-7b71-4e7f-90dc-b8acbdfb7055\") " Dec 05 20:25:03 crc kubenswrapper[4904]: I1205 20:25:03.374610 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d14e4db0-7b71-4e7f-90dc-b8acbdfb7055-utilities\") pod \"d14e4db0-7b71-4e7f-90dc-b8acbdfb7055\" (UID: \"d14e4db0-7b71-4e7f-90dc-b8acbdfb7055\") " Dec 05 20:25:03 crc kubenswrapper[4904]: I1205 20:25:03.375511 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d14e4db0-7b71-4e7f-90dc-b8acbdfb7055-utilities" (OuterVolumeSpecName: "utilities") pod "d14e4db0-7b71-4e7f-90dc-b8acbdfb7055" (UID: "d14e4db0-7b71-4e7f-90dc-b8acbdfb7055"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:25:03 crc kubenswrapper[4904]: I1205 20:25:03.397263 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d14e4db0-7b71-4e7f-90dc-b8acbdfb7055-kube-api-access-7np45" (OuterVolumeSpecName: "kube-api-access-7np45") pod "d14e4db0-7b71-4e7f-90dc-b8acbdfb7055" (UID: "d14e4db0-7b71-4e7f-90dc-b8acbdfb7055"). InnerVolumeSpecName "kube-api-access-7np45". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:25:03 crc kubenswrapper[4904]: I1205 20:25:03.476326 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7np45\" (UniqueName: \"kubernetes.io/projected/d14e4db0-7b71-4e7f-90dc-b8acbdfb7055-kube-api-access-7np45\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:03 crc kubenswrapper[4904]: I1205 20:25:03.476353 4904 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d14e4db0-7b71-4e7f-90dc-b8acbdfb7055-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:03 crc kubenswrapper[4904]: I1205 20:25:03.489040 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d14e4db0-7b71-4e7f-90dc-b8acbdfb7055-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d14e4db0-7b71-4e7f-90dc-b8acbdfb7055" (UID: "d14e4db0-7b71-4e7f-90dc-b8acbdfb7055"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:25:03 crc kubenswrapper[4904]: I1205 20:25:03.577105 4904 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d14e4db0-7b71-4e7f-90dc-b8acbdfb7055-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:03 crc kubenswrapper[4904]: E1205 20:25:03.857212 4904 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/perses-rhel9-operator@sha256:9aec4c328ec43e40481e06ca5808deead74b75c0aacb90e9e72966c3fa14f385" Dec 05 20:25:03 crc kubenswrapper[4904]: E1205 20:25:03.857406 4904 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:perses-operator,Image:registry.redhat.io/cluster-observability-operator/perses-rhel9-operator@sha256:9aec4c328ec43e40481e06ca5808deead74b75c0aacb90e9e72966c3fa14f385,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{134217728 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:openshift-service-ca,ReadOnly:true,MountPath:/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qq5f6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000350000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod perses-operator-5446b9c989-8pj8n_openshift-operators(7d6cdfc6-4b45-4cc3-98c1-943ee5e7cf3b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 20:25:03 crc kubenswrapper[4904]: E1205 20:25:03.858637 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"perses-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" 
pod="openshift-operators/perses-operator-5446b9c989-8pj8n" podUID="7d6cdfc6-4b45-4cc3-98c1-943ee5e7cf3b" Dec 05 20:25:03 crc kubenswrapper[4904]: I1205 20:25:03.965843 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvhpt" event={"ID":"d14e4db0-7b71-4e7f-90dc-b8acbdfb7055","Type":"ContainerDied","Data":"5a55c3a96dc6585f17025165884d46107a44d7957bb393654f9cbd5b2b9cd4f4"} Dec 05 20:25:03 crc kubenswrapper[4904]: I1205 20:25:03.965913 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zvhpt" Dec 05 20:25:03 crc kubenswrapper[4904]: I1205 20:25:03.965917 4904 scope.go:117] "RemoveContainer" containerID="c3766783663ca7404885410b66960da7d968bcead8a9e2820501a6a4333600d4" Dec 05 20:25:03 crc kubenswrapper[4904]: E1205 20:25:03.981232 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"perses-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/perses-rhel9-operator@sha256:9aec4c328ec43e40481e06ca5808deead74b75c0aacb90e9e72966c3fa14f385\\\"\"" pod="openshift-operators/perses-operator-5446b9c989-8pj8n" podUID="7d6cdfc6-4b45-4cc3-98c1-943ee5e7cf3b" Dec 05 20:25:03 crc kubenswrapper[4904]: E1205 20:25:03.983910 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:ce7d2904f7b238aa37dfe74a0b76bf73629e7a14fa52bf54b0ecf030ca36f1bb\\\"\"" pod="openshift-operators/observability-operator-d8bb48f5d-vg7c2" podUID="14ec1958-2889-4fef-90ee-e73296264291" Dec 05 20:25:04 crc kubenswrapper[4904]: I1205 20:25:04.048912 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zvhpt"] Dec 05 20:25:04 crc kubenswrapper[4904]: I1205 20:25:04.053975 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zvhpt"] Dec 05 20:25:04 crc kubenswrapper[4904]: I1205 20:25:04.491371 4904 scope.go:117] "RemoveContainer" containerID="b9a288b480ffb634f8e49d8f051fba4a38375ae29039fecad17ce49258182e6b" Dec 05 20:25:04 crc kubenswrapper[4904]: E1205 20:25:04.495151 4904 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:203cf5b9dc1460f09e75f58d8b5cf7df5e57c18c8c6a41c14b5e8977d83263f3" Dec 05 20:25:04 crc kubenswrapper[4904]: E1205 20:25:04.495286 4904 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:203cf5b9dc1460f09e75f58d8b5cf7df5e57c18c8c6a41c14b5e8977d83263f3,Command:[],Args:[--prometheus-config-reloader=$(RELATED_IMAGE_PROMETHEUS_CONFIG_RELOADER) --prometheus-instance-selector=app.kubernetes.io/managed-by=observability-operator --alertmanager-instance-selector=app.kubernetes.io/managed-by=observability-operator 
--thanos-ruler-instance-selector=app.kubernetes.io/managed-by=observability-operator],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:GOGC,Value:30,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PROMETHEUS_CONFIG_RELOADER,Value:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-prometheus-config-reloader-rhel9@sha256:1133c973c7472c665f910a722e19c8e2e27accb34b90fab67f14548627ce9c62,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{157286400 0} {} 150Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cwdk6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-668cf9dfbb-52c6d_openshift-operators(c290c96d-3acd-4374-91d7-20efcef53eda): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 20:25:04 crc kubenswrapper[4904]: E1205 20:25:04.496406 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-52c6d" podUID="c290c96d-3acd-4374-91d7-20efcef53eda" Dec 05 20:25:04 crc kubenswrapper[4904]: I1205 20:25:04.513585 4904 scope.go:117] "RemoveContainer" containerID="bc9bd875a441d78bcb87aaf91163f4da9cdd295baa31787d22de29a9e3d8b8f9" Dec 05 20:25:04 crc kubenswrapper[4904]: I1205 20:25:04.973219 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bdfbd597-b7v44" event={"ID":"a54452d0-1ba5-4b81-aab4-2e2f2293fa6b","Type":"ContainerStarted","Data":"a0b389dc406fb332da56754156cac5192a4af9ef5127ba88218c9a66d3e1dc26"} Dec 05 20:25:04 crc kubenswrapper[4904]: I1205 20:25:04.975212 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bdfbd597-qxbbm" event={"ID":"6b774de0-c1b7-43c1-86b5-b444cc0275d4","Type":"ContainerStarted","Data":"0167a4780a3234573053cff0e1fe54d8e00288675d9b7489c7500cdc4a3ec32c"} Dec 05 20:25:04 crc kubenswrapper[4904]: E1205 20:25:04.977615 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:203cf5b9dc1460f09e75f58d8b5cf7df5e57c18c8c6a41c14b5e8977d83263f3\\\"\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-52c6d" podUID="c290c96d-3acd-4374-91d7-20efcef53eda" Dec 05 20:25:04 crc kubenswrapper[4904]: I1205 20:25:04.999415 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bdfbd597-b7v44" podStartSLOduration=2.3088443229999998 podStartE2EDuration="23.999392142s" podCreationTimestamp="2025-12-05 20:24:41 +0000 UTC" firstStartedPulling="2025-12-05 20:24:42.814816448 +0000 UTC m=+781.626032557" lastFinishedPulling="2025-12-05 20:25:04.505364267 +0000 UTC m=+803.316580376" observedRunningTime="2025-12-05 20:25:04.994460013 +0000 UTC m=+803.805676142" watchObservedRunningTime="2025-12-05 20:25:04.999392142 +0000 UTC m=+803.810608291" Dec 05 20:25:05 crc kubenswrapper[4904]: I1205 20:25:05.051974 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bdfbd597-qxbbm" podStartSLOduration=2.281481741 podStartE2EDuration="24.051956766s" podCreationTimestamp="2025-12-05 20:24:41 +0000 UTC" firstStartedPulling="2025-12-05 20:24:42.743951338 +0000 UTC m=+781.555167447" lastFinishedPulling="2025-12-05 20:25:04.514426363 +0000 UTC m=+803.325642472" observedRunningTime="2025-12-05 20:25:05.049075125 +0000 UTC m=+803.860291254" watchObservedRunningTime="2025-12-05 20:25:05.051956766 +0000 UTC m=+803.863172875" Dec 05 20:25:05 crc kubenswrapper[4904]: I1205 20:25:05.686401 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d14e4db0-7b71-4e7f-90dc-b8acbdfb7055" path="/var/lib/kubelet/pods/d14e4db0-7b71-4e7f-90dc-b8acbdfb7055/volumes" Dec 05 20:25:17 crc kubenswrapper[4904]: I1205 20:25:17.048884 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-vg7c2" event={"ID":"14ec1958-2889-4fef-90ee-e73296264291","Type":"ContainerStarted","Data":"9378528b286542854c0b962e4f60c3ef7506f2e1a6a2542e583eecbc02fc819a"} Dec 05 20:25:17 crc kubenswrapper[4904]: I1205 20:25:17.049661 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-d8bb48f5d-vg7c2" Dec 05 20:25:17 crc kubenswrapper[4904]: I1205 20:25:17.079572 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-d8bb48f5d-vg7c2" podStartSLOduration=2.244162277 podStartE2EDuration="36.079553422s" podCreationTimestamp="2025-12-05 20:24:41 +0000 UTC" firstStartedPulling="2025-12-05 20:24:42.683151712 +0000 UTC m=+781.494367821" lastFinishedPulling="2025-12-05 20:25:16.518542857 +0000 UTC m=+815.329758966" observedRunningTime="2025-12-05 20:25:17.075799946 +0000 UTC m=+815.887016075" watchObservedRunningTime="2025-12-05 20:25:17.079553422 +0000 UTC m=+815.890769531" Dec 05 20:25:17 crc kubenswrapper[4904]: I1205 20:25:17.097196 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-d8bb48f5d-vg7c2" Dec 05 20:25:19 crc kubenswrapper[4904]: I1205 20:25:19.061156 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-52c6d" 
event={"ID":"c290c96d-3acd-4374-91d7-20efcef53eda","Type":"ContainerStarted","Data":"61b2303c048d5b890b2ef0e7127faa1a09d348e7849eafff80fd209fb1510482"} Dec 05 20:25:19 crc kubenswrapper[4904]: I1205 20:25:19.077395 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-52c6d" podStartSLOduration=1.8888571079999998 podStartE2EDuration="38.077374763s" podCreationTimestamp="2025-12-05 20:24:41 +0000 UTC" firstStartedPulling="2025-12-05 20:24:42.430313535 +0000 UTC m=+781.241529644" lastFinishedPulling="2025-12-05 20:25:18.61883119 +0000 UTC m=+817.430047299" observedRunningTime="2025-12-05 20:25:19.075943172 +0000 UTC m=+817.887159291" watchObservedRunningTime="2025-12-05 20:25:19.077374763 +0000 UTC m=+817.888590872" Dec 05 20:25:20 crc kubenswrapper[4904]: I1205 20:25:20.069087 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-8pj8n" event={"ID":"7d6cdfc6-4b45-4cc3-98c1-943ee5e7cf3b","Type":"ContainerStarted","Data":"911d893a3316e67e574bb9a9f6b28aafe93805299217a7eb0a6861bc1a89039c"} Dec 05 20:25:20 crc kubenswrapper[4904]: I1205 20:25:20.069399 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-8pj8n" Dec 05 20:25:20 crc kubenswrapper[4904]: I1205 20:25:20.088388 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5446b9c989-8pj8n" podStartSLOduration=1.547277753 podStartE2EDuration="38.08836985s" podCreationTimestamp="2025-12-05 20:24:42 +0000 UTC" firstStartedPulling="2025-12-05 20:24:42.926403938 +0000 UTC m=+781.737620047" lastFinishedPulling="2025-12-05 20:25:19.467496035 +0000 UTC m=+818.278712144" observedRunningTime="2025-12-05 20:25:20.087195286 +0000 UTC m=+818.898411425" watchObservedRunningTime="2025-12-05 20:25:20.08836985 +0000 UTC m=+818.899585959" Dec 05 20:25:29 crc kubenswrapper[4904]: I1205 20:25:29.956098 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:25:29 crc kubenswrapper[4904]: I1205 20:25:29.957359 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:25:32 crc kubenswrapper[4904]: I1205 20:25:32.395535 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5446b9c989-8pj8n" Dec 05 20:25:51 crc kubenswrapper[4904]: I1205 20:25:51.474305 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmmlzw"] Dec 05 20:25:51 crc kubenswrapper[4904]: E1205 20:25:51.475086 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d14e4db0-7b71-4e7f-90dc-b8acbdfb7055" containerName="extract-content" Dec 05 20:25:51 crc kubenswrapper[4904]: I1205 20:25:51.475102 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="d14e4db0-7b71-4e7f-90dc-b8acbdfb7055" containerName="extract-content" Dec 05 20:25:51 crc kubenswrapper[4904]: E1205 
20:25:51.475117 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d14e4db0-7b71-4e7f-90dc-b8acbdfb7055" containerName="registry-server" Dec 05 20:25:51 crc kubenswrapper[4904]: I1205 20:25:51.475126 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="d14e4db0-7b71-4e7f-90dc-b8acbdfb7055" containerName="registry-server" Dec 05 20:25:51 crc kubenswrapper[4904]: E1205 20:25:51.475141 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d14e4db0-7b71-4e7f-90dc-b8acbdfb7055" containerName="extract-utilities" Dec 05 20:25:51 crc kubenswrapper[4904]: I1205 20:25:51.475150 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="d14e4db0-7b71-4e7f-90dc-b8acbdfb7055" containerName="extract-utilities" Dec 05 20:25:51 crc kubenswrapper[4904]: I1205 20:25:51.475279 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="d14e4db0-7b71-4e7f-90dc-b8acbdfb7055" containerName="registry-server" Dec 05 20:25:51 crc kubenswrapper[4904]: I1205 20:25:51.476231 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmmlzw" Dec 05 20:25:51 crc kubenswrapper[4904]: I1205 20:25:51.480332 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 05 20:25:51 crc kubenswrapper[4904]: I1205 20:25:51.484603 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmmlzw"] Dec 05 20:25:51 crc kubenswrapper[4904]: I1205 20:25:51.580144 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/397a4321-b8b2-4041-9219-a5f837937346-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmmlzw\" (UID: \"397a4321-b8b2-4041-9219-a5f837937346\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmmlzw" Dec 05 20:25:51 crc kubenswrapper[4904]: I1205 20:25:51.580228 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg7vt\" (UniqueName: \"kubernetes.io/projected/397a4321-b8b2-4041-9219-a5f837937346-kube-api-access-pg7vt\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmmlzw\" (UID: \"397a4321-b8b2-4041-9219-a5f837937346\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmmlzw" Dec 05 20:25:51 crc kubenswrapper[4904]: I1205 20:25:51.580293 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/397a4321-b8b2-4041-9219-a5f837937346-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmmlzw\" (UID: \"397a4321-b8b2-4041-9219-a5f837937346\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmmlzw" Dec 05 20:25:51 crc kubenswrapper[4904]: I1205 20:25:51.682795 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/397a4321-b8b2-4041-9219-a5f837937346-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmmlzw\" (UID: \"397a4321-b8b2-4041-9219-a5f837937346\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmmlzw" Dec 05 20:25:51 crc kubenswrapper[4904]: I1205 20:25:51.682874 4904 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg7vt\" (UniqueName: \"kubernetes.io/projected/397a4321-b8b2-4041-9219-a5f837937346-kube-api-access-pg7vt\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmmlzw\" (UID: \"397a4321-b8b2-4041-9219-a5f837937346\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmmlzw" Dec 05 20:25:51 crc kubenswrapper[4904]: I1205 20:25:51.682919 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/397a4321-b8b2-4041-9219-a5f837937346-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmmlzw\" (UID: \"397a4321-b8b2-4041-9219-a5f837937346\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmmlzw" Dec 05 20:25:51 crc kubenswrapper[4904]: I1205 20:25:51.683424 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/397a4321-b8b2-4041-9219-a5f837937346-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmmlzw\" (UID: \"397a4321-b8b2-4041-9219-a5f837937346\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmmlzw" Dec 05 20:25:51 crc kubenswrapper[4904]: I1205 20:25:51.683669 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/397a4321-b8b2-4041-9219-a5f837937346-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmmlzw\" (UID: \"397a4321-b8b2-4041-9219-a5f837937346\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmmlzw" Dec 05 20:25:51 crc kubenswrapper[4904]: I1205 20:25:51.703368 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg7vt\" (UniqueName: \"kubernetes.io/projected/397a4321-b8b2-4041-9219-a5f837937346-kube-api-access-pg7vt\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmmlzw\" (UID: \"397a4321-b8b2-4041-9219-a5f837937346\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmmlzw" Dec 05 20:25:51 crc kubenswrapper[4904]: I1205 20:25:51.794205 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmmlzw" Dec 05 20:25:52 crc kubenswrapper[4904]: I1205 20:25:52.045385 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmmlzw"] Dec 05 20:25:52 crc kubenswrapper[4904]: I1205 20:25:52.249593 4904 generic.go:334] "Generic (PLEG): container finished" podID="397a4321-b8b2-4041-9219-a5f837937346" containerID="d56f6f993834d35655511f8f317639da232e2f36473e31c746cb3aef322d2836" exitCode=0 Dec 05 20:25:52 crc kubenswrapper[4904]: I1205 20:25:52.249795 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmmlzw" event={"ID":"397a4321-b8b2-4041-9219-a5f837937346","Type":"ContainerDied","Data":"d56f6f993834d35655511f8f317639da232e2f36473e31c746cb3aef322d2836"} Dec 05 20:25:52 crc kubenswrapper[4904]: I1205 20:25:52.250735 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmmlzw" event={"ID":"397a4321-b8b2-4041-9219-a5f837937346","Type":"ContainerStarted","Data":"127a13d852ac3924fcb5ee4cfe77424d866ef6bcde6cfe98ba44a110bf6df430"} Dec 05 20:25:54 crc kubenswrapper[4904]: I1205 20:25:54.275032 4904 generic.go:334] "Generic (PLEG): container finished" podID="397a4321-b8b2-4041-9219-a5f837937346" containerID="08c578c53f3e631ca0386988436a1b0395d41a96ef615d7fe4211c437d21c2f1" exitCode=0 Dec 05 20:25:54 crc kubenswrapper[4904]: I1205 20:25:54.275102 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmmlzw" event={"ID":"397a4321-b8b2-4041-9219-a5f837937346","Type":"ContainerDied","Data":"08c578c53f3e631ca0386988436a1b0395d41a96ef615d7fe4211c437d21c2f1"} Dec 05 20:25:55 crc kubenswrapper[4904]: I1205 20:25:55.284346 4904 generic.go:334] "Generic (PLEG): container finished" podID="397a4321-b8b2-4041-9219-a5f837937346" containerID="1f331ad0d8a4ea3146d2fcfc2ae0f5d6c34fa50beee1f0df2f076eed5ebe03dc" exitCode=0 Dec 05 20:25:55 crc kubenswrapper[4904]: I1205 20:25:55.284493 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmmlzw" event={"ID":"397a4321-b8b2-4041-9219-a5f837937346","Type":"ContainerDied","Data":"1f331ad0d8a4ea3146d2fcfc2ae0f5d6c34fa50beee1f0df2f076eed5ebe03dc"} Dec 05 20:25:56 crc kubenswrapper[4904]: I1205 20:25:56.557515 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmmlzw" Dec 05 20:25:56 crc kubenswrapper[4904]: I1205 20:25:56.642887 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/397a4321-b8b2-4041-9219-a5f837937346-util\") pod \"397a4321-b8b2-4041-9219-a5f837937346\" (UID: \"397a4321-b8b2-4041-9219-a5f837937346\") " Dec 05 20:25:56 crc kubenswrapper[4904]: I1205 20:25:56.642939 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/397a4321-b8b2-4041-9219-a5f837937346-bundle\") pod \"397a4321-b8b2-4041-9219-a5f837937346\" (UID: \"397a4321-b8b2-4041-9219-a5f837937346\") " Dec 05 20:25:56 crc kubenswrapper[4904]: I1205 20:25:56.642993 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pg7vt\" (UniqueName: \"kubernetes.io/projected/397a4321-b8b2-4041-9219-a5f837937346-kube-api-access-pg7vt\") pod \"397a4321-b8b2-4041-9219-a5f837937346\" (UID: \"397a4321-b8b2-4041-9219-a5f837937346\") " Dec 05 20:25:56 crc kubenswrapper[4904]: I1205 20:25:56.648336 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/397a4321-b8b2-4041-9219-a5f837937346-kube-api-access-pg7vt" (OuterVolumeSpecName: "kube-api-access-pg7vt") pod "397a4321-b8b2-4041-9219-a5f837937346" (UID: "397a4321-b8b2-4041-9219-a5f837937346"). InnerVolumeSpecName "kube-api-access-pg7vt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:25:56 crc kubenswrapper[4904]: I1205 20:25:56.648722 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/397a4321-b8b2-4041-9219-a5f837937346-bundle" (OuterVolumeSpecName: "bundle") pod "397a4321-b8b2-4041-9219-a5f837937346" (UID: "397a4321-b8b2-4041-9219-a5f837937346"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:25:56 crc kubenswrapper[4904]: I1205 20:25:56.661013 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/397a4321-b8b2-4041-9219-a5f837937346-util" (OuterVolumeSpecName: "util") pod "397a4321-b8b2-4041-9219-a5f837937346" (UID: "397a4321-b8b2-4041-9219-a5f837937346"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:25:56 crc kubenswrapper[4904]: I1205 20:25:56.743826 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pg7vt\" (UniqueName: \"kubernetes.io/projected/397a4321-b8b2-4041-9219-a5f837937346-kube-api-access-pg7vt\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:56 crc kubenswrapper[4904]: I1205 20:25:56.743860 4904 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/397a4321-b8b2-4041-9219-a5f837937346-util\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:56 crc kubenswrapper[4904]: I1205 20:25:56.743868 4904 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/397a4321-b8b2-4041-9219-a5f837937346-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:57 crc kubenswrapper[4904]: I1205 20:25:57.300554 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmmlzw" event={"ID":"397a4321-b8b2-4041-9219-a5f837937346","Type":"ContainerDied","Data":"127a13d852ac3924fcb5ee4cfe77424d866ef6bcde6cfe98ba44a110bf6df430"} Dec 05 20:25:57 crc kubenswrapper[4904]: I1205 20:25:57.300596 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="127a13d852ac3924fcb5ee4cfe77424d866ef6bcde6cfe98ba44a110bf6df430" Dec 05 20:25:57 crc kubenswrapper[4904]: I1205 20:25:57.300613 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmmlzw" Dec 05 20:25:58 crc kubenswrapper[4904]: I1205 20:25:58.394004 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-xg947"] Dec 05 20:25:58 crc kubenswrapper[4904]: E1205 20:25:58.394291 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="397a4321-b8b2-4041-9219-a5f837937346" containerName="pull" Dec 05 20:25:58 crc kubenswrapper[4904]: I1205 20:25:58.394307 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="397a4321-b8b2-4041-9219-a5f837937346" containerName="pull" Dec 05 20:25:58 crc kubenswrapper[4904]: E1205 20:25:58.394324 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="397a4321-b8b2-4041-9219-a5f837937346" containerName="extract" Dec 05 20:25:58 crc kubenswrapper[4904]: I1205 20:25:58.394333 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="397a4321-b8b2-4041-9219-a5f837937346" containerName="extract" Dec 05 20:25:58 crc kubenswrapper[4904]: E1205 20:25:58.394343 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="397a4321-b8b2-4041-9219-a5f837937346" containerName="util" Dec 05 20:25:58 crc kubenswrapper[4904]: I1205 20:25:58.394350 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="397a4321-b8b2-4041-9219-a5f837937346" containerName="util" Dec 05 20:25:58 crc kubenswrapper[4904]: I1205 20:25:58.394465 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="397a4321-b8b2-4041-9219-a5f837937346" containerName="extract" Dec 05 20:25:58 crc kubenswrapper[4904]: I1205 20:25:58.394936 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-xg947" Dec 05 20:25:58 crc kubenswrapper[4904]: I1205 20:25:58.397503 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 05 20:25:58 crc kubenswrapper[4904]: I1205 20:25:58.397714 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 05 20:25:58 crc kubenswrapper[4904]: I1205 20:25:58.398673 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-skg5m" Dec 05 20:25:58 crc kubenswrapper[4904]: I1205 20:25:58.407550 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-xg947"] Dec 05 20:25:58 crc kubenswrapper[4904]: I1205 20:25:58.465236 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgdw4\" (UniqueName: \"kubernetes.io/projected/a66ea824-d482-41b2-8ddc-5ee70d24db5a-kube-api-access-jgdw4\") pod \"nmstate-operator-5b5b58f5c8-xg947\" (UID: \"a66ea824-d482-41b2-8ddc-5ee70d24db5a\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-xg947" Dec 05 20:25:58 crc kubenswrapper[4904]: I1205 20:25:58.566412 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgdw4\" (UniqueName: \"kubernetes.io/projected/a66ea824-d482-41b2-8ddc-5ee70d24db5a-kube-api-access-jgdw4\") pod \"nmstate-operator-5b5b58f5c8-xg947\" (UID: \"a66ea824-d482-41b2-8ddc-5ee70d24db5a\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-xg947" Dec 05 20:25:58 crc kubenswrapper[4904]: I1205 20:25:58.590308 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgdw4\" (UniqueName: \"kubernetes.io/projected/a66ea824-d482-41b2-8ddc-5ee70d24db5a-kube-api-access-jgdw4\") pod \"nmstate-operator-5b5b58f5c8-xg947\" (UID: \"a66ea824-d482-41b2-8ddc-5ee70d24db5a\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-xg947" Dec 05 20:25:58 crc kubenswrapper[4904]: I1205 20:25:58.710832 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-xg947" Dec 05 20:25:59 crc kubenswrapper[4904]: I1205 20:25:59.135155 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-xg947"] Dec 05 20:25:59 crc kubenswrapper[4904]: W1205 20:25:59.145229 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda66ea824_d482_41b2_8ddc_5ee70d24db5a.slice/crio-5ce4c9ac0dc5d305ca72096c38af0a609aa40d2665448c4ee084c6829523995e WatchSource:0}: Error finding container 5ce4c9ac0dc5d305ca72096c38af0a609aa40d2665448c4ee084c6829523995e: Status 404 returned error can't find the container with id 5ce4c9ac0dc5d305ca72096c38af0a609aa40d2665448c4ee084c6829523995e Dec 05 20:25:59 crc kubenswrapper[4904]: I1205 20:25:59.312284 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-xg947" event={"ID":"a66ea824-d482-41b2-8ddc-5ee70d24db5a","Type":"ContainerStarted","Data":"5ce4c9ac0dc5d305ca72096c38af0a609aa40d2665448c4ee084c6829523995e"} Dec 05 20:25:59 crc kubenswrapper[4904]: I1205 20:25:59.955822 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:25:59 crc kubenswrapper[4904]: I1205 20:25:59.955873 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:25:59 crc kubenswrapper[4904]: I1205 20:25:59.955911 4904 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" Dec 05 20:25:59 crc kubenswrapper[4904]: I1205 20:25:59.956454 4904 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8714f5a82d22d92591b2585f5d959b3d2c6d45de1c2f0d4263e0b6b75477d050"} pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 20:25:59 crc kubenswrapper[4904]: I1205 20:25:59.956510 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" containerID="cri-o://8714f5a82d22d92591b2585f5d959b3d2c6d45de1c2f0d4263e0b6b75477d050" gracePeriod=600 Dec 05 20:26:00 crc kubenswrapper[4904]: I1205 20:26:00.333901 4904 generic.go:334] "Generic (PLEG): container finished" podID="1cc24b64-e25f-4b55-9123-295388685e7a" containerID="8714f5a82d22d92591b2585f5d959b3d2c6d45de1c2f0d4263e0b6b75477d050" exitCode=0 Dec 05 20:26:00 crc kubenswrapper[4904]: I1205 20:26:00.333982 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" event={"ID":"1cc24b64-e25f-4b55-9123-295388685e7a","Type":"ContainerDied","Data":"8714f5a82d22d92591b2585f5d959b3d2c6d45de1c2f0d4263e0b6b75477d050"} Dec 05 20:26:00 crc kubenswrapper[4904]: I1205 
20:26:00.334328 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" event={"ID":"1cc24b64-e25f-4b55-9123-295388685e7a","Type":"ContainerStarted","Data":"3a6fee22a5899dc7491b94f9ec1bfdefcfd49d1d911077fc82ba25da5a725a66"} Dec 05 20:26:00 crc kubenswrapper[4904]: I1205 20:26:00.334386 4904 scope.go:117] "RemoveContainer" containerID="4acc0855bb68cb8c598e7e35234dd87f785f666d2488ee58dffc84313152e984" Dec 05 20:26:02 crc kubenswrapper[4904]: I1205 20:26:02.358909 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-xg947" event={"ID":"a66ea824-d482-41b2-8ddc-5ee70d24db5a","Type":"ContainerStarted","Data":"ffe84b4ce27a701971aedf35f6d4bfaf370af0b364be706aba86ab0da6bd969f"} Dec 05 20:26:02 crc kubenswrapper[4904]: I1205 20:26:02.393231 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-xg947" podStartSLOduration=2.410891566 podStartE2EDuration="4.39320959s" podCreationTimestamp="2025-12-05 20:25:58 +0000 UTC" firstStartedPulling="2025-12-05 20:25:59.148048939 +0000 UTC m=+857.959265048" lastFinishedPulling="2025-12-05 20:26:01.130366963 +0000 UTC m=+859.941583072" observedRunningTime="2025-12-05 20:26:02.381368075 +0000 UTC m=+861.192584224" watchObservedRunningTime="2025-12-05 20:26:02.39320959 +0000 UTC m=+861.204425699" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.316417 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-94lfj"] Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.317507 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-94lfj" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.322666 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-jjjrr" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.329694 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-8qf2w"] Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.330662 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-8qf2w" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.338398 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjc7t\" (UniqueName: \"kubernetes.io/projected/b05e6bce-1fed-411b-9c7d-ea32260cb8dc-kube-api-access-cjc7t\") pod \"nmstate-metrics-7f946cbc9-94lfj\" (UID: \"b05e6bce-1fed-411b-9c7d-ea32260cb8dc\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-94lfj" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.338460 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/b9bbd6ea-7829-4dfe-b4e0-1dcda3fe0903-dbus-socket\") pod \"nmstate-handler-8qf2w\" (UID: \"b9bbd6ea-7829-4dfe-b4e0-1dcda3fe0903\") " pod="openshift-nmstate/nmstate-handler-8qf2w" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.338531 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/b9bbd6ea-7829-4dfe-b4e0-1dcda3fe0903-nmstate-lock\") pod \"nmstate-handler-8qf2w\" (UID: \"b9bbd6ea-7829-4dfe-b4e0-1dcda3fe0903\") " pod="openshift-nmstate/nmstate-handler-8qf2w" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.338562 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/b9bbd6ea-7829-4dfe-b4e0-1dcda3fe0903-ovs-socket\") pod \"nmstate-handler-8qf2w\" (UID: \"b9bbd6ea-7829-4dfe-b4e0-1dcda3fe0903\") " pod="openshift-nmstate/nmstate-handler-8qf2w" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.338614 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvpdc\" (UniqueName: \"kubernetes.io/projected/b9bbd6ea-7829-4dfe-b4e0-1dcda3fe0903-kube-api-access-kvpdc\") pod \"nmstate-handler-8qf2w\" (UID: \"b9bbd6ea-7829-4dfe-b4e0-1dcda3fe0903\") " pod="openshift-nmstate/nmstate-handler-8qf2w" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.343438 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-94lfj"] Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.352503 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7dl4s"] Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.353368 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7dl4s" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.355146 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.368045 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7dl4s"] Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.445566 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvpdc\" (UniqueName: \"kubernetes.io/projected/b9bbd6ea-7829-4dfe-b4e0-1dcda3fe0903-kube-api-access-kvpdc\") pod \"nmstate-handler-8qf2w\" (UID: \"b9bbd6ea-7829-4dfe-b4e0-1dcda3fe0903\") " pod="openshift-nmstate/nmstate-handler-8qf2w" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.445647 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b57lc\" (UniqueName: \"kubernetes.io/projected/1f407364-e4d6-4506-abaa-f4e3ae5ab29f-kube-api-access-b57lc\") pod \"nmstate-webhook-5f6d4c5ccb-7dl4s\" (UID: \"1f407364-e4d6-4506-abaa-f4e3ae5ab29f\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7dl4s" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.445705 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjc7t\" (UniqueName: \"kubernetes.io/projected/b05e6bce-1fed-411b-9c7d-ea32260cb8dc-kube-api-access-cjc7t\") pod \"nmstate-metrics-7f946cbc9-94lfj\" (UID: \"b05e6bce-1fed-411b-9c7d-ea32260cb8dc\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-94lfj" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.445742 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/b9bbd6ea-7829-4dfe-b4e0-1dcda3fe0903-dbus-socket\") pod \"nmstate-handler-8qf2w\" (UID: \"b9bbd6ea-7829-4dfe-b4e0-1dcda3fe0903\") " pod="openshift-nmstate/nmstate-handler-8qf2w" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.445790 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/1f407364-e4d6-4506-abaa-f4e3ae5ab29f-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-7dl4s\" (UID: \"1f407364-e4d6-4506-abaa-f4e3ae5ab29f\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7dl4s" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.445822 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/b9bbd6ea-7829-4dfe-b4e0-1dcda3fe0903-nmstate-lock\") pod \"nmstate-handler-8qf2w\" (UID: \"b9bbd6ea-7829-4dfe-b4e0-1dcda3fe0903\") " pod="openshift-nmstate/nmstate-handler-8qf2w" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.445858 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/b9bbd6ea-7829-4dfe-b4e0-1dcda3fe0903-ovs-socket\") pod \"nmstate-handler-8qf2w\" (UID: \"b9bbd6ea-7829-4dfe-b4e0-1dcda3fe0903\") " pod="openshift-nmstate/nmstate-handler-8qf2w" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.445956 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/b9bbd6ea-7829-4dfe-b4e0-1dcda3fe0903-ovs-socket\") pod \"nmstate-handler-8qf2w\" (UID: \"b9bbd6ea-7829-4dfe-b4e0-1dcda3fe0903\") 
" pod="openshift-nmstate/nmstate-handler-8qf2w" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.446360 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/b9bbd6ea-7829-4dfe-b4e0-1dcda3fe0903-nmstate-lock\") pod \"nmstate-handler-8qf2w\" (UID: \"b9bbd6ea-7829-4dfe-b4e0-1dcda3fe0903\") " pod="openshift-nmstate/nmstate-handler-8qf2w" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.450714 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/b9bbd6ea-7829-4dfe-b4e0-1dcda3fe0903-dbus-socket\") pod \"nmstate-handler-8qf2w\" (UID: \"b9bbd6ea-7829-4dfe-b4e0-1dcda3fe0903\") " pod="openshift-nmstate/nmstate-handler-8qf2w" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.460870 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-lqpqd"] Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.461778 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-lqpqd" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.464088 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-qnlwt" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.464283 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.465659 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-lqpqd"] Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.468511 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.474605 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjc7t\" (UniqueName: \"kubernetes.io/projected/b05e6bce-1fed-411b-9c7d-ea32260cb8dc-kube-api-access-cjc7t\") pod \"nmstate-metrics-7f946cbc9-94lfj\" (UID: \"b05e6bce-1fed-411b-9c7d-ea32260cb8dc\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-94lfj" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.478765 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvpdc\" (UniqueName: \"kubernetes.io/projected/b9bbd6ea-7829-4dfe-b4e0-1dcda3fe0903-kube-api-access-kvpdc\") pod \"nmstate-handler-8qf2w\" (UID: \"b9bbd6ea-7829-4dfe-b4e0-1dcda3fe0903\") " pod="openshift-nmstate/nmstate-handler-8qf2w" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.546694 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmhmn\" (UniqueName: \"kubernetes.io/projected/ffe1cf37-52b4-4493-bf5b-f0318a5015a9-kube-api-access-bmhmn\") pod \"nmstate-console-plugin-7fbb5f6569-lqpqd\" (UID: \"ffe1cf37-52b4-4493-bf5b-f0318a5015a9\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-lqpqd" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.546742 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b57lc\" (UniqueName: \"kubernetes.io/projected/1f407364-e4d6-4506-abaa-f4e3ae5ab29f-kube-api-access-b57lc\") pod \"nmstate-webhook-5f6d4c5ccb-7dl4s\" (UID: \"1f407364-e4d6-4506-abaa-f4e3ae5ab29f\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7dl4s" Dec 05 20:26:03 
crc kubenswrapper[4904]: I1205 20:26:03.546787 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ffe1cf37-52b4-4493-bf5b-f0318a5015a9-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-lqpqd\" (UID: \"ffe1cf37-52b4-4493-bf5b-f0318a5015a9\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-lqpqd" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.546809 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ffe1cf37-52b4-4493-bf5b-f0318a5015a9-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-lqpqd\" (UID: \"ffe1cf37-52b4-4493-bf5b-f0318a5015a9\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-lqpqd" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.546826 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/1f407364-e4d6-4506-abaa-f4e3ae5ab29f-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-7dl4s\" (UID: \"1f407364-e4d6-4506-abaa-f4e3ae5ab29f\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7dl4s" Dec 05 20:26:03 crc kubenswrapper[4904]: E1205 20:26:03.547010 4904 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Dec 05 20:26:03 crc kubenswrapper[4904]: E1205 20:26:03.547104 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f407364-e4d6-4506-abaa-f4e3ae5ab29f-tls-key-pair podName:1f407364-e4d6-4506-abaa-f4e3ae5ab29f nodeName:}" failed. No retries permitted until 2025-12-05 20:26:04.047082147 +0000 UTC m=+862.858298256 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/1f407364-e4d6-4506-abaa-f4e3ae5ab29f-tls-key-pair") pod "nmstate-webhook-5f6d4c5ccb-7dl4s" (UID: "1f407364-e4d6-4506-abaa-f4e3ae5ab29f") : secret "openshift-nmstate-webhook" not found Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.562519 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b57lc\" (UniqueName: \"kubernetes.io/projected/1f407364-e4d6-4506-abaa-f4e3ae5ab29f-kube-api-access-b57lc\") pod \"nmstate-webhook-5f6d4c5ccb-7dl4s\" (UID: \"1f407364-e4d6-4506-abaa-f4e3ae5ab29f\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7dl4s" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.636255 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-94lfj" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.648891 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ffe1cf37-52b4-4493-bf5b-f0318a5015a9-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-lqpqd\" (UID: \"ffe1cf37-52b4-4493-bf5b-f0318a5015a9\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-lqpqd" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.649002 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmhmn\" (UniqueName: \"kubernetes.io/projected/ffe1cf37-52b4-4493-bf5b-f0318a5015a9-kube-api-access-bmhmn\") pod \"nmstate-console-plugin-7fbb5f6569-lqpqd\" (UID: \"ffe1cf37-52b4-4493-bf5b-f0318a5015a9\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-lqpqd" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.649174 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ffe1cf37-52b4-4493-bf5b-f0318a5015a9-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-lqpqd\" (UID: \"ffe1cf37-52b4-4493-bf5b-f0318a5015a9\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-lqpqd" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.649898 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ffe1cf37-52b4-4493-bf5b-f0318a5015a9-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-lqpqd\" (UID: \"ffe1cf37-52b4-4493-bf5b-f0318a5015a9\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-lqpqd" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.652501 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-8qf2w" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.658171 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ffe1cf37-52b4-4493-bf5b-f0318a5015a9-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-lqpqd\" (UID: \"ffe1cf37-52b4-4493-bf5b-f0318a5015a9\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-lqpqd" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.683390 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmhmn\" (UniqueName: \"kubernetes.io/projected/ffe1cf37-52b4-4493-bf5b-f0318a5015a9-kube-api-access-bmhmn\") pod \"nmstate-console-plugin-7fbb5f6569-lqpqd\" (UID: \"ffe1cf37-52b4-4493-bf5b-f0318a5015a9\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-lqpqd" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.704114 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-788668d987-f9cth"] Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.705031 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-788668d987-f9cth" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.717565 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-788668d987-f9cth"] Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.749939 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3da0e4e2-34f7-4153-86c3-80c475e94498-console-serving-cert\") pod \"console-788668d987-f9cth\" (UID: \"3da0e4e2-34f7-4153-86c3-80c475e94498\") " pod="openshift-console/console-788668d987-f9cth" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.749999 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3da0e4e2-34f7-4153-86c3-80c475e94498-console-config\") pod \"console-788668d987-f9cth\" (UID: \"3da0e4e2-34f7-4153-86c3-80c475e94498\") " pod="openshift-console/console-788668d987-f9cth" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.750179 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3da0e4e2-34f7-4153-86c3-80c475e94498-service-ca\") pod \"console-788668d987-f9cth\" (UID: \"3da0e4e2-34f7-4153-86c3-80c475e94498\") " pod="openshift-console/console-788668d987-f9cth" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.750498 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3da0e4e2-34f7-4153-86c3-80c475e94498-trusted-ca-bundle\") pod \"console-788668d987-f9cth\" (UID: \"3da0e4e2-34f7-4153-86c3-80c475e94498\") " pod="openshift-console/console-788668d987-f9cth" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.750585 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9nvz\" (UniqueName: \"kubernetes.io/projected/3da0e4e2-34f7-4153-86c3-80c475e94498-kube-api-access-d9nvz\") pod \"console-788668d987-f9cth\" (UID: \"3da0e4e2-34f7-4153-86c3-80c475e94498\") " pod="openshift-console/console-788668d987-f9cth" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.750614 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3da0e4e2-34f7-4153-86c3-80c475e94498-console-oauth-config\") pod \"console-788668d987-f9cth\" (UID: \"3da0e4e2-34f7-4153-86c3-80c475e94498\") " pod="openshift-console/console-788668d987-f9cth" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.750657 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3da0e4e2-34f7-4153-86c3-80c475e94498-oauth-serving-cert\") pod \"console-788668d987-f9cth\" (UID: \"3da0e4e2-34f7-4153-86c3-80c475e94498\") " pod="openshift-console/console-788668d987-f9cth" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.815090 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-lqpqd" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.851823 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3da0e4e2-34f7-4153-86c3-80c475e94498-service-ca\") pod \"console-788668d987-f9cth\" (UID: \"3da0e4e2-34f7-4153-86c3-80c475e94498\") " pod="openshift-console/console-788668d987-f9cth" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.851887 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3da0e4e2-34f7-4153-86c3-80c475e94498-trusted-ca-bundle\") pod \"console-788668d987-f9cth\" (UID: \"3da0e4e2-34f7-4153-86c3-80c475e94498\") " pod="openshift-console/console-788668d987-f9cth" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.851927 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9nvz\" (UniqueName: \"kubernetes.io/projected/3da0e4e2-34f7-4153-86c3-80c475e94498-kube-api-access-d9nvz\") pod \"console-788668d987-f9cth\" (UID: \"3da0e4e2-34f7-4153-86c3-80c475e94498\") " pod="openshift-console/console-788668d987-f9cth" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.851959 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3da0e4e2-34f7-4153-86c3-80c475e94498-console-oauth-config\") pod \"console-788668d987-f9cth\" (UID: \"3da0e4e2-34f7-4153-86c3-80c475e94498\") " pod="openshift-console/console-788668d987-f9cth" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.852001 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3da0e4e2-34f7-4153-86c3-80c475e94498-oauth-serving-cert\") pod \"console-788668d987-f9cth\" (UID: \"3da0e4e2-34f7-4153-86c3-80c475e94498\") " pod="openshift-console/console-788668d987-f9cth" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.852028 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3da0e4e2-34f7-4153-86c3-80c475e94498-console-serving-cert\") pod \"console-788668d987-f9cth\" (UID: \"3da0e4e2-34f7-4153-86c3-80c475e94498\") " pod="openshift-console/console-788668d987-f9cth" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.852125 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3da0e4e2-34f7-4153-86c3-80c475e94498-console-config\") pod \"console-788668d987-f9cth\" (UID: \"3da0e4e2-34f7-4153-86c3-80c475e94498\") " pod="openshift-console/console-788668d987-f9cth" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.853419 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3da0e4e2-34f7-4153-86c3-80c475e94498-console-config\") pod \"console-788668d987-f9cth\" (UID: \"3da0e4e2-34f7-4153-86c3-80c475e94498\") " pod="openshift-console/console-788668d987-f9cth" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.854029 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3da0e4e2-34f7-4153-86c3-80c475e94498-service-ca\") pod \"console-788668d987-f9cth\" (UID: \"3da0e4e2-34f7-4153-86c3-80c475e94498\") " 
pod="openshift-console/console-788668d987-f9cth" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.855924 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3da0e4e2-34f7-4153-86c3-80c475e94498-trusted-ca-bundle\") pod \"console-788668d987-f9cth\" (UID: \"3da0e4e2-34f7-4153-86c3-80c475e94498\") " pod="openshift-console/console-788668d987-f9cth" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.865373 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3da0e4e2-34f7-4153-86c3-80c475e94498-oauth-serving-cert\") pod \"console-788668d987-f9cth\" (UID: \"3da0e4e2-34f7-4153-86c3-80c475e94498\") " pod="openshift-console/console-788668d987-f9cth" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.873849 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3da0e4e2-34f7-4153-86c3-80c475e94498-console-serving-cert\") pod \"console-788668d987-f9cth\" (UID: \"3da0e4e2-34f7-4153-86c3-80c475e94498\") " pod="openshift-console/console-788668d987-f9cth" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.874394 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3da0e4e2-34f7-4153-86c3-80c475e94498-console-oauth-config\") pod \"console-788668d987-f9cth\" (UID: \"3da0e4e2-34f7-4153-86c3-80c475e94498\") " pod="openshift-console/console-788668d987-f9cth" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.887613 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9nvz\" (UniqueName: \"kubernetes.io/projected/3da0e4e2-34f7-4153-86c3-80c475e94498-kube-api-access-d9nvz\") pod \"console-788668d987-f9cth\" (UID: \"3da0e4e2-34f7-4153-86c3-80c475e94498\") " pod="openshift-console/console-788668d987-f9cth" Dec 05 20:26:03 crc kubenswrapper[4904]: I1205 20:26:03.978047 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-94lfj"] Dec 05 20:26:04 crc kubenswrapper[4904]: I1205 20:26:04.025028 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-788668d987-f9cth" Dec 05 20:26:04 crc kubenswrapper[4904]: I1205 20:26:04.058605 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/1f407364-e4d6-4506-abaa-f4e3ae5ab29f-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-7dl4s\" (UID: \"1f407364-e4d6-4506-abaa-f4e3ae5ab29f\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7dl4s" Dec 05 20:26:04 crc kubenswrapper[4904]: I1205 20:26:04.063892 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/1f407364-e4d6-4506-abaa-f4e3ae5ab29f-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-7dl4s\" (UID: \"1f407364-e4d6-4506-abaa-f4e3ae5ab29f\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7dl4s" Dec 05 20:26:04 crc kubenswrapper[4904]: I1205 20:26:04.066933 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-lqpqd"] Dec 05 20:26:04 crc kubenswrapper[4904]: W1205 20:26:04.074337 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffe1cf37_52b4_4493_bf5b_f0318a5015a9.slice/crio-40b10aac7b3e68c5d9089b899895e4b0a467a7ba0854ef54001e6a3e2332aa28 WatchSource:0}: Error finding container 40b10aac7b3e68c5d9089b899895e4b0a467a7ba0854ef54001e6a3e2332aa28: Status 404 returned error can't find the container with id 40b10aac7b3e68c5d9089b899895e4b0a467a7ba0854ef54001e6a3e2332aa28 Dec 05 20:26:04 crc kubenswrapper[4904]: I1205 20:26:04.192737 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-788668d987-f9cth"] Dec 05 20:26:04 crc kubenswrapper[4904]: W1205 20:26:04.195334 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3da0e4e2_34f7_4153_86c3_80c475e94498.slice/crio-55f494248382066fdf01e63816b2751381c98642a49a646090ab131abd1ea2ae WatchSource:0}: Error finding container 55f494248382066fdf01e63816b2751381c98642a49a646090ab131abd1ea2ae: Status 404 returned error can't find the container with id 55f494248382066fdf01e63816b2751381c98642a49a646090ab131abd1ea2ae Dec 05 20:26:04 crc kubenswrapper[4904]: I1205 20:26:04.275445 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7dl4s" Dec 05 20:26:04 crc kubenswrapper[4904]: I1205 20:26:04.371298 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-lqpqd" event={"ID":"ffe1cf37-52b4-4493-bf5b-f0318a5015a9","Type":"ContainerStarted","Data":"40b10aac7b3e68c5d9089b899895e4b0a467a7ba0854ef54001e6a3e2332aa28"} Dec 05 20:26:04 crc kubenswrapper[4904]: I1205 20:26:04.373852 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-788668d987-f9cth" event={"ID":"3da0e4e2-34f7-4153-86c3-80c475e94498","Type":"ContainerStarted","Data":"3c41ac316cefd4fe7f56a23f59d5ee266b6e2ebcb4744938d38f1c336caddfa4"} Dec 05 20:26:04 crc kubenswrapper[4904]: I1205 20:26:04.373882 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-788668d987-f9cth" event={"ID":"3da0e4e2-34f7-4153-86c3-80c475e94498","Type":"ContainerStarted","Data":"55f494248382066fdf01e63816b2751381c98642a49a646090ab131abd1ea2ae"} Dec 05 20:26:04 crc kubenswrapper[4904]: I1205 20:26:04.381260 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-8qf2w" event={"ID":"b9bbd6ea-7829-4dfe-b4e0-1dcda3fe0903","Type":"ContainerStarted","Data":"d7ebd73a1fc515dfd99ad7e552ef64612b1b174d931917b2567e570f4a717d87"} Dec 05 20:26:04 crc kubenswrapper[4904]: I1205 20:26:04.382459 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-94lfj" event={"ID":"b05e6bce-1fed-411b-9c7d-ea32260cb8dc","Type":"ContainerStarted","Data":"98962ba63414cadc0dc698d3a98a72776b258c77f0b4b93ed429178b417d1d56"} Dec 05 20:26:04 crc kubenswrapper[4904]: I1205 20:26:04.402476 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-788668d987-f9cth" podStartSLOduration=1.402457464 podStartE2EDuration="1.402457464s" podCreationTimestamp="2025-12-05 20:26:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:26:04.399633796 +0000 UTC m=+863.210849925" watchObservedRunningTime="2025-12-05 20:26:04.402457464 +0000 UTC m=+863.213673563" Dec 05 20:26:04 crc kubenswrapper[4904]: I1205 20:26:04.475897 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7dl4s"] Dec 05 20:26:04 crc kubenswrapper[4904]: W1205 20:26:04.482482 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f407364_e4d6_4506_abaa_f4e3ae5ab29f.slice/crio-35be1dff7738a8cceb7e1a4140ca85f05d8ed62157678982c5f3cc6f83e6ec31 WatchSource:0}: Error finding container 35be1dff7738a8cceb7e1a4140ca85f05d8ed62157678982c5f3cc6f83e6ec31: Status 404 returned error can't find the container with id 35be1dff7738a8cceb7e1a4140ca85f05d8ed62157678982c5f3cc6f83e6ec31 Dec 05 20:26:05 crc kubenswrapper[4904]: I1205 20:26:05.390182 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7dl4s" event={"ID":"1f407364-e4d6-4506-abaa-f4e3ae5ab29f","Type":"ContainerStarted","Data":"35be1dff7738a8cceb7e1a4140ca85f05d8ed62157678982c5f3cc6f83e6ec31"} Dec 05 20:26:07 crc kubenswrapper[4904]: I1205 20:26:07.403099 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-94lfj" 
event={"ID":"b05e6bce-1fed-411b-9c7d-ea32260cb8dc","Type":"ContainerStarted","Data":"c32b4f77d3e2ee4377aa90fb3468a419538ebfb5aca20bd5b38a5022c92679b5"} Dec 05 20:26:07 crc kubenswrapper[4904]: I1205 20:26:07.404552 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-lqpqd" event={"ID":"ffe1cf37-52b4-4493-bf5b-f0318a5015a9","Type":"ContainerStarted","Data":"1a18ea470782e84a57d8efe60139980467bf6eb5d2368dd39d6f4b12e3c3e18d"} Dec 05 20:26:07 crc kubenswrapper[4904]: I1205 20:26:07.406340 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-8qf2w" event={"ID":"b9bbd6ea-7829-4dfe-b4e0-1dcda3fe0903","Type":"ContainerStarted","Data":"fadaeb8d354177ad89ecb362c9ad60c90fd3a8e8742e0a22d399e88341cae3eb"} Dec 05 20:26:07 crc kubenswrapper[4904]: I1205 20:26:07.406783 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-8qf2w" Dec 05 20:26:07 crc kubenswrapper[4904]: I1205 20:26:07.408483 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7dl4s" event={"ID":"1f407364-e4d6-4506-abaa-f4e3ae5ab29f","Type":"ContainerStarted","Data":"487260f89647b729f08df781a7147cc7c571091297f8f9ad1c83565ccba7c659"} Dec 05 20:26:07 crc kubenswrapper[4904]: I1205 20:26:07.408669 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7dl4s" Dec 05 20:26:07 crc kubenswrapper[4904]: I1205 20:26:07.421709 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-lqpqd" podStartSLOduration=1.495895287 podStartE2EDuration="4.421693097s" podCreationTimestamp="2025-12-05 20:26:03 +0000 UTC" firstStartedPulling="2025-12-05 20:26:04.077529139 +0000 UTC m=+862.888745248" lastFinishedPulling="2025-12-05 20:26:07.003326949 +0000 UTC m=+865.814543058" observedRunningTime="2025-12-05 20:26:07.418213132 +0000 UTC m=+866.229429241" watchObservedRunningTime="2025-12-05 20:26:07.421693097 +0000 UTC m=+866.232909206" Dec 05 20:26:07 crc kubenswrapper[4904]: I1205 20:26:07.490830 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-8qf2w" podStartSLOduration=1.165174485 podStartE2EDuration="4.490809644s" podCreationTimestamp="2025-12-05 20:26:03 +0000 UTC" firstStartedPulling="2025-12-05 20:26:03.678928034 +0000 UTC m=+862.490144143" lastFinishedPulling="2025-12-05 20:26:07.004563183 +0000 UTC m=+865.815779302" observedRunningTime="2025-12-05 20:26:07.486291039 +0000 UTC m=+866.297507158" watchObservedRunningTime="2025-12-05 20:26:07.490809644 +0000 UTC m=+866.302025743" Dec 05 20:26:07 crc kubenswrapper[4904]: I1205 20:26:07.513281 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7dl4s" podStartSLOduration=1.99392274 podStartE2EDuration="4.513258609s" podCreationTimestamp="2025-12-05 20:26:03 +0000 UTC" firstStartedPulling="2025-12-05 20:26:04.484584787 +0000 UTC m=+863.295800896" lastFinishedPulling="2025-12-05 20:26:07.003920656 +0000 UTC m=+865.815136765" observedRunningTime="2025-12-05 20:26:07.511918592 +0000 UTC m=+866.323134741" watchObservedRunningTime="2025-12-05 20:26:07.513258609 +0000 UTC m=+866.324474718" Dec 05 20:26:10 crc kubenswrapper[4904]: I1205 20:26:10.429914 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-metrics-7f946cbc9-94lfj" event={"ID":"b05e6bce-1fed-411b-9c7d-ea32260cb8dc","Type":"ContainerStarted","Data":"106cd500c18092f6601ddf2fce672b9d0e3f5fec15dd6b84bbe4d14b2893127d"} Dec 05 20:26:10 crc kubenswrapper[4904]: I1205 20:26:10.457621 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-94lfj" podStartSLOduration=1.710122425 podStartE2EDuration="7.457589348s" podCreationTimestamp="2025-12-05 20:26:03 +0000 UTC" firstStartedPulling="2025-12-05 20:26:03.993679489 +0000 UTC m=+862.804895598" lastFinishedPulling="2025-12-05 20:26:09.741146412 +0000 UTC m=+868.552362521" observedRunningTime="2025-12-05 20:26:10.451466899 +0000 UTC m=+869.262683018" watchObservedRunningTime="2025-12-05 20:26:10.457589348 +0000 UTC m=+869.268805497" Dec 05 20:26:13 crc kubenswrapper[4904]: I1205 20:26:13.689295 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-8qf2w" Dec 05 20:26:14 crc kubenswrapper[4904]: I1205 20:26:14.025574 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-788668d987-f9cth" Dec 05 20:26:14 crc kubenswrapper[4904]: I1205 20:26:14.025961 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-788668d987-f9cth" Dec 05 20:26:14 crc kubenswrapper[4904]: I1205 20:26:14.033381 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-788668d987-f9cth" Dec 05 20:26:14 crc kubenswrapper[4904]: I1205 20:26:14.470878 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-788668d987-f9cth" Dec 05 20:26:14 crc kubenswrapper[4904]: I1205 20:26:14.554922 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-pdnb5"] Dec 05 20:26:24 crc kubenswrapper[4904]: I1205 20:26:24.283930 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7dl4s" Dec 05 20:26:38 crc kubenswrapper[4904]: I1205 20:26:38.298593 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83khk26"] Dec 05 20:26:38 crc kubenswrapper[4904]: I1205 20:26:38.301627 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83khk26" Dec 05 20:26:38 crc kubenswrapper[4904]: I1205 20:26:38.310740 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 05 20:26:38 crc kubenswrapper[4904]: I1205 20:26:38.321943 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83khk26"] Dec 05 20:26:38 crc kubenswrapper[4904]: I1205 20:26:38.494292 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/074d2553-b276-41f9-ae52-d37209d033e3-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83khk26\" (UID: \"074d2553-b276-41f9-ae52-d37209d033e3\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83khk26" Dec 05 20:26:38 crc kubenswrapper[4904]: I1205 20:26:38.494570 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfkzx\" (UniqueName: \"kubernetes.io/projected/074d2553-b276-41f9-ae52-d37209d033e3-kube-api-access-sfkzx\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83khk26\" (UID: \"074d2553-b276-41f9-ae52-d37209d033e3\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83khk26" Dec 05 20:26:38 crc kubenswrapper[4904]: I1205 20:26:38.494666 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/074d2553-b276-41f9-ae52-d37209d033e3-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83khk26\" (UID: \"074d2553-b276-41f9-ae52-d37209d033e3\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83khk26" Dec 05 20:26:38 crc kubenswrapper[4904]: I1205 20:26:38.596256 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfkzx\" (UniqueName: \"kubernetes.io/projected/074d2553-b276-41f9-ae52-d37209d033e3-kube-api-access-sfkzx\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83khk26\" (UID: \"074d2553-b276-41f9-ae52-d37209d033e3\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83khk26" Dec 05 20:26:38 crc kubenswrapper[4904]: I1205 20:26:38.596344 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/074d2553-b276-41f9-ae52-d37209d033e3-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83khk26\" (UID: \"074d2553-b276-41f9-ae52-d37209d033e3\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83khk26" Dec 05 20:26:38 crc kubenswrapper[4904]: I1205 20:26:38.596419 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/074d2553-b276-41f9-ae52-d37209d033e3-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83khk26\" (UID: \"074d2553-b276-41f9-ae52-d37209d033e3\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83khk26" Dec 05 20:26:38 crc kubenswrapper[4904]: I1205 20:26:38.597328 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/074d2553-b276-41f9-ae52-d37209d033e3-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83khk26\" (UID: \"074d2553-b276-41f9-ae52-d37209d033e3\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83khk26" Dec 05 20:26:38 crc kubenswrapper[4904]: I1205 20:26:38.598446 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/074d2553-b276-41f9-ae52-d37209d033e3-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83khk26\" (UID: \"074d2553-b276-41f9-ae52-d37209d033e3\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83khk26" Dec 05 20:26:38 crc kubenswrapper[4904]: I1205 20:26:38.627629 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfkzx\" (UniqueName: \"kubernetes.io/projected/074d2553-b276-41f9-ae52-d37209d033e3-kube-api-access-sfkzx\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83khk26\" (UID: \"074d2553-b276-41f9-ae52-d37209d033e3\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83khk26" Dec 05 20:26:38 crc kubenswrapper[4904]: I1205 20:26:38.631425 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83khk26" Dec 05 20:26:38 crc kubenswrapper[4904]: I1205 20:26:38.850984 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83khk26"] Dec 05 20:26:39 crc kubenswrapper[4904]: I1205 20:26:39.602898 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-pdnb5" podUID="b02e39c5-31b4-4444-a500-cd7cbe327bec" containerName="console" containerID="cri-o://1fdc6e51f7378f59e5d7357d2f409353ec29b50be427af4f2b711dde4c9e1887" gracePeriod=15 Dec 05 20:26:39 crc kubenswrapper[4904]: I1205 20:26:39.623892 4904 generic.go:334] "Generic (PLEG): container finished" podID="074d2553-b276-41f9-ae52-d37209d033e3" containerID="6e422e92a50ea9168e6d1cd11399fe9b95d2b402bc9e0b6e60e3096ddfb21ce2" exitCode=0 Dec 05 20:26:39 crc kubenswrapper[4904]: I1205 20:26:39.623942 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83khk26" event={"ID":"074d2553-b276-41f9-ae52-d37209d033e3","Type":"ContainerDied","Data":"6e422e92a50ea9168e6d1cd11399fe9b95d2b402bc9e0b6e60e3096ddfb21ce2"} Dec 05 20:26:39 crc kubenswrapper[4904]: I1205 20:26:39.623975 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83khk26" event={"ID":"074d2553-b276-41f9-ae52-d37209d033e3","Type":"ContainerStarted","Data":"2ac04e87f13d4bdc10ee17924598fc655d3ff2f1b563b8ad8b77840123c99e56"} Dec 05 20:26:40 crc kubenswrapper[4904]: I1205 20:26:40.000183 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-pdnb5_b02e39c5-31b4-4444-a500-cd7cbe327bec/console/0.log" Dec 05 20:26:40 crc kubenswrapper[4904]: I1205 20:26:40.000538 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-pdnb5" Dec 05 20:26:40 crc kubenswrapper[4904]: I1205 20:26:40.114093 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b02e39c5-31b4-4444-a500-cd7cbe327bec-console-oauth-config\") pod \"b02e39c5-31b4-4444-a500-cd7cbe327bec\" (UID: \"b02e39c5-31b4-4444-a500-cd7cbe327bec\") " Dec 05 20:26:40 crc kubenswrapper[4904]: I1205 20:26:40.114153 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b02e39c5-31b4-4444-a500-cd7cbe327bec-service-ca\") pod \"b02e39c5-31b4-4444-a500-cd7cbe327bec\" (UID: \"b02e39c5-31b4-4444-a500-cd7cbe327bec\") " Dec 05 20:26:40 crc kubenswrapper[4904]: I1205 20:26:40.114171 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b02e39c5-31b4-4444-a500-cd7cbe327bec-console-config\") pod \"b02e39c5-31b4-4444-a500-cd7cbe327bec\" (UID: \"b02e39c5-31b4-4444-a500-cd7cbe327bec\") " Dec 05 20:26:40 crc kubenswrapper[4904]: I1205 20:26:40.114198 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b02e39c5-31b4-4444-a500-cd7cbe327bec-oauth-serving-cert\") pod \"b02e39c5-31b4-4444-a500-cd7cbe327bec\" (UID: \"b02e39c5-31b4-4444-a500-cd7cbe327bec\") " Dec 05 20:26:40 crc kubenswrapper[4904]: I1205 20:26:40.114221 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b02e39c5-31b4-4444-a500-cd7cbe327bec-console-serving-cert\") pod \"b02e39c5-31b4-4444-a500-cd7cbe327bec\" (UID: \"b02e39c5-31b4-4444-a500-cd7cbe327bec\") " Dec 05 20:26:40 crc kubenswrapper[4904]: I1205 20:26:40.114244 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b02e39c5-31b4-4444-a500-cd7cbe327bec-trusted-ca-bundle\") pod \"b02e39c5-31b4-4444-a500-cd7cbe327bec\" (UID: \"b02e39c5-31b4-4444-a500-cd7cbe327bec\") " Dec 05 20:26:40 crc kubenswrapper[4904]: I1205 20:26:40.114294 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckplz\" (UniqueName: \"kubernetes.io/projected/b02e39c5-31b4-4444-a500-cd7cbe327bec-kube-api-access-ckplz\") pod \"b02e39c5-31b4-4444-a500-cd7cbe327bec\" (UID: \"b02e39c5-31b4-4444-a500-cd7cbe327bec\") " Dec 05 20:26:40 crc kubenswrapper[4904]: I1205 20:26:40.115110 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b02e39c5-31b4-4444-a500-cd7cbe327bec-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b02e39c5-31b4-4444-a500-cd7cbe327bec" (UID: "b02e39c5-31b4-4444-a500-cd7cbe327bec"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:26:40 crc kubenswrapper[4904]: I1205 20:26:40.115165 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b02e39c5-31b4-4444-a500-cd7cbe327bec-console-config" (OuterVolumeSpecName: "console-config") pod "b02e39c5-31b4-4444-a500-cd7cbe327bec" (UID: "b02e39c5-31b4-4444-a500-cd7cbe327bec"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:26:40 crc kubenswrapper[4904]: I1205 20:26:40.115226 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b02e39c5-31b4-4444-a500-cd7cbe327bec-service-ca" (OuterVolumeSpecName: "service-ca") pod "b02e39c5-31b4-4444-a500-cd7cbe327bec" (UID: "b02e39c5-31b4-4444-a500-cd7cbe327bec"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:26:40 crc kubenswrapper[4904]: I1205 20:26:40.115251 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b02e39c5-31b4-4444-a500-cd7cbe327bec-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "b02e39c5-31b4-4444-a500-cd7cbe327bec" (UID: "b02e39c5-31b4-4444-a500-cd7cbe327bec"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:26:40 crc kubenswrapper[4904]: I1205 20:26:40.120905 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b02e39c5-31b4-4444-a500-cd7cbe327bec-kube-api-access-ckplz" (OuterVolumeSpecName: "kube-api-access-ckplz") pod "b02e39c5-31b4-4444-a500-cd7cbe327bec" (UID: "b02e39c5-31b4-4444-a500-cd7cbe327bec"). InnerVolumeSpecName "kube-api-access-ckplz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:26:40 crc kubenswrapper[4904]: I1205 20:26:40.122078 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b02e39c5-31b4-4444-a500-cd7cbe327bec-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b02e39c5-31b4-4444-a500-cd7cbe327bec" (UID: "b02e39c5-31b4-4444-a500-cd7cbe327bec"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:26:40 crc kubenswrapper[4904]: I1205 20:26:40.126894 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b02e39c5-31b4-4444-a500-cd7cbe327bec-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b02e39c5-31b4-4444-a500-cd7cbe327bec" (UID: "b02e39c5-31b4-4444-a500-cd7cbe327bec"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:26:40 crc kubenswrapper[4904]: I1205 20:26:40.216468 4904 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b02e39c5-31b4-4444-a500-cd7cbe327bec-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:40 crc kubenswrapper[4904]: I1205 20:26:40.216540 4904 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b02e39c5-31b4-4444-a500-cd7cbe327bec-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:40 crc kubenswrapper[4904]: I1205 20:26:40.216556 4904 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b02e39c5-31b4-4444-a500-cd7cbe327bec-console-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:40 crc kubenswrapper[4904]: I1205 20:26:40.216570 4904 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b02e39c5-31b4-4444-a500-cd7cbe327bec-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:40 crc kubenswrapper[4904]: I1205 20:26:40.216584 4904 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b02e39c5-31b4-4444-a500-cd7cbe327bec-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:40 crc kubenswrapper[4904]: I1205 20:26:40.216597 4904 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b02e39c5-31b4-4444-a500-cd7cbe327bec-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:40 crc kubenswrapper[4904]: I1205 20:26:40.216610 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckplz\" (UniqueName: \"kubernetes.io/projected/b02e39c5-31b4-4444-a500-cd7cbe327bec-kube-api-access-ckplz\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:40 crc kubenswrapper[4904]: I1205 20:26:40.641293 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-pdnb5_b02e39c5-31b4-4444-a500-cd7cbe327bec/console/0.log" Dec 05 20:26:40 crc kubenswrapper[4904]: I1205 20:26:40.641334 4904 generic.go:334] "Generic (PLEG): container finished" podID="b02e39c5-31b4-4444-a500-cd7cbe327bec" containerID="1fdc6e51f7378f59e5d7357d2f409353ec29b50be427af4f2b711dde4c9e1887" exitCode=2 Dec 05 20:26:40 crc kubenswrapper[4904]: I1205 20:26:40.641362 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-pdnb5" event={"ID":"b02e39c5-31b4-4444-a500-cd7cbe327bec","Type":"ContainerDied","Data":"1fdc6e51f7378f59e5d7357d2f409353ec29b50be427af4f2b711dde4c9e1887"} Dec 05 20:26:40 crc kubenswrapper[4904]: I1205 20:26:40.641390 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-pdnb5" event={"ID":"b02e39c5-31b4-4444-a500-cd7cbe327bec","Type":"ContainerDied","Data":"5b9594bdee326e7568932fd1ba992bbc93a7156a19a69da11b663ef7ac69d3fc"} Dec 05 20:26:40 crc kubenswrapper[4904]: I1205 20:26:40.641410 4904 scope.go:117] "RemoveContainer" containerID="1fdc6e51f7378f59e5d7357d2f409353ec29b50be427af4f2b711dde4c9e1887" Dec 05 20:26:40 crc kubenswrapper[4904]: I1205 20:26:40.641511 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-pdnb5" Dec 05 20:26:40 crc kubenswrapper[4904]: I1205 20:26:40.669824 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-pdnb5"] Dec 05 20:26:40 crc kubenswrapper[4904]: I1205 20:26:40.675112 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-pdnb5"] Dec 05 20:26:40 crc kubenswrapper[4904]: I1205 20:26:40.676352 4904 scope.go:117] "RemoveContainer" containerID="1fdc6e51f7378f59e5d7357d2f409353ec29b50be427af4f2b711dde4c9e1887" Dec 05 20:26:40 crc kubenswrapper[4904]: E1205 20:26:40.676904 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fdc6e51f7378f59e5d7357d2f409353ec29b50be427af4f2b711dde4c9e1887\": container with ID starting with 1fdc6e51f7378f59e5d7357d2f409353ec29b50be427af4f2b711dde4c9e1887 not found: ID does not exist" containerID="1fdc6e51f7378f59e5d7357d2f409353ec29b50be427af4f2b711dde4c9e1887" Dec 05 20:26:40 crc kubenswrapper[4904]: I1205 20:26:40.676958 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fdc6e51f7378f59e5d7357d2f409353ec29b50be427af4f2b711dde4c9e1887"} err="failed to get container status \"1fdc6e51f7378f59e5d7357d2f409353ec29b50be427af4f2b711dde4c9e1887\": rpc error: code = NotFound desc = could not find container \"1fdc6e51f7378f59e5d7357d2f409353ec29b50be427af4f2b711dde4c9e1887\": container with ID starting with 1fdc6e51f7378f59e5d7357d2f409353ec29b50be427af4f2b711dde4c9e1887 not found: ID does not exist" Dec 05 20:26:41 crc kubenswrapper[4904]: I1205 20:26:41.708280 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b02e39c5-31b4-4444-a500-cd7cbe327bec" path="/var/lib/kubelet/pods/b02e39c5-31b4-4444-a500-cd7cbe327bec/volumes" Dec 05 20:26:43 crc kubenswrapper[4904]: I1205 20:26:43.666726 4904 generic.go:334] "Generic (PLEG): container finished" podID="074d2553-b276-41f9-ae52-d37209d033e3" containerID="191be8080bafba028b5fca4cf7a27d518494776e863c3215da99f4fb1c8663d6" exitCode=0 Dec 05 20:26:43 crc kubenswrapper[4904]: I1205 20:26:43.666937 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83khk26" event={"ID":"074d2553-b276-41f9-ae52-d37209d033e3","Type":"ContainerDied","Data":"191be8080bafba028b5fca4cf7a27d518494776e863c3215da99f4fb1c8663d6"} Dec 05 20:26:44 crc kubenswrapper[4904]: I1205 20:26:44.051942 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8nskr"] Dec 05 20:26:44 crc kubenswrapper[4904]: E1205 20:26:44.052187 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b02e39c5-31b4-4444-a500-cd7cbe327bec" containerName="console" Dec 05 20:26:44 crc kubenswrapper[4904]: I1205 20:26:44.052204 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="b02e39c5-31b4-4444-a500-cd7cbe327bec" containerName="console" Dec 05 20:26:44 crc kubenswrapper[4904]: I1205 20:26:44.052309 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="b02e39c5-31b4-4444-a500-cd7cbe327bec" containerName="console" Dec 05 20:26:44 crc kubenswrapper[4904]: I1205 20:26:44.053090 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8nskr" Dec 05 20:26:44 crc kubenswrapper[4904]: I1205 20:26:44.070015 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8nskr"] Dec 05 20:26:44 crc kubenswrapper[4904]: I1205 20:26:44.166433 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d422a85a-3a49-455b-bb05-4c931c24a960-catalog-content\") pod \"community-operators-8nskr\" (UID: \"d422a85a-3a49-455b-bb05-4c931c24a960\") " pod="openshift-marketplace/community-operators-8nskr" Dec 05 20:26:44 crc kubenswrapper[4904]: I1205 20:26:44.166762 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d422a85a-3a49-455b-bb05-4c931c24a960-utilities\") pod \"community-operators-8nskr\" (UID: \"d422a85a-3a49-455b-bb05-4c931c24a960\") " pod="openshift-marketplace/community-operators-8nskr" Dec 05 20:26:44 crc kubenswrapper[4904]: I1205 20:26:44.166782 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mqrz\" (UniqueName: \"kubernetes.io/projected/d422a85a-3a49-455b-bb05-4c931c24a960-kube-api-access-9mqrz\") pod \"community-operators-8nskr\" (UID: \"d422a85a-3a49-455b-bb05-4c931c24a960\") " pod="openshift-marketplace/community-operators-8nskr" Dec 05 20:26:44 crc kubenswrapper[4904]: I1205 20:26:44.268162 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d422a85a-3a49-455b-bb05-4c931c24a960-utilities\") pod \"community-operators-8nskr\" (UID: \"d422a85a-3a49-455b-bb05-4c931c24a960\") " pod="openshift-marketplace/community-operators-8nskr" Dec 05 20:26:44 crc kubenswrapper[4904]: I1205 20:26:44.268229 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mqrz\" (UniqueName: \"kubernetes.io/projected/d422a85a-3a49-455b-bb05-4c931c24a960-kube-api-access-9mqrz\") pod \"community-operators-8nskr\" (UID: \"d422a85a-3a49-455b-bb05-4c931c24a960\") " pod="openshift-marketplace/community-operators-8nskr" Dec 05 20:26:44 crc kubenswrapper[4904]: I1205 20:26:44.268323 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d422a85a-3a49-455b-bb05-4c931c24a960-catalog-content\") pod \"community-operators-8nskr\" (UID: \"d422a85a-3a49-455b-bb05-4c931c24a960\") " pod="openshift-marketplace/community-operators-8nskr" Dec 05 20:26:44 crc kubenswrapper[4904]: I1205 20:26:44.268833 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d422a85a-3a49-455b-bb05-4c931c24a960-utilities\") pod \"community-operators-8nskr\" (UID: \"d422a85a-3a49-455b-bb05-4c931c24a960\") " pod="openshift-marketplace/community-operators-8nskr" Dec 05 20:26:44 crc kubenswrapper[4904]: I1205 20:26:44.268881 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d422a85a-3a49-455b-bb05-4c931c24a960-catalog-content\") pod \"community-operators-8nskr\" (UID: \"d422a85a-3a49-455b-bb05-4c931c24a960\") " pod="openshift-marketplace/community-operators-8nskr" Dec 05 20:26:44 crc kubenswrapper[4904]: I1205 20:26:44.287165 4904 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-9mqrz\" (UniqueName: \"kubernetes.io/projected/d422a85a-3a49-455b-bb05-4c931c24a960-kube-api-access-9mqrz\") pod \"community-operators-8nskr\" (UID: \"d422a85a-3a49-455b-bb05-4c931c24a960\") " pod="openshift-marketplace/community-operators-8nskr" Dec 05 20:26:44 crc kubenswrapper[4904]: I1205 20:26:44.378888 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8nskr" Dec 05 20:26:44 crc kubenswrapper[4904]: I1205 20:26:44.681968 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83khk26" event={"ID":"074d2553-b276-41f9-ae52-d37209d033e3","Type":"ContainerStarted","Data":"0581a9f995053294b630986e81eaf551ac4c89a594325a20b6271fe4a7263a7a"} Dec 05 20:26:44 crc kubenswrapper[4904]: I1205 20:26:44.880746 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8nskr"] Dec 05 20:26:45 crc kubenswrapper[4904]: I1205 20:26:45.691169 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8nskr" event={"ID":"d422a85a-3a49-455b-bb05-4c931c24a960","Type":"ContainerStarted","Data":"a6c0957e05aed907286a856895597f9ae3c7f284fa1bb02c60326551c7f675b0"} Dec 05 20:26:45 crc kubenswrapper[4904]: I1205 20:26:45.712420 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83khk26" podStartSLOduration=4.380625873 podStartE2EDuration="7.71239747s" podCreationTimestamp="2025-12-05 20:26:38 +0000 UTC" firstStartedPulling="2025-12-05 20:26:39.625664181 +0000 UTC m=+898.436880290" lastFinishedPulling="2025-12-05 20:26:42.957435778 +0000 UTC m=+901.768651887" observedRunningTime="2025-12-05 20:26:45.708296378 +0000 UTC m=+904.519512567" watchObservedRunningTime="2025-12-05 20:26:45.71239747 +0000 UTC m=+904.523613579" Dec 05 20:26:46 crc kubenswrapper[4904]: I1205 20:26:46.697112 4904 generic.go:334] "Generic (PLEG): container finished" podID="074d2553-b276-41f9-ae52-d37209d033e3" containerID="0581a9f995053294b630986e81eaf551ac4c89a594325a20b6271fe4a7263a7a" exitCode=0 Dec 05 20:26:46 crc kubenswrapper[4904]: I1205 20:26:46.697173 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83khk26" event={"ID":"074d2553-b276-41f9-ae52-d37209d033e3","Type":"ContainerDied","Data":"0581a9f995053294b630986e81eaf551ac4c89a594325a20b6271fe4a7263a7a"} Dec 05 20:26:46 crc kubenswrapper[4904]: I1205 20:26:46.699478 4904 generic.go:334] "Generic (PLEG): container finished" podID="d422a85a-3a49-455b-bb05-4c931c24a960" containerID="ec5fb01e4aa6b028ffd394509b14d406c57c2633710098d3bebd2362ed081c29" exitCode=0 Dec 05 20:26:46 crc kubenswrapper[4904]: I1205 20:26:46.699538 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8nskr" event={"ID":"d422a85a-3a49-455b-bb05-4c931c24a960","Type":"ContainerDied","Data":"ec5fb01e4aa6b028ffd394509b14d406c57c2633710098d3bebd2362ed081c29"} Dec 05 20:26:47 crc kubenswrapper[4904]: I1205 20:26:47.708264 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8nskr" event={"ID":"d422a85a-3a49-455b-bb05-4c931c24a960","Type":"ContainerStarted","Data":"01787c1a9de9fcf8bf198e9ca548bb10f1086b6ab388e3f3952b211a82f56c36"} Dec 05 
20:26:48 crc kubenswrapper[4904]: I1205 20:26:48.479541 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83khk26" Dec 05 20:26:48 crc kubenswrapper[4904]: I1205 20:26:48.637764 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfkzx\" (UniqueName: \"kubernetes.io/projected/074d2553-b276-41f9-ae52-d37209d033e3-kube-api-access-sfkzx\") pod \"074d2553-b276-41f9-ae52-d37209d033e3\" (UID: \"074d2553-b276-41f9-ae52-d37209d033e3\") " Dec 05 20:26:48 crc kubenswrapper[4904]: I1205 20:26:48.637898 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/074d2553-b276-41f9-ae52-d37209d033e3-util\") pod \"074d2553-b276-41f9-ae52-d37209d033e3\" (UID: \"074d2553-b276-41f9-ae52-d37209d033e3\") " Dec 05 20:26:48 crc kubenswrapper[4904]: I1205 20:26:48.637927 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/074d2553-b276-41f9-ae52-d37209d033e3-bundle\") pod \"074d2553-b276-41f9-ae52-d37209d033e3\" (UID: \"074d2553-b276-41f9-ae52-d37209d033e3\") " Dec 05 20:26:48 crc kubenswrapper[4904]: I1205 20:26:48.639130 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/074d2553-b276-41f9-ae52-d37209d033e3-bundle" (OuterVolumeSpecName: "bundle") pod "074d2553-b276-41f9-ae52-d37209d033e3" (UID: "074d2553-b276-41f9-ae52-d37209d033e3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:26:48 crc kubenswrapper[4904]: I1205 20:26:48.643288 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/074d2553-b276-41f9-ae52-d37209d033e3-kube-api-access-sfkzx" (OuterVolumeSpecName: "kube-api-access-sfkzx") pod "074d2553-b276-41f9-ae52-d37209d033e3" (UID: "074d2553-b276-41f9-ae52-d37209d033e3"). InnerVolumeSpecName "kube-api-access-sfkzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:26:48 crc kubenswrapper[4904]: I1205 20:26:48.648418 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/074d2553-b276-41f9-ae52-d37209d033e3-util" (OuterVolumeSpecName: "util") pod "074d2553-b276-41f9-ae52-d37209d033e3" (UID: "074d2553-b276-41f9-ae52-d37209d033e3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:26:48 crc kubenswrapper[4904]: I1205 20:26:48.720986 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83khk26" event={"ID":"074d2553-b276-41f9-ae52-d37209d033e3","Type":"ContainerDied","Data":"2ac04e87f13d4bdc10ee17924598fc655d3ff2f1b563b8ad8b77840123c99e56"} Dec 05 20:26:48 crc kubenswrapper[4904]: I1205 20:26:48.721009 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83khk26" Dec 05 20:26:48 crc kubenswrapper[4904]: I1205 20:26:48.721032 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ac04e87f13d4bdc10ee17924598fc655d3ff2f1b563b8ad8b77840123c99e56" Dec 05 20:26:48 crc kubenswrapper[4904]: I1205 20:26:48.722713 4904 generic.go:334] "Generic (PLEG): container finished" podID="d422a85a-3a49-455b-bb05-4c931c24a960" containerID="01787c1a9de9fcf8bf198e9ca548bb10f1086b6ab388e3f3952b211a82f56c36" exitCode=0 Dec 05 20:26:48 crc kubenswrapper[4904]: I1205 20:26:48.722754 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8nskr" event={"ID":"d422a85a-3a49-455b-bb05-4c931c24a960","Type":"ContainerDied","Data":"01787c1a9de9fcf8bf198e9ca548bb10f1086b6ab388e3f3952b211a82f56c36"} Dec 05 20:26:48 crc kubenswrapper[4904]: I1205 20:26:48.739277 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfkzx\" (UniqueName: \"kubernetes.io/projected/074d2553-b276-41f9-ae52-d37209d033e3-kube-api-access-sfkzx\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:48 crc kubenswrapper[4904]: I1205 20:26:48.739313 4904 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/074d2553-b276-41f9-ae52-d37209d033e3-util\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:48 crc kubenswrapper[4904]: I1205 20:26:48.739325 4904 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/074d2553-b276-41f9-ae52-d37209d033e3-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:49 crc kubenswrapper[4904]: I1205 20:26:49.730715 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8nskr" event={"ID":"d422a85a-3a49-455b-bb05-4c931c24a960","Type":"ContainerStarted","Data":"eef7b08ebb2f52030424f0c18d841ee96e0b41739bbb8512183892cb9c94efbc"} Dec 05 20:26:49 crc kubenswrapper[4904]: I1205 20:26:49.777514 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8nskr" podStartSLOduration=3.238759867 podStartE2EDuration="5.777495138s" podCreationTimestamp="2025-12-05 20:26:44 +0000 UTC" firstStartedPulling="2025-12-05 20:26:46.701039614 +0000 UTC m=+905.512255743" lastFinishedPulling="2025-12-05 20:26:49.239774865 +0000 UTC m=+908.050991014" observedRunningTime="2025-12-05 20:26:49.768959793 +0000 UTC m=+908.580175902" watchObservedRunningTime="2025-12-05 20:26:49.777495138 +0000 UTC m=+908.588711247" Dec 05 20:26:54 crc kubenswrapper[4904]: I1205 20:26:54.253652 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mjm4x"] Dec 05 20:26:54 crc kubenswrapper[4904]: E1205 20:26:54.254231 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="074d2553-b276-41f9-ae52-d37209d033e3" containerName="pull" Dec 05 20:26:54 crc kubenswrapper[4904]: I1205 20:26:54.254248 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="074d2553-b276-41f9-ae52-d37209d033e3" containerName="pull" Dec 05 20:26:54 crc kubenswrapper[4904]: E1205 20:26:54.254273 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="074d2553-b276-41f9-ae52-d37209d033e3" containerName="util" Dec 05 20:26:54 crc kubenswrapper[4904]: I1205 20:26:54.254280 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="074d2553-b276-41f9-ae52-d37209d033e3" 
containerName="util" Dec 05 20:26:54 crc kubenswrapper[4904]: E1205 20:26:54.254294 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="074d2553-b276-41f9-ae52-d37209d033e3" containerName="extract" Dec 05 20:26:54 crc kubenswrapper[4904]: I1205 20:26:54.254301 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="074d2553-b276-41f9-ae52-d37209d033e3" containerName="extract" Dec 05 20:26:54 crc kubenswrapper[4904]: I1205 20:26:54.254422 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="074d2553-b276-41f9-ae52-d37209d033e3" containerName="extract" Dec 05 20:26:54 crc kubenswrapper[4904]: I1205 20:26:54.255369 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mjm4x" Dec 05 20:26:54 crc kubenswrapper[4904]: I1205 20:26:54.265330 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mjm4x"] Dec 05 20:26:54 crc kubenswrapper[4904]: I1205 20:26:54.379210 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8nskr" Dec 05 20:26:54 crc kubenswrapper[4904]: I1205 20:26:54.379278 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8nskr" Dec 05 20:26:54 crc kubenswrapper[4904]: I1205 20:26:54.409230 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0060183-15f1-4cb6-ab22-b171c9fc86ee-catalog-content\") pod \"redhat-marketplace-mjm4x\" (UID: \"e0060183-15f1-4cb6-ab22-b171c9fc86ee\") " pod="openshift-marketplace/redhat-marketplace-mjm4x" Dec 05 20:26:54 crc kubenswrapper[4904]: I1205 20:26:54.409383 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bpzc\" (UniqueName: \"kubernetes.io/projected/e0060183-15f1-4cb6-ab22-b171c9fc86ee-kube-api-access-8bpzc\") pod \"redhat-marketplace-mjm4x\" (UID: \"e0060183-15f1-4cb6-ab22-b171c9fc86ee\") " pod="openshift-marketplace/redhat-marketplace-mjm4x" Dec 05 20:26:54 crc kubenswrapper[4904]: I1205 20:26:54.409428 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0060183-15f1-4cb6-ab22-b171c9fc86ee-utilities\") pod \"redhat-marketplace-mjm4x\" (UID: \"e0060183-15f1-4cb6-ab22-b171c9fc86ee\") " pod="openshift-marketplace/redhat-marketplace-mjm4x" Dec 05 20:26:54 crc kubenswrapper[4904]: I1205 20:26:54.481080 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8nskr" Dec 05 20:26:54 crc kubenswrapper[4904]: I1205 20:26:54.511213 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0060183-15f1-4cb6-ab22-b171c9fc86ee-catalog-content\") pod \"redhat-marketplace-mjm4x\" (UID: \"e0060183-15f1-4cb6-ab22-b171c9fc86ee\") " pod="openshift-marketplace/redhat-marketplace-mjm4x" Dec 05 20:26:54 crc kubenswrapper[4904]: I1205 20:26:54.511406 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bpzc\" (UniqueName: \"kubernetes.io/projected/e0060183-15f1-4cb6-ab22-b171c9fc86ee-kube-api-access-8bpzc\") pod \"redhat-marketplace-mjm4x\" (UID: \"e0060183-15f1-4cb6-ab22-b171c9fc86ee\") " 
pod="openshift-marketplace/redhat-marketplace-mjm4x" Dec 05 20:26:54 crc kubenswrapper[4904]: I1205 20:26:54.511464 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0060183-15f1-4cb6-ab22-b171c9fc86ee-utilities\") pod \"redhat-marketplace-mjm4x\" (UID: \"e0060183-15f1-4cb6-ab22-b171c9fc86ee\") " pod="openshift-marketplace/redhat-marketplace-mjm4x" Dec 05 20:26:54 crc kubenswrapper[4904]: I1205 20:26:54.511808 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0060183-15f1-4cb6-ab22-b171c9fc86ee-catalog-content\") pod \"redhat-marketplace-mjm4x\" (UID: \"e0060183-15f1-4cb6-ab22-b171c9fc86ee\") " pod="openshift-marketplace/redhat-marketplace-mjm4x" Dec 05 20:26:54 crc kubenswrapper[4904]: I1205 20:26:54.512012 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0060183-15f1-4cb6-ab22-b171c9fc86ee-utilities\") pod \"redhat-marketplace-mjm4x\" (UID: \"e0060183-15f1-4cb6-ab22-b171c9fc86ee\") " pod="openshift-marketplace/redhat-marketplace-mjm4x" Dec 05 20:26:54 crc kubenswrapper[4904]: I1205 20:26:54.547637 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bpzc\" (UniqueName: \"kubernetes.io/projected/e0060183-15f1-4cb6-ab22-b171c9fc86ee-kube-api-access-8bpzc\") pod \"redhat-marketplace-mjm4x\" (UID: \"e0060183-15f1-4cb6-ab22-b171c9fc86ee\") " pod="openshift-marketplace/redhat-marketplace-mjm4x" Dec 05 20:26:54 crc kubenswrapper[4904]: I1205 20:26:54.572924 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mjm4x" Dec 05 20:26:54 crc kubenswrapper[4904]: I1205 20:26:54.874685 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8nskr" Dec 05 20:26:54 crc kubenswrapper[4904]: I1205 20:26:54.916927 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mjm4x"] Dec 05 20:26:54 crc kubenswrapper[4904]: I1205 20:26:54.989130 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-58cc54b6b6-b77qk"] Dec 05 20:26:54 crc kubenswrapper[4904]: I1205 20:26:54.990041 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-58cc54b6b6-b77qk" Dec 05 20:26:54 crc kubenswrapper[4904]: I1205 20:26:54.993029 4904 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 05 20:26:54 crc kubenswrapper[4904]: I1205 20:26:54.993284 4904 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 05 20:26:54 crc kubenswrapper[4904]: I1205 20:26:54.993528 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 05 20:26:54 crc kubenswrapper[4904]: I1205 20:26:54.993684 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 05 20:26:54 crc kubenswrapper[4904]: I1205 20:26:54.994091 4904 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-24krz" Dec 05 20:26:55 crc kubenswrapper[4904]: I1205 20:26:55.011396 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-58cc54b6b6-b77qk"] Dec 05 20:26:55 crc kubenswrapper[4904]: I1205 20:26:55.018374 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjvn9\" (UniqueName: \"kubernetes.io/projected/54d3f8f4-a4d9-4f3c-b923-f64d6c4f1b28-kube-api-access-tjvn9\") pod \"metallb-operator-controller-manager-58cc54b6b6-b77qk\" (UID: \"54d3f8f4-a4d9-4f3c-b923-f64d6c4f1b28\") " pod="metallb-system/metallb-operator-controller-manager-58cc54b6b6-b77qk" Dec 05 20:26:55 crc kubenswrapper[4904]: I1205 20:26:55.018423 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/54d3f8f4-a4d9-4f3c-b923-f64d6c4f1b28-apiservice-cert\") pod \"metallb-operator-controller-manager-58cc54b6b6-b77qk\" (UID: \"54d3f8f4-a4d9-4f3c-b923-f64d6c4f1b28\") " pod="metallb-system/metallb-operator-controller-manager-58cc54b6b6-b77qk" Dec 05 20:26:55 crc kubenswrapper[4904]: I1205 20:26:55.018482 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/54d3f8f4-a4d9-4f3c-b923-f64d6c4f1b28-webhook-cert\") pod \"metallb-operator-controller-manager-58cc54b6b6-b77qk\" (UID: \"54d3f8f4-a4d9-4f3c-b923-f64d6c4f1b28\") " pod="metallb-system/metallb-operator-controller-manager-58cc54b6b6-b77qk" Dec 05 20:26:55 crc kubenswrapper[4904]: I1205 20:26:55.119078 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjvn9\" (UniqueName: \"kubernetes.io/projected/54d3f8f4-a4d9-4f3c-b923-f64d6c4f1b28-kube-api-access-tjvn9\") pod \"metallb-operator-controller-manager-58cc54b6b6-b77qk\" (UID: \"54d3f8f4-a4d9-4f3c-b923-f64d6c4f1b28\") " pod="metallb-system/metallb-operator-controller-manager-58cc54b6b6-b77qk" Dec 05 20:26:55 crc kubenswrapper[4904]: I1205 20:26:55.119125 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/54d3f8f4-a4d9-4f3c-b923-f64d6c4f1b28-apiservice-cert\") pod \"metallb-operator-controller-manager-58cc54b6b6-b77qk\" (UID: \"54d3f8f4-a4d9-4f3c-b923-f64d6c4f1b28\") " pod="metallb-system/metallb-operator-controller-manager-58cc54b6b6-b77qk" Dec 05 20:26:55 crc kubenswrapper[4904]: I1205 20:26:55.119187 
4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/54d3f8f4-a4d9-4f3c-b923-f64d6c4f1b28-webhook-cert\") pod \"metallb-operator-controller-manager-58cc54b6b6-b77qk\" (UID: \"54d3f8f4-a4d9-4f3c-b923-f64d6c4f1b28\") " pod="metallb-system/metallb-operator-controller-manager-58cc54b6b6-b77qk" Dec 05 20:26:55 crc kubenswrapper[4904]: I1205 20:26:55.124631 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/54d3f8f4-a4d9-4f3c-b923-f64d6c4f1b28-webhook-cert\") pod \"metallb-operator-controller-manager-58cc54b6b6-b77qk\" (UID: \"54d3f8f4-a4d9-4f3c-b923-f64d6c4f1b28\") " pod="metallb-system/metallb-operator-controller-manager-58cc54b6b6-b77qk" Dec 05 20:26:55 crc kubenswrapper[4904]: I1205 20:26:55.124762 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/54d3f8f4-a4d9-4f3c-b923-f64d6c4f1b28-apiservice-cert\") pod \"metallb-operator-controller-manager-58cc54b6b6-b77qk\" (UID: \"54d3f8f4-a4d9-4f3c-b923-f64d6c4f1b28\") " pod="metallb-system/metallb-operator-controller-manager-58cc54b6b6-b77qk" Dec 05 20:26:55 crc kubenswrapper[4904]: I1205 20:26:55.143926 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjvn9\" (UniqueName: \"kubernetes.io/projected/54d3f8f4-a4d9-4f3c-b923-f64d6c4f1b28-kube-api-access-tjvn9\") pod \"metallb-operator-controller-manager-58cc54b6b6-b77qk\" (UID: \"54d3f8f4-a4d9-4f3c-b923-f64d6c4f1b28\") " pod="metallb-system/metallb-operator-controller-manager-58cc54b6b6-b77qk" Dec 05 20:26:55 crc kubenswrapper[4904]: I1205 20:26:55.315094 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-58cc54b6b6-b77qk" Dec 05 20:26:55 crc kubenswrapper[4904]: I1205 20:26:55.638012 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-5595c7bb55-zxmp8"] Dec 05 20:26:55 crc kubenswrapper[4904]: I1205 20:26:55.638902 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5595c7bb55-zxmp8" Dec 05 20:26:55 crc kubenswrapper[4904]: I1205 20:26:55.641916 4904 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 05 20:26:55 crc kubenswrapper[4904]: I1205 20:26:55.642870 4904 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-xx6pg" Dec 05 20:26:55 crc kubenswrapper[4904]: I1205 20:26:55.645111 4904 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 05 20:26:55 crc kubenswrapper[4904]: I1205 20:26:55.667299 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5595c7bb55-zxmp8"] Dec 05 20:26:55 crc kubenswrapper[4904]: I1205 20:26:55.724289 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2230230c-8a27-44a4-a63a-219e0e40f288-webhook-cert\") pod \"metallb-operator-webhook-server-5595c7bb55-zxmp8\" (UID: \"2230230c-8a27-44a4-a63a-219e0e40f288\") " pod="metallb-system/metallb-operator-webhook-server-5595c7bb55-zxmp8" Dec 05 20:26:55 crc kubenswrapper[4904]: I1205 20:26:55.724361 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjv2k\" (UniqueName: \"kubernetes.io/projected/2230230c-8a27-44a4-a63a-219e0e40f288-kube-api-access-jjv2k\") pod \"metallb-operator-webhook-server-5595c7bb55-zxmp8\" (UID: \"2230230c-8a27-44a4-a63a-219e0e40f288\") " pod="metallb-system/metallb-operator-webhook-server-5595c7bb55-zxmp8" Dec 05 20:26:55 crc kubenswrapper[4904]: I1205 20:26:55.724387 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2230230c-8a27-44a4-a63a-219e0e40f288-apiservice-cert\") pod \"metallb-operator-webhook-server-5595c7bb55-zxmp8\" (UID: \"2230230c-8a27-44a4-a63a-219e0e40f288\") " pod="metallb-system/metallb-operator-webhook-server-5595c7bb55-zxmp8" Dec 05 20:26:55 crc kubenswrapper[4904]: I1205 20:26:55.777044 4904 generic.go:334] "Generic (PLEG): container finished" podID="e0060183-15f1-4cb6-ab22-b171c9fc86ee" containerID="b12e36d6eb8472808ef8de2d5fe5924faa3594995d334de9bed5f7e8de0dc201" exitCode=0 Dec 05 20:26:55 crc kubenswrapper[4904]: I1205 20:26:55.777096 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mjm4x" event={"ID":"e0060183-15f1-4cb6-ab22-b171c9fc86ee","Type":"ContainerDied","Data":"b12e36d6eb8472808ef8de2d5fe5924faa3594995d334de9bed5f7e8de0dc201"} Dec 05 20:26:55 crc kubenswrapper[4904]: I1205 20:26:55.777141 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mjm4x" event={"ID":"e0060183-15f1-4cb6-ab22-b171c9fc86ee","Type":"ContainerStarted","Data":"a6b3dd2d1a53a3b5fd2335deb0afb19900460651f2ea6772f4742d6247471903"} Dec 05 20:26:55 crc kubenswrapper[4904]: I1205 20:26:55.825707 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2230230c-8a27-44a4-a63a-219e0e40f288-webhook-cert\") pod \"metallb-operator-webhook-server-5595c7bb55-zxmp8\" (UID: \"2230230c-8a27-44a4-a63a-219e0e40f288\") " pod="metallb-system/metallb-operator-webhook-server-5595c7bb55-zxmp8" Dec 05 20:26:55 crc 
kubenswrapper[4904]: I1205 20:26:55.825768 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjv2k\" (UniqueName: \"kubernetes.io/projected/2230230c-8a27-44a4-a63a-219e0e40f288-kube-api-access-jjv2k\") pod \"metallb-operator-webhook-server-5595c7bb55-zxmp8\" (UID: \"2230230c-8a27-44a4-a63a-219e0e40f288\") " pod="metallb-system/metallb-operator-webhook-server-5595c7bb55-zxmp8" Dec 05 20:26:55 crc kubenswrapper[4904]: I1205 20:26:55.825800 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2230230c-8a27-44a4-a63a-219e0e40f288-apiservice-cert\") pod \"metallb-operator-webhook-server-5595c7bb55-zxmp8\" (UID: \"2230230c-8a27-44a4-a63a-219e0e40f288\") " pod="metallb-system/metallb-operator-webhook-server-5595c7bb55-zxmp8" Dec 05 20:26:55 crc kubenswrapper[4904]: I1205 20:26:55.831628 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2230230c-8a27-44a4-a63a-219e0e40f288-apiservice-cert\") pod \"metallb-operator-webhook-server-5595c7bb55-zxmp8\" (UID: \"2230230c-8a27-44a4-a63a-219e0e40f288\") " pod="metallb-system/metallb-operator-webhook-server-5595c7bb55-zxmp8" Dec 05 20:26:55 crc kubenswrapper[4904]: I1205 20:26:55.831681 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2230230c-8a27-44a4-a63a-219e0e40f288-webhook-cert\") pod \"metallb-operator-webhook-server-5595c7bb55-zxmp8\" (UID: \"2230230c-8a27-44a4-a63a-219e0e40f288\") " pod="metallb-system/metallb-operator-webhook-server-5595c7bb55-zxmp8" Dec 05 20:26:55 crc kubenswrapper[4904]: I1205 20:26:55.843473 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjv2k\" (UniqueName: \"kubernetes.io/projected/2230230c-8a27-44a4-a63a-219e0e40f288-kube-api-access-jjv2k\") pod \"metallb-operator-webhook-server-5595c7bb55-zxmp8\" (UID: \"2230230c-8a27-44a4-a63a-219e0e40f288\") " pod="metallb-system/metallb-operator-webhook-server-5595c7bb55-zxmp8" Dec 05 20:26:55 crc kubenswrapper[4904]: I1205 20:26:55.874850 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-58cc54b6b6-b77qk"] Dec 05 20:26:55 crc kubenswrapper[4904]: W1205 20:26:55.883370 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54d3f8f4_a4d9_4f3c_b923_f64d6c4f1b28.slice/crio-98e21287ec233f384a2b9eda64dd32a68339bda0ac7b491a01bab5e73643c2b6 WatchSource:0}: Error finding container 98e21287ec233f384a2b9eda64dd32a68339bda0ac7b491a01bab5e73643c2b6: Status 404 returned error can't find the container with id 98e21287ec233f384a2b9eda64dd32a68339bda0ac7b491a01bab5e73643c2b6 Dec 05 20:26:55 crc kubenswrapper[4904]: I1205 20:26:55.977016 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5595c7bb55-zxmp8" Dec 05 20:26:56 crc kubenswrapper[4904]: I1205 20:26:56.418638 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5595c7bb55-zxmp8"] Dec 05 20:26:56 crc kubenswrapper[4904]: W1205 20:26:56.425931 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2230230c_8a27_44a4_a63a_219e0e40f288.slice/crio-aff115ead59ccfe7b8821d13a5f688397e2429e5546530336a7cf8f4ad8e54e3 WatchSource:0}: Error finding container aff115ead59ccfe7b8821d13a5f688397e2429e5546530336a7cf8f4ad8e54e3: Status 404 returned error can't find the container with id aff115ead59ccfe7b8821d13a5f688397e2429e5546530336a7cf8f4ad8e54e3 Dec 05 20:26:56 crc kubenswrapper[4904]: I1205 20:26:56.783826 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-58cc54b6b6-b77qk" event={"ID":"54d3f8f4-a4d9-4f3c-b923-f64d6c4f1b28","Type":"ContainerStarted","Data":"98e21287ec233f384a2b9eda64dd32a68339bda0ac7b491a01bab5e73643c2b6"} Dec 05 20:26:56 crc kubenswrapper[4904]: I1205 20:26:56.786463 4904 generic.go:334] "Generic (PLEG): container finished" podID="e0060183-15f1-4cb6-ab22-b171c9fc86ee" containerID="00d0aea0560a77b24b88f3e6f513b81e8d633e9b810a1c564b77995a7c8002c3" exitCode=0 Dec 05 20:26:56 crc kubenswrapper[4904]: I1205 20:26:56.786548 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mjm4x" event={"ID":"e0060183-15f1-4cb6-ab22-b171c9fc86ee","Type":"ContainerDied","Data":"00d0aea0560a77b24b88f3e6f513b81e8d633e9b810a1c564b77995a7c8002c3"} Dec 05 20:26:56 crc kubenswrapper[4904]: I1205 20:26:56.788523 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5595c7bb55-zxmp8" event={"ID":"2230230c-8a27-44a4-a63a-219e0e40f288","Type":"ContainerStarted","Data":"aff115ead59ccfe7b8821d13a5f688397e2429e5546530336a7cf8f4ad8e54e3"} Dec 05 20:26:57 crc kubenswrapper[4904]: I1205 20:26:57.460679 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qf2xk"] Dec 05 20:26:57 crc kubenswrapper[4904]: I1205 20:26:57.462374 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qf2xk" Dec 05 20:26:57 crc kubenswrapper[4904]: I1205 20:26:57.490097 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qf2xk"] Dec 05 20:26:57 crc kubenswrapper[4904]: I1205 20:26:57.649229 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4912f2a7-ae28-4ec6-a674-813c38c327c0-utilities\") pod \"certified-operators-qf2xk\" (UID: \"4912f2a7-ae28-4ec6-a674-813c38c327c0\") " pod="openshift-marketplace/certified-operators-qf2xk" Dec 05 20:26:57 crc kubenswrapper[4904]: I1205 20:26:57.649325 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4912f2a7-ae28-4ec6-a674-813c38c327c0-catalog-content\") pod \"certified-operators-qf2xk\" (UID: \"4912f2a7-ae28-4ec6-a674-813c38c327c0\") " pod="openshift-marketplace/certified-operators-qf2xk" Dec 05 20:26:57 crc kubenswrapper[4904]: I1205 20:26:57.649350 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm6pv\" (UniqueName: \"kubernetes.io/projected/4912f2a7-ae28-4ec6-a674-813c38c327c0-kube-api-access-cm6pv\") pod \"certified-operators-qf2xk\" (UID: \"4912f2a7-ae28-4ec6-a674-813c38c327c0\") " pod="openshift-marketplace/certified-operators-qf2xk" Dec 05 20:26:57 crc kubenswrapper[4904]: I1205 20:26:57.750581 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4912f2a7-ae28-4ec6-a674-813c38c327c0-catalog-content\") pod \"certified-operators-qf2xk\" (UID: \"4912f2a7-ae28-4ec6-a674-813c38c327c0\") " pod="openshift-marketplace/certified-operators-qf2xk" Dec 05 20:26:57 crc kubenswrapper[4904]: I1205 20:26:57.750636 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm6pv\" (UniqueName: \"kubernetes.io/projected/4912f2a7-ae28-4ec6-a674-813c38c327c0-kube-api-access-cm6pv\") pod \"certified-operators-qf2xk\" (UID: \"4912f2a7-ae28-4ec6-a674-813c38c327c0\") " pod="openshift-marketplace/certified-operators-qf2xk" Dec 05 20:26:57 crc kubenswrapper[4904]: I1205 20:26:57.750693 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4912f2a7-ae28-4ec6-a674-813c38c327c0-utilities\") pod \"certified-operators-qf2xk\" (UID: \"4912f2a7-ae28-4ec6-a674-813c38c327c0\") " pod="openshift-marketplace/certified-operators-qf2xk" Dec 05 20:26:57 crc kubenswrapper[4904]: I1205 20:26:57.751257 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4912f2a7-ae28-4ec6-a674-813c38c327c0-catalog-content\") pod \"certified-operators-qf2xk\" (UID: \"4912f2a7-ae28-4ec6-a674-813c38c327c0\") " pod="openshift-marketplace/certified-operators-qf2xk" Dec 05 20:26:57 crc kubenswrapper[4904]: I1205 20:26:57.751276 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4912f2a7-ae28-4ec6-a674-813c38c327c0-utilities\") pod \"certified-operators-qf2xk\" (UID: \"4912f2a7-ae28-4ec6-a674-813c38c327c0\") " pod="openshift-marketplace/certified-operators-qf2xk" Dec 05 20:26:57 crc kubenswrapper[4904]: I1205 20:26:57.798886 4904 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-cm6pv\" (UniqueName: \"kubernetes.io/projected/4912f2a7-ae28-4ec6-a674-813c38c327c0-kube-api-access-cm6pv\") pod \"certified-operators-qf2xk\" (UID: \"4912f2a7-ae28-4ec6-a674-813c38c327c0\") " pod="openshift-marketplace/certified-operators-qf2xk" Dec 05 20:26:57 crc kubenswrapper[4904]: I1205 20:26:57.811088 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mjm4x" event={"ID":"e0060183-15f1-4cb6-ab22-b171c9fc86ee","Type":"ContainerStarted","Data":"71ec3594005ebf3c2cede5777c15833f5cc6a5813d417124ff602d12449f0473"} Dec 05 20:26:57 crc kubenswrapper[4904]: I1205 20:26:57.858537 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mjm4x" podStartSLOduration=2.35017249 podStartE2EDuration="3.858522041s" podCreationTimestamp="2025-12-05 20:26:54 +0000 UTC" firstStartedPulling="2025-12-05 20:26:55.778874946 +0000 UTC m=+914.590091055" lastFinishedPulling="2025-12-05 20:26:57.287224497 +0000 UTC m=+916.098440606" observedRunningTime="2025-12-05 20:26:57.856965529 +0000 UTC m=+916.668181648" watchObservedRunningTime="2025-12-05 20:26:57.858522041 +0000 UTC m=+916.669738150" Dec 05 20:26:58 crc kubenswrapper[4904]: I1205 20:26:58.080687 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qf2xk" Dec 05 20:26:58 crc kubenswrapper[4904]: I1205 20:26:58.458981 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8nskr"] Dec 05 20:26:58 crc kubenswrapper[4904]: I1205 20:26:58.459547 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8nskr" podUID="d422a85a-3a49-455b-bb05-4c931c24a960" containerName="registry-server" containerID="cri-o://eef7b08ebb2f52030424f0c18d841ee96e0b41739bbb8512183892cb9c94efbc" gracePeriod=2 Dec 05 20:26:58 crc kubenswrapper[4904]: I1205 20:26:58.789663 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qf2xk"] Dec 05 20:26:58 crc kubenswrapper[4904]: I1205 20:26:58.820663 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qf2xk" event={"ID":"4912f2a7-ae28-4ec6-a674-813c38c327c0","Type":"ContainerStarted","Data":"fe68af448952614fabd37ddd76ca920ba1d711821340652183aaf5f61a6adc88"} Dec 05 20:26:59 crc kubenswrapper[4904]: I1205 20:26:59.826308 4904 generic.go:334] "Generic (PLEG): container finished" podID="4912f2a7-ae28-4ec6-a674-813c38c327c0" containerID="c1416d2d4dc34de5e0502818e9dfbda523a194a23fbc55185bfc5592bb620700" exitCode=0 Dec 05 20:26:59 crc kubenswrapper[4904]: I1205 20:26:59.826358 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qf2xk" event={"ID":"4912f2a7-ae28-4ec6-a674-813c38c327c0","Type":"ContainerDied","Data":"c1416d2d4dc34de5e0502818e9dfbda523a194a23fbc55185bfc5592bb620700"} Dec 05 20:26:59 crc kubenswrapper[4904]: I1205 20:26:59.828960 4904 generic.go:334] "Generic (PLEG): container finished" podID="d422a85a-3a49-455b-bb05-4c931c24a960" containerID="eef7b08ebb2f52030424f0c18d841ee96e0b41739bbb8512183892cb9c94efbc" exitCode=0 Dec 05 20:26:59 crc kubenswrapper[4904]: I1205 20:26:59.829007 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8nskr" 
event={"ID":"d422a85a-3a49-455b-bb05-4c931c24a960","Type":"ContainerDied","Data":"eef7b08ebb2f52030424f0c18d841ee96e0b41739bbb8512183892cb9c94efbc"} Dec 05 20:27:00 crc kubenswrapper[4904]: I1205 20:27:00.514170 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8nskr" Dec 05 20:27:00 crc kubenswrapper[4904]: I1205 20:27:00.607838 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d422a85a-3a49-455b-bb05-4c931c24a960-catalog-content\") pod \"d422a85a-3a49-455b-bb05-4c931c24a960\" (UID: \"d422a85a-3a49-455b-bb05-4c931c24a960\") " Dec 05 20:27:00 crc kubenswrapper[4904]: I1205 20:27:00.607894 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d422a85a-3a49-455b-bb05-4c931c24a960-utilities\") pod \"d422a85a-3a49-455b-bb05-4c931c24a960\" (UID: \"d422a85a-3a49-455b-bb05-4c931c24a960\") " Dec 05 20:27:00 crc kubenswrapper[4904]: I1205 20:27:00.607983 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mqrz\" (UniqueName: \"kubernetes.io/projected/d422a85a-3a49-455b-bb05-4c931c24a960-kube-api-access-9mqrz\") pod \"d422a85a-3a49-455b-bb05-4c931c24a960\" (UID: \"d422a85a-3a49-455b-bb05-4c931c24a960\") " Dec 05 20:27:00 crc kubenswrapper[4904]: I1205 20:27:00.608881 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d422a85a-3a49-455b-bb05-4c931c24a960-utilities" (OuterVolumeSpecName: "utilities") pod "d422a85a-3a49-455b-bb05-4c931c24a960" (UID: "d422a85a-3a49-455b-bb05-4c931c24a960"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:27:00 crc kubenswrapper[4904]: I1205 20:27:00.617450 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d422a85a-3a49-455b-bb05-4c931c24a960-kube-api-access-9mqrz" (OuterVolumeSpecName: "kube-api-access-9mqrz") pod "d422a85a-3a49-455b-bb05-4c931c24a960" (UID: "d422a85a-3a49-455b-bb05-4c931c24a960"). InnerVolumeSpecName "kube-api-access-9mqrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:27:00 crc kubenswrapper[4904]: I1205 20:27:00.664518 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d422a85a-3a49-455b-bb05-4c931c24a960-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d422a85a-3a49-455b-bb05-4c931c24a960" (UID: "d422a85a-3a49-455b-bb05-4c931c24a960"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:27:00 crc kubenswrapper[4904]: I1205 20:27:00.709696 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mqrz\" (UniqueName: \"kubernetes.io/projected/d422a85a-3a49-455b-bb05-4c931c24a960-kube-api-access-9mqrz\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:00 crc kubenswrapper[4904]: I1205 20:27:00.709729 4904 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d422a85a-3a49-455b-bb05-4c931c24a960-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:00 crc kubenswrapper[4904]: I1205 20:27:00.709738 4904 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d422a85a-3a49-455b-bb05-4c931c24a960-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:00 crc kubenswrapper[4904]: I1205 20:27:00.838272 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8nskr" event={"ID":"d422a85a-3a49-455b-bb05-4c931c24a960","Type":"ContainerDied","Data":"a6c0957e05aed907286a856895597f9ae3c7f284fa1bb02c60326551c7f675b0"} Dec 05 20:27:00 crc kubenswrapper[4904]: I1205 20:27:00.838319 4904 scope.go:117] "RemoveContainer" containerID="eef7b08ebb2f52030424f0c18d841ee96e0b41739bbb8512183892cb9c94efbc" Dec 05 20:27:00 crc kubenswrapper[4904]: I1205 20:27:00.838405 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8nskr" Dec 05 20:27:00 crc kubenswrapper[4904]: I1205 20:27:00.866773 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8nskr"] Dec 05 20:27:00 crc kubenswrapper[4904]: I1205 20:27:00.872983 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8nskr"] Dec 05 20:27:01 crc kubenswrapper[4904]: I1205 20:27:01.718369 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d422a85a-3a49-455b-bb05-4c931c24a960" path="/var/lib/kubelet/pods/d422a85a-3a49-455b-bb05-4c931c24a960/volumes" Dec 05 20:27:02 crc kubenswrapper[4904]: I1205 20:27:02.160730 4904 scope.go:117] "RemoveContainer" containerID="01787c1a9de9fcf8bf198e9ca548bb10f1086b6ab388e3f3952b211a82f56c36" Dec 05 20:27:02 crc kubenswrapper[4904]: I1205 20:27:02.182302 4904 scope.go:117] "RemoveContainer" containerID="ec5fb01e4aa6b028ffd394509b14d406c57c2633710098d3bebd2362ed081c29" Dec 05 20:27:02 crc kubenswrapper[4904]: I1205 20:27:02.853562 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-58cc54b6b6-b77qk" event={"ID":"54d3f8f4-a4d9-4f3c-b923-f64d6c4f1b28","Type":"ContainerStarted","Data":"53052282b6a1a738091d9687126f56ff0d765c10e99cf2139fe430c9c247233f"} Dec 05 20:27:02 crc kubenswrapper[4904]: I1205 20:27:02.855115 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-58cc54b6b6-b77qk" Dec 05 20:27:02 crc kubenswrapper[4904]: I1205 20:27:02.868693 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5595c7bb55-zxmp8" event={"ID":"2230230c-8a27-44a4-a63a-219e0e40f288","Type":"ContainerStarted","Data":"856edcf8cd969c90830c681e7ad1406cf532c3cb72323b61b66f23ecc89ae268"} Dec 05 20:27:02 crc kubenswrapper[4904]: I1205 20:27:02.868843 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-webhook-server-5595c7bb55-zxmp8" Dec 05 20:27:02 crc kubenswrapper[4904]: I1205 20:27:02.878092 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-58cc54b6b6-b77qk" podStartSLOduration=2.591046049 podStartE2EDuration="8.878033273s" podCreationTimestamp="2025-12-05 20:26:54 +0000 UTC" firstStartedPulling="2025-12-05 20:26:55.886262922 +0000 UTC m=+914.697479031" lastFinishedPulling="2025-12-05 20:27:02.173250146 +0000 UTC m=+920.984466255" observedRunningTime="2025-12-05 20:27:02.871711679 +0000 UTC m=+921.682927798" watchObservedRunningTime="2025-12-05 20:27:02.878033273 +0000 UTC m=+921.689249382" Dec 05 20:27:04 crc kubenswrapper[4904]: I1205 20:27:04.573613 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mjm4x" Dec 05 20:27:04 crc kubenswrapper[4904]: I1205 20:27:04.573669 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mjm4x" Dec 05 20:27:04 crc kubenswrapper[4904]: I1205 20:27:04.653487 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mjm4x" Dec 05 20:27:04 crc kubenswrapper[4904]: I1205 20:27:04.676114 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-5595c7bb55-zxmp8" podStartSLOduration=3.932394933 podStartE2EDuration="9.676096423s" podCreationTimestamp="2025-12-05 20:26:55 +0000 UTC" firstStartedPulling="2025-12-05 20:26:56.44132931 +0000 UTC m=+915.252545419" lastFinishedPulling="2025-12-05 20:27:02.1850308 +0000 UTC m=+920.996246909" observedRunningTime="2025-12-05 20:27:02.893814236 +0000 UTC m=+921.705030375" watchObservedRunningTime="2025-12-05 20:27:04.676096423 +0000 UTC m=+923.487312542" Dec 05 20:27:04 crc kubenswrapper[4904]: I1205 20:27:04.959086 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mjm4x" Dec 05 20:27:07 crc kubenswrapper[4904]: I1205 20:27:07.914372 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qf2xk" event={"ID":"4912f2a7-ae28-4ec6-a674-813c38c327c0","Type":"ContainerStarted","Data":"a5bdd220ab9e2a198b93b67b03c54b9ffba447a94f8da325806cc8892ccb0ebe"} Dec 05 20:27:08 crc kubenswrapper[4904]: I1205 20:27:08.052282 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mjm4x"] Dec 05 20:27:08 crc kubenswrapper[4904]: I1205 20:27:08.052524 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mjm4x" podUID="e0060183-15f1-4cb6-ab22-b171c9fc86ee" containerName="registry-server" containerID="cri-o://71ec3594005ebf3c2cede5777c15833f5cc6a5813d417124ff602d12449f0473" gracePeriod=2 Dec 05 20:27:08 crc kubenswrapper[4904]: I1205 20:27:08.934453 4904 generic.go:334] "Generic (PLEG): container finished" podID="e0060183-15f1-4cb6-ab22-b171c9fc86ee" containerID="71ec3594005ebf3c2cede5777c15833f5cc6a5813d417124ff602d12449f0473" exitCode=0 Dec 05 20:27:08 crc kubenswrapper[4904]: I1205 20:27:08.934578 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mjm4x" 
event={"ID":"e0060183-15f1-4cb6-ab22-b171c9fc86ee","Type":"ContainerDied","Data":"71ec3594005ebf3c2cede5777c15833f5cc6a5813d417124ff602d12449f0473"} Dec 05 20:27:08 crc kubenswrapper[4904]: I1205 20:27:08.938021 4904 generic.go:334] "Generic (PLEG): container finished" podID="4912f2a7-ae28-4ec6-a674-813c38c327c0" containerID="a5bdd220ab9e2a198b93b67b03c54b9ffba447a94f8da325806cc8892ccb0ebe" exitCode=0 Dec 05 20:27:08 crc kubenswrapper[4904]: I1205 20:27:08.938090 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qf2xk" event={"ID":"4912f2a7-ae28-4ec6-a674-813c38c327c0","Type":"ContainerDied","Data":"a5bdd220ab9e2a198b93b67b03c54b9ffba447a94f8da325806cc8892ccb0ebe"} Dec 05 20:27:08 crc kubenswrapper[4904]: I1205 20:27:08.989637 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mjm4x" Dec 05 20:27:09 crc kubenswrapper[4904]: I1205 20:27:09.147913 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0060183-15f1-4cb6-ab22-b171c9fc86ee-utilities\") pod \"e0060183-15f1-4cb6-ab22-b171c9fc86ee\" (UID: \"e0060183-15f1-4cb6-ab22-b171c9fc86ee\") " Dec 05 20:27:09 crc kubenswrapper[4904]: I1205 20:27:09.148000 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bpzc\" (UniqueName: \"kubernetes.io/projected/e0060183-15f1-4cb6-ab22-b171c9fc86ee-kube-api-access-8bpzc\") pod \"e0060183-15f1-4cb6-ab22-b171c9fc86ee\" (UID: \"e0060183-15f1-4cb6-ab22-b171c9fc86ee\") " Dec 05 20:27:09 crc kubenswrapper[4904]: I1205 20:27:09.148074 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0060183-15f1-4cb6-ab22-b171c9fc86ee-catalog-content\") pod \"e0060183-15f1-4cb6-ab22-b171c9fc86ee\" (UID: \"e0060183-15f1-4cb6-ab22-b171c9fc86ee\") " Dec 05 20:27:09 crc kubenswrapper[4904]: I1205 20:27:09.148802 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0060183-15f1-4cb6-ab22-b171c9fc86ee-utilities" (OuterVolumeSpecName: "utilities") pod "e0060183-15f1-4cb6-ab22-b171c9fc86ee" (UID: "e0060183-15f1-4cb6-ab22-b171c9fc86ee"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:27:09 crc kubenswrapper[4904]: I1205 20:27:09.161371 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0060183-15f1-4cb6-ab22-b171c9fc86ee-kube-api-access-8bpzc" (OuterVolumeSpecName: "kube-api-access-8bpzc") pod "e0060183-15f1-4cb6-ab22-b171c9fc86ee" (UID: "e0060183-15f1-4cb6-ab22-b171c9fc86ee"). InnerVolumeSpecName "kube-api-access-8bpzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:27:09 crc kubenswrapper[4904]: I1205 20:27:09.180373 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0060183-15f1-4cb6-ab22-b171c9fc86ee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e0060183-15f1-4cb6-ab22-b171c9fc86ee" (UID: "e0060183-15f1-4cb6-ab22-b171c9fc86ee"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:27:09 crc kubenswrapper[4904]: I1205 20:27:09.249312 4904 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0060183-15f1-4cb6-ab22-b171c9fc86ee-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:09 crc kubenswrapper[4904]: I1205 20:27:09.249803 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bpzc\" (UniqueName: \"kubernetes.io/projected/e0060183-15f1-4cb6-ab22-b171c9fc86ee-kube-api-access-8bpzc\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:09 crc kubenswrapper[4904]: I1205 20:27:09.249822 4904 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0060183-15f1-4cb6-ab22-b171c9fc86ee-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:09 crc kubenswrapper[4904]: I1205 20:27:09.945218 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qf2xk" event={"ID":"4912f2a7-ae28-4ec6-a674-813c38c327c0","Type":"ContainerStarted","Data":"9a62531966760a2293c37f8c5649f0273ba224f2a358185674f592c9025d7be8"} Dec 05 20:27:09 crc kubenswrapper[4904]: I1205 20:27:09.947648 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mjm4x" event={"ID":"e0060183-15f1-4cb6-ab22-b171c9fc86ee","Type":"ContainerDied","Data":"a6b3dd2d1a53a3b5fd2335deb0afb19900460651f2ea6772f4742d6247471903"} Dec 05 20:27:09 crc kubenswrapper[4904]: I1205 20:27:09.947704 4904 scope.go:117] "RemoveContainer" containerID="71ec3594005ebf3c2cede5777c15833f5cc6a5813d417124ff602d12449f0473" Dec 05 20:27:09 crc kubenswrapper[4904]: I1205 20:27:09.947704 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mjm4x" Dec 05 20:27:09 crc kubenswrapper[4904]: I1205 20:27:09.967508 4904 scope.go:117] "RemoveContainer" containerID="00d0aea0560a77b24b88f3e6f513b81e8d633e9b810a1c564b77995a7c8002c3" Dec 05 20:27:09 crc kubenswrapper[4904]: I1205 20:27:09.969151 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qf2xk" podStartSLOduration=4.03655872 podStartE2EDuration="12.969133089s" podCreationTimestamp="2025-12-05 20:26:57 +0000 UTC" firstStartedPulling="2025-12-05 20:27:00.454802311 +0000 UTC m=+919.266018420" lastFinishedPulling="2025-12-05 20:27:09.38737667 +0000 UTC m=+928.198592789" observedRunningTime="2025-12-05 20:27:09.96803206 +0000 UTC m=+928.779248179" watchObservedRunningTime="2025-12-05 20:27:09.969133089 +0000 UTC m=+928.780349208" Dec 05 20:27:09 crc kubenswrapper[4904]: I1205 20:27:09.983361 4904 scope.go:117] "RemoveContainer" containerID="b12e36d6eb8472808ef8de2d5fe5924faa3594995d334de9bed5f7e8de0dc201" Dec 05 20:27:09 crc kubenswrapper[4904]: I1205 20:27:09.989984 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mjm4x"] Dec 05 20:27:09 crc kubenswrapper[4904]: I1205 20:27:09.996532 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mjm4x"] Dec 05 20:27:11 crc kubenswrapper[4904]: I1205 20:27:11.689571 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0060183-15f1-4cb6-ab22-b171c9fc86ee" path="/var/lib/kubelet/pods/e0060183-15f1-4cb6-ab22-b171c9fc86ee/volumes" Dec 05 20:27:15 crc kubenswrapper[4904]: I1205 20:27:15.982604 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-5595c7bb55-zxmp8" Dec 05 20:27:18 crc kubenswrapper[4904]: I1205 20:27:18.081199 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qf2xk" Dec 05 20:27:18 crc kubenswrapper[4904]: I1205 20:27:18.082276 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qf2xk" Dec 05 20:27:18 crc kubenswrapper[4904]: I1205 20:27:18.128744 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qf2xk" Dec 05 20:27:19 crc kubenswrapper[4904]: I1205 20:27:19.066590 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qf2xk" Dec 05 20:27:20 crc kubenswrapper[4904]: I1205 20:27:20.071132 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qf2xk"] Dec 05 20:27:20 crc kubenswrapper[4904]: I1205 20:27:20.456353 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rqt9x"] Dec 05 20:27:20 crc kubenswrapper[4904]: I1205 20:27:20.456796 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rqt9x" podUID="e9e46cf7-6d81-4d8f-8fa8-d56a6555272f" containerName="registry-server" containerID="cri-o://5bd32f9bfa163e888187fe00aa1835fd8bdfeceb814bc551033da444839bd1a7" gracePeriod=2 Dec 05 20:27:21 crc kubenswrapper[4904]: I1205 20:27:21.016650 4904 generic.go:334] "Generic (PLEG): container finished" podID="e9e46cf7-6d81-4d8f-8fa8-d56a6555272f" 
containerID="5bd32f9bfa163e888187fe00aa1835fd8bdfeceb814bc551033da444839bd1a7" exitCode=0 Dec 05 20:27:21 crc kubenswrapper[4904]: I1205 20:27:21.017575 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqt9x" event={"ID":"e9e46cf7-6d81-4d8f-8fa8-d56a6555272f","Type":"ContainerDied","Data":"5bd32f9bfa163e888187fe00aa1835fd8bdfeceb814bc551033da444839bd1a7"} Dec 05 20:27:21 crc kubenswrapper[4904]: I1205 20:27:21.361335 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rqt9x" Dec 05 20:27:21 crc kubenswrapper[4904]: I1205 20:27:21.517304 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9e46cf7-6d81-4d8f-8fa8-d56a6555272f-utilities\") pod \"e9e46cf7-6d81-4d8f-8fa8-d56a6555272f\" (UID: \"e9e46cf7-6d81-4d8f-8fa8-d56a6555272f\") " Dec 05 20:27:21 crc kubenswrapper[4904]: I1205 20:27:21.517449 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzgs2\" (UniqueName: \"kubernetes.io/projected/e9e46cf7-6d81-4d8f-8fa8-d56a6555272f-kube-api-access-kzgs2\") pod \"e9e46cf7-6d81-4d8f-8fa8-d56a6555272f\" (UID: \"e9e46cf7-6d81-4d8f-8fa8-d56a6555272f\") " Dec 05 20:27:21 crc kubenswrapper[4904]: I1205 20:27:21.517549 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9e46cf7-6d81-4d8f-8fa8-d56a6555272f-catalog-content\") pod \"e9e46cf7-6d81-4d8f-8fa8-d56a6555272f\" (UID: \"e9e46cf7-6d81-4d8f-8fa8-d56a6555272f\") " Dec 05 20:27:21 crc kubenswrapper[4904]: I1205 20:27:21.518048 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9e46cf7-6d81-4d8f-8fa8-d56a6555272f-utilities" (OuterVolumeSpecName: "utilities") pod "e9e46cf7-6d81-4d8f-8fa8-d56a6555272f" (UID: "e9e46cf7-6d81-4d8f-8fa8-d56a6555272f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:27:21 crc kubenswrapper[4904]: I1205 20:27:21.526077 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9e46cf7-6d81-4d8f-8fa8-d56a6555272f-kube-api-access-kzgs2" (OuterVolumeSpecName: "kube-api-access-kzgs2") pod "e9e46cf7-6d81-4d8f-8fa8-d56a6555272f" (UID: "e9e46cf7-6d81-4d8f-8fa8-d56a6555272f"). InnerVolumeSpecName "kube-api-access-kzgs2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:27:21 crc kubenswrapper[4904]: I1205 20:27:21.574526 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9e46cf7-6d81-4d8f-8fa8-d56a6555272f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e9e46cf7-6d81-4d8f-8fa8-d56a6555272f" (UID: "e9e46cf7-6d81-4d8f-8fa8-d56a6555272f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:27:21 crc kubenswrapper[4904]: I1205 20:27:21.619766 4904 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9e46cf7-6d81-4d8f-8fa8-d56a6555272f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:21 crc kubenswrapper[4904]: I1205 20:27:21.619815 4904 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9e46cf7-6d81-4d8f-8fa8-d56a6555272f-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:21 crc kubenswrapper[4904]: I1205 20:27:21.619836 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzgs2\" (UniqueName: \"kubernetes.io/projected/e9e46cf7-6d81-4d8f-8fa8-d56a6555272f-kube-api-access-kzgs2\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:22 crc kubenswrapper[4904]: I1205 20:27:22.027228 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rqt9x" Dec 05 20:27:22 crc kubenswrapper[4904]: I1205 20:27:22.027809 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqt9x" event={"ID":"e9e46cf7-6d81-4d8f-8fa8-d56a6555272f","Type":"ContainerDied","Data":"54b543d9da79a617f13a72e47319845f5956637d2eb90fd8f01b4fe8b7ca78c3"} Dec 05 20:27:22 crc kubenswrapper[4904]: I1205 20:27:22.027855 4904 scope.go:117] "RemoveContainer" containerID="5bd32f9bfa163e888187fe00aa1835fd8bdfeceb814bc551033da444839bd1a7" Dec 05 20:27:22 crc kubenswrapper[4904]: I1205 20:27:22.045769 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rqt9x"] Dec 05 20:27:22 crc kubenswrapper[4904]: I1205 20:27:22.056612 4904 scope.go:117] "RemoveContainer" containerID="44a54eadb952c71b1b3f2cab8cbd8cd95f2c49b11d4e46810c3a82f0ae6707a0" Dec 05 20:27:22 crc kubenswrapper[4904]: I1205 20:27:22.056771 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rqt9x"] Dec 05 20:27:22 crc kubenswrapper[4904]: I1205 20:27:22.084206 4904 scope.go:117] "RemoveContainer" containerID="afdd49f1e531d44b338fab66b0dce97ee2deae7160fb65606c98aead9171d6f8" Dec 05 20:27:23 crc kubenswrapper[4904]: I1205 20:27:23.696535 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9e46cf7-6d81-4d8f-8fa8-d56a6555272f" path="/var/lib/kubelet/pods/e9e46cf7-6d81-4d8f-8fa8-d56a6555272f/volumes" Dec 05 20:27:35 crc kubenswrapper[4904]: I1205 20:27:35.321098 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-58cc54b6b6-b77qk" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.121601 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-hz9g9"] Dec 05 20:27:36 crc kubenswrapper[4904]: E1205 20:27:36.121984 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d422a85a-3a49-455b-bb05-4c931c24a960" containerName="extract-utilities" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.122011 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="d422a85a-3a49-455b-bb05-4c931c24a960" containerName="extract-utilities" Dec 05 20:27:36 crc kubenswrapper[4904]: E1205 20:27:36.122029 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d422a85a-3a49-455b-bb05-4c931c24a960" containerName="registry-server" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.122041 4904 
state_mem.go:107] "Deleted CPUSet assignment" podUID="d422a85a-3a49-455b-bb05-4c931c24a960" containerName="registry-server" Dec 05 20:27:36 crc kubenswrapper[4904]: E1205 20:27:36.122078 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0060183-15f1-4cb6-ab22-b171c9fc86ee" containerName="extract-utilities" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.122091 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0060183-15f1-4cb6-ab22-b171c9fc86ee" containerName="extract-utilities" Dec 05 20:27:36 crc kubenswrapper[4904]: E1205 20:27:36.122123 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9e46cf7-6d81-4d8f-8fa8-d56a6555272f" containerName="registry-server" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.122136 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9e46cf7-6d81-4d8f-8fa8-d56a6555272f" containerName="registry-server" Dec 05 20:27:36 crc kubenswrapper[4904]: E1205 20:27:36.122166 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0060183-15f1-4cb6-ab22-b171c9fc86ee" containerName="extract-content" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.122179 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0060183-15f1-4cb6-ab22-b171c9fc86ee" containerName="extract-content" Dec 05 20:27:36 crc kubenswrapper[4904]: E1205 20:27:36.122201 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9e46cf7-6d81-4d8f-8fa8-d56a6555272f" containerName="extract-content" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.122213 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9e46cf7-6d81-4d8f-8fa8-d56a6555272f" containerName="extract-content" Dec 05 20:27:36 crc kubenswrapper[4904]: E1205 20:27:36.122235 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d422a85a-3a49-455b-bb05-4c931c24a960" containerName="extract-content" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.122248 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="d422a85a-3a49-455b-bb05-4c931c24a960" containerName="extract-content" Dec 05 20:27:36 crc kubenswrapper[4904]: E1205 20:27:36.122266 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0060183-15f1-4cb6-ab22-b171c9fc86ee" containerName="registry-server" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.122278 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0060183-15f1-4cb6-ab22-b171c9fc86ee" containerName="registry-server" Dec 05 20:27:36 crc kubenswrapper[4904]: E1205 20:27:36.122295 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9e46cf7-6d81-4d8f-8fa8-d56a6555272f" containerName="extract-utilities" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.122306 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9e46cf7-6d81-4d8f-8fa8-d56a6555272f" containerName="extract-utilities" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.122520 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="d422a85a-3a49-455b-bb05-4c931c24a960" containerName="registry-server" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.122547 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0060183-15f1-4cb6-ab22-b171c9fc86ee" containerName="registry-server" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.122576 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9e46cf7-6d81-4d8f-8fa8-d56a6555272f" containerName="registry-server" Dec 05 20:27:36 crc 
kubenswrapper[4904]: I1205 20:27:36.126238 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-hz9g9" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.128650 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.129019 4904 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.129154 4904 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-z48bs" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.131199 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-dpnqq"] Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.132112 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-dpnqq" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.138486 4904 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.162108 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-dpnqq"] Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.227346 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-mvkbj"] Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.228864 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-mvkbj" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.231943 4904 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.232185 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.233638 4904 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-w2cnr" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.233760 4904 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.237026 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-v26v8"] Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.240174 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-v26v8" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.241881 4904 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.260162 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-v26v8"] Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.327104 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1f78e03d-db28-4ba2-830f-189c97051d36-frr-sockets\") pod \"frr-k8s-hz9g9\" (UID: \"1f78e03d-db28-4ba2-830f-189c97051d36\") " pod="metallb-system/frr-k8s-hz9g9" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.327173 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1f78e03d-db28-4ba2-830f-189c97051d36-frr-conf\") pod \"frr-k8s-hz9g9\" (UID: \"1f78e03d-db28-4ba2-830f-189c97051d36\") " pod="metallb-system/frr-k8s-hz9g9" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.327200 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1f78e03d-db28-4ba2-830f-189c97051d36-metrics-certs\") pod \"frr-k8s-hz9g9\" (UID: \"1f78e03d-db28-4ba2-830f-189c97051d36\") " pod="metallb-system/frr-k8s-hz9g9" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.327228 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1f78e03d-db28-4ba2-830f-189c97051d36-reloader\") pod \"frr-k8s-hz9g9\" (UID: \"1f78e03d-db28-4ba2-830f-189c97051d36\") " pod="metallb-system/frr-k8s-hz9g9" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.327251 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b7fd88f-360a-4c89-8740-069bc371b65b-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-dpnqq\" (UID: \"3b7fd88f-360a-4c89-8740-069bc371b65b\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-dpnqq" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.327267 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxsvd\" (UniqueName: \"kubernetes.io/projected/1f78e03d-db28-4ba2-830f-189c97051d36-kube-api-access-mxsvd\") pod \"frr-k8s-hz9g9\" (UID: \"1f78e03d-db28-4ba2-830f-189c97051d36\") " pod="metallb-system/frr-k8s-hz9g9" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.327287 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1f78e03d-db28-4ba2-830f-189c97051d36-frr-startup\") pod \"frr-k8s-hz9g9\" (UID: \"1f78e03d-db28-4ba2-830f-189c97051d36\") " pod="metallb-system/frr-k8s-hz9g9" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.327304 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1f78e03d-db28-4ba2-830f-189c97051d36-metrics\") pod \"frr-k8s-hz9g9\" (UID: \"1f78e03d-db28-4ba2-830f-189c97051d36\") " pod="metallb-system/frr-k8s-hz9g9" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.327319 4904 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcptd\" (UniqueName: \"kubernetes.io/projected/3b7fd88f-360a-4c89-8740-069bc371b65b-kube-api-access-kcptd\") pod \"frr-k8s-webhook-server-7fcb986d4-dpnqq\" (UID: \"3b7fd88f-360a-4c89-8740-069bc371b65b\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-dpnqq" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.429027 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2125e0a4-9809-42b4-911f-08c6d2e74879-cert\") pod \"controller-f8648f98b-v26v8\" (UID: \"2125e0a4-9809-42b4-911f-08c6d2e74879\") " pod="metallb-system/controller-f8648f98b-v26v8" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.429085 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nvhk\" (UniqueName: \"kubernetes.io/projected/2125e0a4-9809-42b4-911f-08c6d2e74879-kube-api-access-9nvhk\") pod \"controller-f8648f98b-v26v8\" (UID: \"2125e0a4-9809-42b4-911f-08c6d2e74879\") " pod="metallb-system/controller-f8648f98b-v26v8" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.429120 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1f78e03d-db28-4ba2-830f-189c97051d36-metrics-certs\") pod \"frr-k8s-hz9g9\" (UID: \"1f78e03d-db28-4ba2-830f-189c97051d36\") " pod="metallb-system/frr-k8s-hz9g9" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.429149 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/080b347a-f590-47cf-909f-578330838c1d-metallb-excludel2\") pod \"speaker-mvkbj\" (UID: \"080b347a-f590-47cf-909f-578330838c1d\") " pod="metallb-system/speaker-mvkbj" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.429173 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/080b347a-f590-47cf-909f-578330838c1d-metrics-certs\") pod \"speaker-mvkbj\" (UID: \"080b347a-f590-47cf-909f-578330838c1d\") " pod="metallb-system/speaker-mvkbj" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.429219 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1f78e03d-db28-4ba2-830f-189c97051d36-reloader\") pod \"frr-k8s-hz9g9\" (UID: \"1f78e03d-db28-4ba2-830f-189c97051d36\") " pod="metallb-system/frr-k8s-hz9g9" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.429244 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b7fd88f-360a-4c89-8740-069bc371b65b-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-dpnqq\" (UID: \"3b7fd88f-360a-4c89-8740-069bc371b65b\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-dpnqq" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.429258 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2125e0a4-9809-42b4-911f-08c6d2e74879-metrics-certs\") pod \"controller-f8648f98b-v26v8\" (UID: \"2125e0a4-9809-42b4-911f-08c6d2e74879\") " pod="metallb-system/controller-f8648f98b-v26v8" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.429274 4904 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-mxsvd\" (UniqueName: \"kubernetes.io/projected/1f78e03d-db28-4ba2-830f-189c97051d36-kube-api-access-mxsvd\") pod \"frr-k8s-hz9g9\" (UID: \"1f78e03d-db28-4ba2-830f-189c97051d36\") " pod="metallb-system/frr-k8s-hz9g9" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.429291 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/080b347a-f590-47cf-909f-578330838c1d-memberlist\") pod \"speaker-mvkbj\" (UID: \"080b347a-f590-47cf-909f-578330838c1d\") " pod="metallb-system/speaker-mvkbj" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.429311 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1f78e03d-db28-4ba2-830f-189c97051d36-frr-startup\") pod \"frr-k8s-hz9g9\" (UID: \"1f78e03d-db28-4ba2-830f-189c97051d36\") " pod="metallb-system/frr-k8s-hz9g9" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.429328 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xkmz\" (UniqueName: \"kubernetes.io/projected/080b347a-f590-47cf-909f-578330838c1d-kube-api-access-8xkmz\") pod \"speaker-mvkbj\" (UID: \"080b347a-f590-47cf-909f-578330838c1d\") " pod="metallb-system/speaker-mvkbj" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.429344 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1f78e03d-db28-4ba2-830f-189c97051d36-metrics\") pod \"frr-k8s-hz9g9\" (UID: \"1f78e03d-db28-4ba2-830f-189c97051d36\") " pod="metallb-system/frr-k8s-hz9g9" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.429360 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcptd\" (UniqueName: \"kubernetes.io/projected/3b7fd88f-360a-4c89-8740-069bc371b65b-kube-api-access-kcptd\") pod \"frr-k8s-webhook-server-7fcb986d4-dpnqq\" (UID: \"3b7fd88f-360a-4c89-8740-069bc371b65b\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-dpnqq" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.429390 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1f78e03d-db28-4ba2-830f-189c97051d36-frr-sockets\") pod \"frr-k8s-hz9g9\" (UID: \"1f78e03d-db28-4ba2-830f-189c97051d36\") " pod="metallb-system/frr-k8s-hz9g9" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.429416 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1f78e03d-db28-4ba2-830f-189c97051d36-frr-conf\") pod \"frr-k8s-hz9g9\" (UID: \"1f78e03d-db28-4ba2-830f-189c97051d36\") " pod="metallb-system/frr-k8s-hz9g9" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.429897 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1f78e03d-db28-4ba2-830f-189c97051d36-frr-conf\") pod \"frr-k8s-hz9g9\" (UID: \"1f78e03d-db28-4ba2-830f-189c97051d36\") " pod="metallb-system/frr-k8s-hz9g9" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.430428 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1f78e03d-db28-4ba2-830f-189c97051d36-metrics\") pod \"frr-k8s-hz9g9\" (UID: \"1f78e03d-db28-4ba2-830f-189c97051d36\") " 
pod="metallb-system/frr-k8s-hz9g9" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.430589 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1f78e03d-db28-4ba2-830f-189c97051d36-frr-startup\") pod \"frr-k8s-hz9g9\" (UID: \"1f78e03d-db28-4ba2-830f-189c97051d36\") " pod="metallb-system/frr-k8s-hz9g9" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.430790 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1f78e03d-db28-4ba2-830f-189c97051d36-frr-sockets\") pod \"frr-k8s-hz9g9\" (UID: \"1f78e03d-db28-4ba2-830f-189c97051d36\") " pod="metallb-system/frr-k8s-hz9g9" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.431182 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1f78e03d-db28-4ba2-830f-189c97051d36-reloader\") pod \"frr-k8s-hz9g9\" (UID: \"1f78e03d-db28-4ba2-830f-189c97051d36\") " pod="metallb-system/frr-k8s-hz9g9" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.435478 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1f78e03d-db28-4ba2-830f-189c97051d36-metrics-certs\") pod \"frr-k8s-hz9g9\" (UID: \"1f78e03d-db28-4ba2-830f-189c97051d36\") " pod="metallb-system/frr-k8s-hz9g9" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.436666 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b7fd88f-360a-4c89-8740-069bc371b65b-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-dpnqq\" (UID: \"3b7fd88f-360a-4c89-8740-069bc371b65b\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-dpnqq" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.446339 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxsvd\" (UniqueName: \"kubernetes.io/projected/1f78e03d-db28-4ba2-830f-189c97051d36-kube-api-access-mxsvd\") pod \"frr-k8s-hz9g9\" (UID: \"1f78e03d-db28-4ba2-830f-189c97051d36\") " pod="metallb-system/frr-k8s-hz9g9" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.461600 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcptd\" (UniqueName: \"kubernetes.io/projected/3b7fd88f-360a-4c89-8740-069bc371b65b-kube-api-access-kcptd\") pod \"frr-k8s-webhook-server-7fcb986d4-dpnqq\" (UID: \"3b7fd88f-360a-4c89-8740-069bc371b65b\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-dpnqq" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.476578 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-hz9g9" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.512303 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-dpnqq" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.531223 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2125e0a4-9809-42b4-911f-08c6d2e74879-cert\") pod \"controller-f8648f98b-v26v8\" (UID: \"2125e0a4-9809-42b4-911f-08c6d2e74879\") " pod="metallb-system/controller-f8648f98b-v26v8" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.531279 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nvhk\" (UniqueName: \"kubernetes.io/projected/2125e0a4-9809-42b4-911f-08c6d2e74879-kube-api-access-9nvhk\") pod \"controller-f8648f98b-v26v8\" (UID: \"2125e0a4-9809-42b4-911f-08c6d2e74879\") " pod="metallb-system/controller-f8648f98b-v26v8" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.531350 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/080b347a-f590-47cf-909f-578330838c1d-metallb-excludel2\") pod \"speaker-mvkbj\" (UID: \"080b347a-f590-47cf-909f-578330838c1d\") " pod="metallb-system/speaker-mvkbj" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.531377 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/080b347a-f590-47cf-909f-578330838c1d-metrics-certs\") pod \"speaker-mvkbj\" (UID: \"080b347a-f590-47cf-909f-578330838c1d\") " pod="metallb-system/speaker-mvkbj" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.531414 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2125e0a4-9809-42b4-911f-08c6d2e74879-metrics-certs\") pod \"controller-f8648f98b-v26v8\" (UID: \"2125e0a4-9809-42b4-911f-08c6d2e74879\") " pod="metallb-system/controller-f8648f98b-v26v8" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.531464 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/080b347a-f590-47cf-909f-578330838c1d-memberlist\") pod \"speaker-mvkbj\" (UID: \"080b347a-f590-47cf-909f-578330838c1d\") " pod="metallb-system/speaker-mvkbj" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.531492 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xkmz\" (UniqueName: \"kubernetes.io/projected/080b347a-f590-47cf-909f-578330838c1d-kube-api-access-8xkmz\") pod \"speaker-mvkbj\" (UID: \"080b347a-f590-47cf-909f-578330838c1d\") " pod="metallb-system/speaker-mvkbj" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.532705 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/080b347a-f590-47cf-909f-578330838c1d-metallb-excludel2\") pod \"speaker-mvkbj\" (UID: \"080b347a-f590-47cf-909f-578330838c1d\") " pod="metallb-system/speaker-mvkbj" Dec 05 20:27:36 crc kubenswrapper[4904]: E1205 20:27:36.532795 4904 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 05 20:27:36 crc kubenswrapper[4904]: E1205 20:27:36.532845 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/080b347a-f590-47cf-909f-578330838c1d-memberlist podName:080b347a-f590-47cf-909f-578330838c1d nodeName:}" failed. 
No retries permitted until 2025-12-05 20:27:37.032829711 +0000 UTC m=+955.844045830 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/080b347a-f590-47cf-909f-578330838c1d-memberlist") pod "speaker-mvkbj" (UID: "080b347a-f590-47cf-909f-578330838c1d") : secret "metallb-memberlist" not found Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.541609 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2125e0a4-9809-42b4-911f-08c6d2e74879-cert\") pod \"controller-f8648f98b-v26v8\" (UID: \"2125e0a4-9809-42b4-911f-08c6d2e74879\") " pod="metallb-system/controller-f8648f98b-v26v8" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.542191 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2125e0a4-9809-42b4-911f-08c6d2e74879-metrics-certs\") pod \"controller-f8648f98b-v26v8\" (UID: \"2125e0a4-9809-42b4-911f-08c6d2e74879\") " pod="metallb-system/controller-f8648f98b-v26v8" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.542593 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/080b347a-f590-47cf-909f-578330838c1d-metrics-certs\") pod \"speaker-mvkbj\" (UID: \"080b347a-f590-47cf-909f-578330838c1d\") " pod="metallb-system/speaker-mvkbj" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.558783 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nvhk\" (UniqueName: \"kubernetes.io/projected/2125e0a4-9809-42b4-911f-08c6d2e74879-kube-api-access-9nvhk\") pod \"controller-f8648f98b-v26v8\" (UID: \"2125e0a4-9809-42b4-911f-08c6d2e74879\") " pod="metallb-system/controller-f8648f98b-v26v8" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.559099 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-v26v8" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.561684 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xkmz\" (UniqueName: \"kubernetes.io/projected/080b347a-f590-47cf-909f-578330838c1d-kube-api-access-8xkmz\") pod \"speaker-mvkbj\" (UID: \"080b347a-f590-47cf-909f-578330838c1d\") " pod="metallb-system/speaker-mvkbj" Dec 05 20:27:36 crc kubenswrapper[4904]: I1205 20:27:36.869919 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-v26v8"] Dec 05 20:27:36 crc kubenswrapper[4904]: W1205 20:27:36.873476 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2125e0a4_9809_42b4_911f_08c6d2e74879.slice/crio-31d108e7e5d9d4f0493f09a5638e14dc7e817fc0fae5b317d4f288f5f853f4c1 WatchSource:0}: Error finding container 31d108e7e5d9d4f0493f09a5638e14dc7e817fc0fae5b317d4f288f5f853f4c1: Status 404 returned error can't find the container with id 31d108e7e5d9d4f0493f09a5638e14dc7e817fc0fae5b317d4f288f5f853f4c1 Dec 05 20:27:37 crc kubenswrapper[4904]: I1205 20:27:37.037699 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-dpnqq"] Dec 05 20:27:37 crc kubenswrapper[4904]: I1205 20:27:37.039940 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/080b347a-f590-47cf-909f-578330838c1d-memberlist\") pod \"speaker-mvkbj\" (UID: \"080b347a-f590-47cf-909f-578330838c1d\") " pod="metallb-system/speaker-mvkbj" Dec 05 20:27:37 crc kubenswrapper[4904]: E1205 20:27:37.040089 4904 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 05 20:27:37 crc kubenswrapper[4904]: E1205 20:27:37.040144 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/080b347a-f590-47cf-909f-578330838c1d-memberlist podName:080b347a-f590-47cf-909f-578330838c1d nodeName:}" failed. No retries permitted until 2025-12-05 20:27:38.040128979 +0000 UTC m=+956.851345098 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/080b347a-f590-47cf-909f-578330838c1d-memberlist") pod "speaker-mvkbj" (UID: "080b347a-f590-47cf-909f-578330838c1d") : secret "metallb-memberlist" not found Dec 05 20:27:37 crc kubenswrapper[4904]: W1205 20:27:37.051177 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b7fd88f_360a_4c89_8740_069bc371b65b.slice/crio-5a84723382ab2ac17296f8c48fb977416011071958f45f54f8edc6dea1d1a55b WatchSource:0}: Error finding container 5a84723382ab2ac17296f8c48fb977416011071958f45f54f8edc6dea1d1a55b: Status 404 returned error can't find the container with id 5a84723382ab2ac17296f8c48fb977416011071958f45f54f8edc6dea1d1a55b Dec 05 20:27:37 crc kubenswrapper[4904]: I1205 20:27:37.135458 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-dpnqq" event={"ID":"3b7fd88f-360a-4c89-8740-069bc371b65b","Type":"ContainerStarted","Data":"5a84723382ab2ac17296f8c48fb977416011071958f45f54f8edc6dea1d1a55b"} Dec 05 20:27:37 crc kubenswrapper[4904]: I1205 20:27:37.136680 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hz9g9" event={"ID":"1f78e03d-db28-4ba2-830f-189c97051d36","Type":"ContainerStarted","Data":"ac4c803e04af44e35da621da62612b37b2291412a9799c6f2f3f3eb1106593bc"} Dec 05 20:27:37 crc kubenswrapper[4904]: I1205 20:27:37.138428 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-v26v8" event={"ID":"2125e0a4-9809-42b4-911f-08c6d2e74879","Type":"ContainerStarted","Data":"8ab33295ab34e7e5f14059bdd0088b7a8f1efb53b868f22fd9d644eb21d09f35"} Dec 05 20:27:37 crc kubenswrapper[4904]: I1205 20:27:37.138603 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-v26v8" event={"ID":"2125e0a4-9809-42b4-911f-08c6d2e74879","Type":"ContainerStarted","Data":"6c25b55ba7fb764870bd036a94890151caa4fe42d913030e91a527197d15a22f"} Dec 05 20:27:37 crc kubenswrapper[4904]: I1205 20:27:37.138696 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-v26v8" event={"ID":"2125e0a4-9809-42b4-911f-08c6d2e74879","Type":"ContainerStarted","Data":"31d108e7e5d9d4f0493f09a5638e14dc7e817fc0fae5b317d4f288f5f853f4c1"} Dec 05 20:27:37 crc kubenswrapper[4904]: I1205 20:27:37.138791 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-v26v8" Dec 05 20:27:37 crc kubenswrapper[4904]: I1205 20:27:37.154512 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-v26v8" podStartSLOduration=1.154491306 podStartE2EDuration="1.154491306s" podCreationTimestamp="2025-12-05 20:27:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:27:37.152322217 +0000 UTC m=+955.963538336" watchObservedRunningTime="2025-12-05 20:27:37.154491306 +0000 UTC m=+955.965707415" Dec 05 20:27:38 crc kubenswrapper[4904]: I1205 20:27:38.052401 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/080b347a-f590-47cf-909f-578330838c1d-memberlist\") pod \"speaker-mvkbj\" (UID: \"080b347a-f590-47cf-909f-578330838c1d\") " pod="metallb-system/speaker-mvkbj" Dec 05 20:27:38 crc kubenswrapper[4904]: I1205 20:27:38.062430 4904 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/080b347a-f590-47cf-909f-578330838c1d-memberlist\") pod \"speaker-mvkbj\" (UID: \"080b347a-f590-47cf-909f-578330838c1d\") " pod="metallb-system/speaker-mvkbj" Dec 05 20:27:38 crc kubenswrapper[4904]: I1205 20:27:38.352617 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-mvkbj" Dec 05 20:27:39 crc kubenswrapper[4904]: I1205 20:27:39.157154 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-mvkbj" event={"ID":"080b347a-f590-47cf-909f-578330838c1d","Type":"ContainerStarted","Data":"f352eb3fe1454a3b3384ab3d20958826e85b5bd88aff483788e3e5f74939dfaa"} Dec 05 20:27:39 crc kubenswrapper[4904]: I1205 20:27:39.157466 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-mvkbj" event={"ID":"080b347a-f590-47cf-909f-578330838c1d","Type":"ContainerStarted","Data":"e3b7bd824f2caf4b5b6096101b929cbb0163f44a147c276d269de18cfc09eb3b"} Dec 05 20:27:39 crc kubenswrapper[4904]: I1205 20:27:39.157477 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-mvkbj" event={"ID":"080b347a-f590-47cf-909f-578330838c1d","Type":"ContainerStarted","Data":"892d5095f5cb5d50bfe4b6941a3ec5f5d450b33a04bcde8b049a93c62a088b4b"} Dec 05 20:27:39 crc kubenswrapper[4904]: I1205 20:27:39.157658 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-mvkbj" Dec 05 20:27:39 crc kubenswrapper[4904]: I1205 20:27:39.187624 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-mvkbj" podStartSLOduration=3.1875862440000002 podStartE2EDuration="3.187586244s" podCreationTimestamp="2025-12-05 20:27:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:27:39.184652134 +0000 UTC m=+957.995868253" watchObservedRunningTime="2025-12-05 20:27:39.187586244 +0000 UTC m=+957.998802373" Dec 05 20:27:46 crc kubenswrapper[4904]: I1205 20:27:46.314107 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-dpnqq" event={"ID":"3b7fd88f-360a-4c89-8740-069bc371b65b","Type":"ContainerStarted","Data":"bce8476ad4ace3bb3b288782434188216d2783d13aa45d95291c1fb37c62258e"} Dec 05 20:27:46 crc kubenswrapper[4904]: I1205 20:27:46.314616 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-dpnqq" Dec 05 20:27:46 crc kubenswrapper[4904]: I1205 20:27:46.316234 4904 generic.go:334] "Generic (PLEG): container finished" podID="1f78e03d-db28-4ba2-830f-189c97051d36" containerID="f6ab9c5b1a410192358e21778d32bf7d5277ff755e2232635b9d06b75e5e34f3" exitCode=0 Dec 05 20:27:46 crc kubenswrapper[4904]: I1205 20:27:46.316334 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hz9g9" event={"ID":"1f78e03d-db28-4ba2-830f-189c97051d36","Type":"ContainerDied","Data":"f6ab9c5b1a410192358e21778d32bf7d5277ff755e2232635b9d06b75e5e34f3"} Dec 05 20:27:46 crc kubenswrapper[4904]: I1205 20:27:46.335972 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-dpnqq" podStartSLOduration=1.435370202 podStartE2EDuration="10.335959792s" podCreationTimestamp="2025-12-05 20:27:36 +0000 UTC" firstStartedPulling="2025-12-05 20:27:37.053838765 +0000 
UTC m=+955.865054874" lastFinishedPulling="2025-12-05 20:27:45.954428355 +0000 UTC m=+964.765644464" observedRunningTime="2025-12-05 20:27:46.335952962 +0000 UTC m=+965.147169081" watchObservedRunningTime="2025-12-05 20:27:46.335959792 +0000 UTC m=+965.147175911" Dec 05 20:27:46 crc kubenswrapper[4904]: I1205 20:27:46.563334 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-v26v8" Dec 05 20:27:47 crc kubenswrapper[4904]: I1205 20:27:47.331447 4904 generic.go:334] "Generic (PLEG): container finished" podID="1f78e03d-db28-4ba2-830f-189c97051d36" containerID="bae5ca74e4e8e1ecae9b574dcaf8fa58a5a9fcd8ed9490b7a8a9fd5ee5bde37c" exitCode=0 Dec 05 20:27:47 crc kubenswrapper[4904]: I1205 20:27:47.331576 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hz9g9" event={"ID":"1f78e03d-db28-4ba2-830f-189c97051d36","Type":"ContainerDied","Data":"bae5ca74e4e8e1ecae9b574dcaf8fa58a5a9fcd8ed9490b7a8a9fd5ee5bde37c"} Dec 05 20:27:48 crc kubenswrapper[4904]: I1205 20:27:48.341350 4904 generic.go:334] "Generic (PLEG): container finished" podID="1f78e03d-db28-4ba2-830f-189c97051d36" containerID="cc3895683fcd4856f8641a1cad1f1c813d67134abfddab954fbcc565c1fe35df" exitCode=0 Dec 05 20:27:48 crc kubenswrapper[4904]: I1205 20:27:48.341446 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hz9g9" event={"ID":"1f78e03d-db28-4ba2-830f-189c97051d36","Type":"ContainerDied","Data":"cc3895683fcd4856f8641a1cad1f1c813d67134abfddab954fbcc565c1fe35df"} Dec 05 20:27:48 crc kubenswrapper[4904]: I1205 20:27:48.355729 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-mvkbj" Dec 05 20:27:49 crc kubenswrapper[4904]: I1205 20:27:49.351770 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hz9g9" event={"ID":"1f78e03d-db28-4ba2-830f-189c97051d36","Type":"ContainerStarted","Data":"3fca3f060cc5dd28e967f310d9c3e99e91dde7e20618cff487a3e4cbbe07c66f"} Dec 05 20:27:49 crc kubenswrapper[4904]: I1205 20:27:49.352004 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hz9g9" event={"ID":"1f78e03d-db28-4ba2-830f-189c97051d36","Type":"ContainerStarted","Data":"abd93a2149b3c9100ae357b782f65f2ee3740ad1e9c8b9f13db1918259c0cb63"} Dec 05 20:27:49 crc kubenswrapper[4904]: I1205 20:27:49.352016 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hz9g9" event={"ID":"1f78e03d-db28-4ba2-830f-189c97051d36","Type":"ContainerStarted","Data":"2a21fe90f031ea2ed7cf477dee1a4c2cf3981df2e374824ec6d66c9c5049bf8d"} Dec 05 20:27:49 crc kubenswrapper[4904]: I1205 20:27:49.352024 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hz9g9" event={"ID":"1f78e03d-db28-4ba2-830f-189c97051d36","Type":"ContainerStarted","Data":"616d51f55cbfd10501208b63b13d01e4ca93ec9dc66832b1887e649a4cbc5a31"} Dec 05 20:27:50 crc kubenswrapper[4904]: I1205 20:27:50.361177 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hz9g9" event={"ID":"1f78e03d-db28-4ba2-830f-189c97051d36","Type":"ContainerStarted","Data":"5d0bf7864917069ee9c4737edff6b37e7122784e105dbee82870fd7726565881"} Dec 05 20:27:50 crc kubenswrapper[4904]: I1205 20:27:50.361235 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hz9g9" 
event={"ID":"1f78e03d-db28-4ba2-830f-189c97051d36","Type":"ContainerStarted","Data":"f88fb84ae215384cd6ffff2ab44fdc3130fac4b909e11af5be5e61bf17433f80"} Dec 05 20:27:50 crc kubenswrapper[4904]: I1205 20:27:50.361449 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-hz9g9" Dec 05 20:27:50 crc kubenswrapper[4904]: I1205 20:27:50.387258 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-hz9g9" podStartSLOduration=5.392795805 podStartE2EDuration="14.3872414s" podCreationTimestamp="2025-12-05 20:27:36 +0000 UTC" firstStartedPulling="2025-12-05 20:27:36.93631368 +0000 UTC m=+955.747529789" lastFinishedPulling="2025-12-05 20:27:45.930759255 +0000 UTC m=+964.741975384" observedRunningTime="2025-12-05 20:27:50.38361358 +0000 UTC m=+969.194829759" watchObservedRunningTime="2025-12-05 20:27:50.3872414 +0000 UTC m=+969.198457509" Dec 05 20:27:51 crc kubenswrapper[4904]: I1205 20:27:51.477438 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-hz9g9" Dec 05 20:27:51 crc kubenswrapper[4904]: I1205 20:27:51.546205 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-hz9g9" Dec 05 20:27:51 crc kubenswrapper[4904]: I1205 20:27:51.652197 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-kc5m7"] Dec 05 20:27:51 crc kubenswrapper[4904]: I1205 20:27:51.653087 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-kc5m7" Dec 05 20:27:51 crc kubenswrapper[4904]: I1205 20:27:51.655766 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 05 20:27:51 crc kubenswrapper[4904]: I1205 20:27:51.656086 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-q2c9b" Dec 05 20:27:51 crc kubenswrapper[4904]: I1205 20:27:51.657449 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 05 20:27:51 crc kubenswrapper[4904]: I1205 20:27:51.715901 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-kc5m7"] Dec 05 20:27:51 crc kubenswrapper[4904]: I1205 20:27:51.752681 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxwng\" (UniqueName: \"kubernetes.io/projected/6b8dcb6d-00ed-4247-b4c8-5e5964bc0513-kube-api-access-gxwng\") pod \"openstack-operator-index-kc5m7\" (UID: \"6b8dcb6d-00ed-4247-b4c8-5e5964bc0513\") " pod="openstack-operators/openstack-operator-index-kc5m7" Dec 05 20:27:51 crc kubenswrapper[4904]: I1205 20:27:51.853816 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxwng\" (UniqueName: \"kubernetes.io/projected/6b8dcb6d-00ed-4247-b4c8-5e5964bc0513-kube-api-access-gxwng\") pod \"openstack-operator-index-kc5m7\" (UID: \"6b8dcb6d-00ed-4247-b4c8-5e5964bc0513\") " pod="openstack-operators/openstack-operator-index-kc5m7" Dec 05 20:27:51 crc kubenswrapper[4904]: I1205 20:27:51.871834 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxwng\" (UniqueName: \"kubernetes.io/projected/6b8dcb6d-00ed-4247-b4c8-5e5964bc0513-kube-api-access-gxwng\") pod \"openstack-operator-index-kc5m7\" (UID: 
\"6b8dcb6d-00ed-4247-b4c8-5e5964bc0513\") " pod="openstack-operators/openstack-operator-index-kc5m7" Dec 05 20:27:51 crc kubenswrapper[4904]: I1205 20:27:51.967401 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-kc5m7" Dec 05 20:27:52 crc kubenswrapper[4904]: I1205 20:27:52.437201 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-kc5m7"] Dec 05 20:27:52 crc kubenswrapper[4904]: W1205 20:27:52.441846 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b8dcb6d_00ed_4247_b4c8_5e5964bc0513.slice/crio-1f8cc40a7a58ca28c700116cd11d0d10f65a5a721a61f3fda174aa72eb26d6e5 WatchSource:0}: Error finding container 1f8cc40a7a58ca28c700116cd11d0d10f65a5a721a61f3fda174aa72eb26d6e5: Status 404 returned error can't find the container with id 1f8cc40a7a58ca28c700116cd11d0d10f65a5a721a61f3fda174aa72eb26d6e5 Dec 05 20:27:53 crc kubenswrapper[4904]: I1205 20:27:53.381421 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-kc5m7" event={"ID":"6b8dcb6d-00ed-4247-b4c8-5e5964bc0513","Type":"ContainerStarted","Data":"1f8cc40a7a58ca28c700116cd11d0d10f65a5a721a61f3fda174aa72eb26d6e5"} Dec 05 20:27:56 crc kubenswrapper[4904]: I1205 20:27:56.401379 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-kc5m7" event={"ID":"6b8dcb6d-00ed-4247-b4c8-5e5964bc0513","Type":"ContainerStarted","Data":"f486ff8d7976cf37b40c4f02232af4971c43b087a1d944131349025691d90459"} Dec 05 20:27:56 crc kubenswrapper[4904]: I1205 20:27:56.418193 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-kc5m7" podStartSLOduration=2.581178662 podStartE2EDuration="5.418167186s" podCreationTimestamp="2025-12-05 20:27:51 +0000 UTC" firstStartedPulling="2025-12-05 20:27:52.44376023 +0000 UTC m=+971.254976339" lastFinishedPulling="2025-12-05 20:27:55.280748734 +0000 UTC m=+974.091964863" observedRunningTime="2025-12-05 20:27:56.414419392 +0000 UTC m=+975.225635541" watchObservedRunningTime="2025-12-05 20:27:56.418167186 +0000 UTC m=+975.229383325" Dec 05 20:27:56 crc kubenswrapper[4904]: I1205 20:27:56.518166 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-dpnqq" Dec 05 20:28:01 crc kubenswrapper[4904]: I1205 20:28:01.967999 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-kc5m7" Dec 05 20:28:01 crc kubenswrapper[4904]: I1205 20:28:01.968315 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-kc5m7" Dec 05 20:28:02 crc kubenswrapper[4904]: I1205 20:28:02.007353 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-kc5m7" Dec 05 20:28:02 crc kubenswrapper[4904]: I1205 20:28:02.496993 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-kc5m7" Dec 05 20:28:06 crc kubenswrapper[4904]: I1205 20:28:06.479542 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-hz9g9" Dec 05 20:28:08 crc kubenswrapper[4904]: I1205 20:28:08.056635 4904 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a6678k27"] Dec 05 20:28:08 crc kubenswrapper[4904]: I1205 20:28:08.058107 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a6678k27" Dec 05 20:28:08 crc kubenswrapper[4904]: I1205 20:28:08.060027 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-pwl65" Dec 05 20:28:08 crc kubenswrapper[4904]: I1205 20:28:08.075893 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a6678k27"] Dec 05 20:28:08 crc kubenswrapper[4904]: I1205 20:28:08.076413 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97pp2\" (UniqueName: \"kubernetes.io/projected/6cab1a63-a096-409d-9ca0-3308b4d6b434-kube-api-access-97pp2\") pod \"be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a6678k27\" (UID: \"6cab1a63-a096-409d-9ca0-3308b4d6b434\") " pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a6678k27" Dec 05 20:28:08 crc kubenswrapper[4904]: I1205 20:28:08.076512 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6cab1a63-a096-409d-9ca0-3308b4d6b434-bundle\") pod \"be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a6678k27\" (UID: \"6cab1a63-a096-409d-9ca0-3308b4d6b434\") " pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a6678k27" Dec 05 20:28:08 crc kubenswrapper[4904]: I1205 20:28:08.076586 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6cab1a63-a096-409d-9ca0-3308b4d6b434-util\") pod \"be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a6678k27\" (UID: \"6cab1a63-a096-409d-9ca0-3308b4d6b434\") " pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a6678k27" Dec 05 20:28:08 crc kubenswrapper[4904]: I1205 20:28:08.177170 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97pp2\" (UniqueName: \"kubernetes.io/projected/6cab1a63-a096-409d-9ca0-3308b4d6b434-kube-api-access-97pp2\") pod \"be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a6678k27\" (UID: \"6cab1a63-a096-409d-9ca0-3308b4d6b434\") " pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a6678k27" Dec 05 20:28:08 crc kubenswrapper[4904]: I1205 20:28:08.177232 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6cab1a63-a096-409d-9ca0-3308b4d6b434-bundle\") pod \"be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a6678k27\" (UID: \"6cab1a63-a096-409d-9ca0-3308b4d6b434\") " pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a6678k27" Dec 05 20:28:08 crc kubenswrapper[4904]: I1205 20:28:08.177270 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6cab1a63-a096-409d-9ca0-3308b4d6b434-util\") pod \"be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a6678k27\" (UID: \"6cab1a63-a096-409d-9ca0-3308b4d6b434\") " pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a6678k27" Dec 05 20:28:08 
crc kubenswrapper[4904]: I1205 20:28:08.177751 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6cab1a63-a096-409d-9ca0-3308b4d6b434-util\") pod \"be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a6678k27\" (UID: \"6cab1a63-a096-409d-9ca0-3308b4d6b434\") " pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a6678k27" Dec 05 20:28:08 crc kubenswrapper[4904]: I1205 20:28:08.177905 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6cab1a63-a096-409d-9ca0-3308b4d6b434-bundle\") pod \"be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a6678k27\" (UID: \"6cab1a63-a096-409d-9ca0-3308b4d6b434\") " pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a6678k27" Dec 05 20:28:08 crc kubenswrapper[4904]: I1205 20:28:08.195030 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97pp2\" (UniqueName: \"kubernetes.io/projected/6cab1a63-a096-409d-9ca0-3308b4d6b434-kube-api-access-97pp2\") pod \"be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a6678k27\" (UID: \"6cab1a63-a096-409d-9ca0-3308b4d6b434\") " pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a6678k27" Dec 05 20:28:08 crc kubenswrapper[4904]: I1205 20:28:08.422910 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a6678k27" Dec 05 20:28:08 crc kubenswrapper[4904]: I1205 20:28:08.709841 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a6678k27"] Dec 05 20:28:08 crc kubenswrapper[4904]: W1205 20:28:08.717793 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cab1a63_a096_409d_9ca0_3308b4d6b434.slice/crio-064643383fb210493813320d785851d1a654e881f10f770d135c0fb6e0ba2f54 WatchSource:0}: Error finding container 064643383fb210493813320d785851d1a654e881f10f770d135c0fb6e0ba2f54: Status 404 returned error can't find the container with id 064643383fb210493813320d785851d1a654e881f10f770d135c0fb6e0ba2f54 Dec 05 20:28:09 crc kubenswrapper[4904]: I1205 20:28:09.515342 4904 generic.go:334] "Generic (PLEG): container finished" podID="6cab1a63-a096-409d-9ca0-3308b4d6b434" containerID="fade0c3e0c50fd0fb54f25ca6ec1a4b2f81dd457b7f7e4b32fc51e0dfded55e7" exitCode=0 Dec 05 20:28:09 crc kubenswrapper[4904]: I1205 20:28:09.515412 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a6678k27" event={"ID":"6cab1a63-a096-409d-9ca0-3308b4d6b434","Type":"ContainerDied","Data":"fade0c3e0c50fd0fb54f25ca6ec1a4b2f81dd457b7f7e4b32fc51e0dfded55e7"} Dec 05 20:28:09 crc kubenswrapper[4904]: I1205 20:28:09.515630 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a6678k27" event={"ID":"6cab1a63-a096-409d-9ca0-3308b4d6b434","Type":"ContainerStarted","Data":"064643383fb210493813320d785851d1a654e881f10f770d135c0fb6e0ba2f54"} Dec 05 20:28:10 crc kubenswrapper[4904]: I1205 20:28:10.523408 4904 generic.go:334] "Generic (PLEG): container finished" podID="6cab1a63-a096-409d-9ca0-3308b4d6b434" containerID="cba62377aff7b72ef5f2253c95e5ef70e9c063f0017f42b352fddf86de1f93bd" exitCode=0 Dec 
05 20:28:10 crc kubenswrapper[4904]: I1205 20:28:10.523457 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a6678k27" event={"ID":"6cab1a63-a096-409d-9ca0-3308b4d6b434","Type":"ContainerDied","Data":"cba62377aff7b72ef5f2253c95e5ef70e9c063f0017f42b352fddf86de1f93bd"} Dec 05 20:28:11 crc kubenswrapper[4904]: I1205 20:28:11.530914 4904 generic.go:334] "Generic (PLEG): container finished" podID="6cab1a63-a096-409d-9ca0-3308b4d6b434" containerID="d346c40fef3ff1db0b20cbc6958be444c78cab9376528ed18a303384ee300051" exitCode=0 Dec 05 20:28:11 crc kubenswrapper[4904]: I1205 20:28:11.530950 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a6678k27" event={"ID":"6cab1a63-a096-409d-9ca0-3308b4d6b434","Type":"ContainerDied","Data":"d346c40fef3ff1db0b20cbc6958be444c78cab9376528ed18a303384ee300051"} Dec 05 20:28:12 crc kubenswrapper[4904]: I1205 20:28:12.866589 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a6678k27" Dec 05 20:28:13 crc kubenswrapper[4904]: I1205 20:28:13.045547 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6cab1a63-a096-409d-9ca0-3308b4d6b434-bundle\") pod \"6cab1a63-a096-409d-9ca0-3308b4d6b434\" (UID: \"6cab1a63-a096-409d-9ca0-3308b4d6b434\") " Dec 05 20:28:13 crc kubenswrapper[4904]: I1205 20:28:13.045606 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6cab1a63-a096-409d-9ca0-3308b4d6b434-util\") pod \"6cab1a63-a096-409d-9ca0-3308b4d6b434\" (UID: \"6cab1a63-a096-409d-9ca0-3308b4d6b434\") " Dec 05 20:28:13 crc kubenswrapper[4904]: I1205 20:28:13.045639 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97pp2\" (UniqueName: \"kubernetes.io/projected/6cab1a63-a096-409d-9ca0-3308b4d6b434-kube-api-access-97pp2\") pod \"6cab1a63-a096-409d-9ca0-3308b4d6b434\" (UID: \"6cab1a63-a096-409d-9ca0-3308b4d6b434\") " Dec 05 20:28:13 crc kubenswrapper[4904]: I1205 20:28:13.046783 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cab1a63-a096-409d-9ca0-3308b4d6b434-bundle" (OuterVolumeSpecName: "bundle") pod "6cab1a63-a096-409d-9ca0-3308b4d6b434" (UID: "6cab1a63-a096-409d-9ca0-3308b4d6b434"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:28:13 crc kubenswrapper[4904]: I1205 20:28:13.050633 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cab1a63-a096-409d-9ca0-3308b4d6b434-kube-api-access-97pp2" (OuterVolumeSpecName: "kube-api-access-97pp2") pod "6cab1a63-a096-409d-9ca0-3308b4d6b434" (UID: "6cab1a63-a096-409d-9ca0-3308b4d6b434"). InnerVolumeSpecName "kube-api-access-97pp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:28:13 crc kubenswrapper[4904]: I1205 20:28:13.067445 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cab1a63-a096-409d-9ca0-3308b4d6b434-util" (OuterVolumeSpecName: "util") pod "6cab1a63-a096-409d-9ca0-3308b4d6b434" (UID: "6cab1a63-a096-409d-9ca0-3308b4d6b434"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:28:13 crc kubenswrapper[4904]: I1205 20:28:13.146596 4904 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6cab1a63-a096-409d-9ca0-3308b4d6b434-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:28:13 crc kubenswrapper[4904]: I1205 20:28:13.146637 4904 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6cab1a63-a096-409d-9ca0-3308b4d6b434-util\") on node \"crc\" DevicePath \"\"" Dec 05 20:28:13 crc kubenswrapper[4904]: I1205 20:28:13.146650 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97pp2\" (UniqueName: \"kubernetes.io/projected/6cab1a63-a096-409d-9ca0-3308b4d6b434-kube-api-access-97pp2\") on node \"crc\" DevicePath \"\"" Dec 05 20:28:13 crc kubenswrapper[4904]: I1205 20:28:13.547813 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a6678k27" event={"ID":"6cab1a63-a096-409d-9ca0-3308b4d6b434","Type":"ContainerDied","Data":"064643383fb210493813320d785851d1a654e881f10f770d135c0fb6e0ba2f54"} Dec 05 20:28:13 crc kubenswrapper[4904]: I1205 20:28:13.548184 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="064643383fb210493813320d785851d1a654e881f10f770d135c0fb6e0ba2f54" Dec 05 20:28:13 crc kubenswrapper[4904]: I1205 20:28:13.547917 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a6678k27" Dec 05 20:28:20 crc kubenswrapper[4904]: I1205 20:28:20.178651 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-56699b584c-242nn"] Dec 05 20:28:20 crc kubenswrapper[4904]: E1205 20:28:20.179663 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cab1a63-a096-409d-9ca0-3308b4d6b434" containerName="pull" Dec 05 20:28:20 crc kubenswrapper[4904]: I1205 20:28:20.179697 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cab1a63-a096-409d-9ca0-3308b4d6b434" containerName="pull" Dec 05 20:28:20 crc kubenswrapper[4904]: E1205 20:28:20.179741 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cab1a63-a096-409d-9ca0-3308b4d6b434" containerName="util" Dec 05 20:28:20 crc kubenswrapper[4904]: I1205 20:28:20.179756 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cab1a63-a096-409d-9ca0-3308b4d6b434" containerName="util" Dec 05 20:28:20 crc kubenswrapper[4904]: E1205 20:28:20.179785 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cab1a63-a096-409d-9ca0-3308b4d6b434" containerName="extract" Dec 05 20:28:20 crc kubenswrapper[4904]: I1205 20:28:20.179802 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cab1a63-a096-409d-9ca0-3308b4d6b434" containerName="extract" Dec 05 20:28:20 crc kubenswrapper[4904]: I1205 20:28:20.180050 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cab1a63-a096-409d-9ca0-3308b4d6b434" containerName="extract" Dec 05 20:28:20 crc kubenswrapper[4904]: I1205 20:28:20.180707 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-56699b584c-242nn" Dec 05 20:28:20 crc kubenswrapper[4904]: I1205 20:28:20.183447 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-xqfst" Dec 05 20:28:20 crc kubenswrapper[4904]: I1205 20:28:20.200475 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-56699b584c-242nn"] Dec 05 20:28:20 crc kubenswrapper[4904]: I1205 20:28:20.338895 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5nl5\" (UniqueName: \"kubernetes.io/projected/3c454cc5-18c2-420a-ac01-657bedda4fa7-kube-api-access-g5nl5\") pod \"openstack-operator-controller-operator-56699b584c-242nn\" (UID: \"3c454cc5-18c2-420a-ac01-657bedda4fa7\") " pod="openstack-operators/openstack-operator-controller-operator-56699b584c-242nn" Dec 05 20:28:20 crc kubenswrapper[4904]: I1205 20:28:20.439806 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5nl5\" (UniqueName: \"kubernetes.io/projected/3c454cc5-18c2-420a-ac01-657bedda4fa7-kube-api-access-g5nl5\") pod \"openstack-operator-controller-operator-56699b584c-242nn\" (UID: \"3c454cc5-18c2-420a-ac01-657bedda4fa7\") " pod="openstack-operators/openstack-operator-controller-operator-56699b584c-242nn" Dec 05 20:28:20 crc kubenswrapper[4904]: I1205 20:28:20.464389 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5nl5\" (UniqueName: \"kubernetes.io/projected/3c454cc5-18c2-420a-ac01-657bedda4fa7-kube-api-access-g5nl5\") pod \"openstack-operator-controller-operator-56699b584c-242nn\" (UID: \"3c454cc5-18c2-420a-ac01-657bedda4fa7\") " pod="openstack-operators/openstack-operator-controller-operator-56699b584c-242nn" Dec 05 20:28:20 crc kubenswrapper[4904]: I1205 20:28:20.499210 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-56699b584c-242nn" Dec 05 20:28:20 crc kubenswrapper[4904]: I1205 20:28:20.721668 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-56699b584c-242nn"] Dec 05 20:28:21 crc kubenswrapper[4904]: I1205 20:28:21.618090 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-56699b584c-242nn" event={"ID":"3c454cc5-18c2-420a-ac01-657bedda4fa7","Type":"ContainerStarted","Data":"08cfa1f558ed869cac0814bedf2770191bf4bdc3ba925322b855787c6afa6569"} Dec 05 20:28:25 crc kubenswrapper[4904]: I1205 20:28:25.649348 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-56699b584c-242nn" event={"ID":"3c454cc5-18c2-420a-ac01-657bedda4fa7","Type":"ContainerStarted","Data":"32128233694e32888b4a95998e8d2079db535f6d6bfc50e113934c634afa506b"} Dec 05 20:28:25 crc kubenswrapper[4904]: I1205 20:28:25.649917 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-56699b584c-242nn" Dec 05 20:28:29 crc kubenswrapper[4904]: I1205 20:28:29.955586 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:28:29 crc kubenswrapper[4904]: I1205 20:28:29.956256 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:28:30 crc kubenswrapper[4904]: I1205 20:28:30.502828 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-56699b584c-242nn" Dec 05 20:28:30 crc kubenswrapper[4904]: I1205 20:28:30.532375 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-56699b584c-242nn" podStartSLOduration=6.329786269 podStartE2EDuration="10.532352901s" podCreationTimestamp="2025-12-05 20:28:20 +0000 UTC" firstStartedPulling="2025-12-05 20:28:20.728135223 +0000 UTC m=+999.539351332" lastFinishedPulling="2025-12-05 20:28:24.930701855 +0000 UTC m=+1003.741917964" observedRunningTime="2025-12-05 20:28:25.684123184 +0000 UTC m=+1004.495339313" watchObservedRunningTime="2025-12-05 20:28:30.532352901 +0000 UTC m=+1009.343569020" Dec 05 20:28:51 crc kubenswrapper[4904]: I1205 20:28:51.812919 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-jrrnh"] Dec 05 20:28:51 crc kubenswrapper[4904]: I1205 20:28:51.814630 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-jrrnh" Dec 05 20:28:51 crc kubenswrapper[4904]: I1205 20:28:51.822672 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-8lntd" Dec 05 20:28:51 crc kubenswrapper[4904]: I1205 20:28:51.829323 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-xd4c5"] Dec 05 20:28:51 crc kubenswrapper[4904]: I1205 20:28:51.830907 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-xd4c5" Dec 05 20:28:51 crc kubenswrapper[4904]: I1205 20:28:51.832644 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-gbn97" Dec 05 20:28:51 crc kubenswrapper[4904]: I1205 20:28:51.836689 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-jrrnh"] Dec 05 20:28:51 crc kubenswrapper[4904]: I1205 20:28:51.879904 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-z4chx"] Dec 05 20:28:51 crc kubenswrapper[4904]: I1205 20:28:51.881991 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-z4chx" Dec 05 20:28:51 crc kubenswrapper[4904]: I1205 20:28:51.889359 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-rf8nq" Dec 05 20:28:51 crc kubenswrapper[4904]: I1205 20:28:51.889577 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-7f864"] Dec 05 20:28:51 crc kubenswrapper[4904]: I1205 20:28:51.890831 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-7f864" Dec 05 20:28:51 crc kubenswrapper[4904]: I1205 20:28:51.896871 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-z4chx"] Dec 05 20:28:51 crc kubenswrapper[4904]: I1205 20:28:51.907158 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-7xn8h" Dec 05 20:28:51 crc kubenswrapper[4904]: I1205 20:28:51.922433 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-lc48g"] Dec 05 20:28:51 crc kubenswrapper[4904]: I1205 20:28:51.923410 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-lc48g" Dec 05 20:28:51 crc kubenswrapper[4904]: I1205 20:28:51.929130 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-lvn6q"] Dec 05 20:28:51 crc kubenswrapper[4904]: I1205 20:28:51.930463 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-lvn6q" Dec 05 20:28:51 crc kubenswrapper[4904]: I1205 20:28:51.931920 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-qwxll" Dec 05 20:28:51 crc kubenswrapper[4904]: I1205 20:28:51.935134 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-xd4c5"] Dec 05 20:28:51 crc kubenswrapper[4904]: I1205 20:28:51.938403 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-454n2" Dec 05 20:28:51 crc kubenswrapper[4904]: I1205 20:28:51.945341 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-7f864"] Dec 05 20:28:51 crc kubenswrapper[4904]: I1205 20:28:51.971665 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-lc48g"] Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.004689 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-lvn6q"] Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.008539 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zp49\" (UniqueName: \"kubernetes.io/projected/8ee93c89-5d32-4114-b134-c084359d11ec-kube-api-access-8zp49\") pod \"designate-operator-controller-manager-78b4bc895b-7f864\" (UID: \"8ee93c89-5d32-4114-b134-c084359d11ec\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-7f864" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.008643 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7657\" (UniqueName: \"kubernetes.io/projected/16525ebe-6de2-4974-a4ee-ed99a0e4ea1f-kube-api-access-v7657\") pod \"barbican-operator-controller-manager-7d9dfd778-jrrnh\" (UID: \"16525ebe-6de2-4974-a4ee-ed99a0e4ea1f\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-jrrnh" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.008671 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt44w\" (UniqueName: \"kubernetes.io/projected/799e9803-03c5-4406-8b50-e59dedc0918d-kube-api-access-bt44w\") pod \"glance-operator-controller-manager-77987cd8cd-z4chx\" (UID: \"799e9803-03c5-4406-8b50-e59dedc0918d\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-z4chx" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.008697 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9mkh\" (UniqueName: \"kubernetes.io/projected/0fefb560-28a2-4316-9448-8361111d4837-kube-api-access-z9mkh\") pod \"cinder-operator-controller-manager-859b6ccc6-xd4c5\" (UID: \"0fefb560-28a2-4316-9448-8361111d4837\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-xd4c5" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.018140 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-4zb86"] Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.019368 4904 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-9zhph"] Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.020314 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-9zhph" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.020807 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-4zb86" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.025589 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-g89j7" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.025803 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.025955 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-pp89j" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.040080 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-4zb86"] Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.074136 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-h9wsr"] Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.090637 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-h9wsr" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.093650 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-bd958" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.110516 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82h4m\" (UniqueName: \"kubernetes.io/projected/17322e61-a1f2-4228-8784-ea6869288aaa-kube-api-access-82h4m\") pod \"heat-operator-controller-manager-5f64f6f8bb-lvn6q\" (UID: \"17322e61-a1f2-4228-8784-ea6869288aaa\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-lvn6q" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.110578 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7cjx\" (UniqueName: \"kubernetes.io/projected/7ae1ca6f-894c-4c12-ab0a-459e97fa442e-kube-api-access-w7cjx\") pod \"horizon-operator-controller-manager-68c6d99b8f-lc48g\" (UID: \"7ae1ca6f-894c-4c12-ab0a-459e97fa442e\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-lc48g" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.110630 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7657\" (UniqueName: \"kubernetes.io/projected/16525ebe-6de2-4974-a4ee-ed99a0e4ea1f-kube-api-access-v7657\") pod \"barbican-operator-controller-manager-7d9dfd778-jrrnh\" (UID: \"16525ebe-6de2-4974-a4ee-ed99a0e4ea1f\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-jrrnh" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.110664 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt44w\" (UniqueName: 
\"kubernetes.io/projected/799e9803-03c5-4406-8b50-e59dedc0918d-kube-api-access-bt44w\") pod \"glance-operator-controller-manager-77987cd8cd-z4chx\" (UID: \"799e9803-03c5-4406-8b50-e59dedc0918d\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-z4chx" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.110737 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9mkh\" (UniqueName: \"kubernetes.io/projected/0fefb560-28a2-4316-9448-8361111d4837-kube-api-access-z9mkh\") pod \"cinder-operator-controller-manager-859b6ccc6-xd4c5\" (UID: \"0fefb560-28a2-4316-9448-8361111d4837\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-xd4c5" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.110775 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zp49\" (UniqueName: \"kubernetes.io/projected/8ee93c89-5d32-4114-b134-c084359d11ec-kube-api-access-8zp49\") pod \"designate-operator-controller-manager-78b4bc895b-7f864\" (UID: \"8ee93c89-5d32-4114-b134-c084359d11ec\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-7f864" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.117202 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-9zhph"] Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.133747 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-h9wsr"] Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.139633 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7657\" (UniqueName: \"kubernetes.io/projected/16525ebe-6de2-4974-a4ee-ed99a0e4ea1f-kube-api-access-v7657\") pod \"barbican-operator-controller-manager-7d9dfd778-jrrnh\" (UID: \"16525ebe-6de2-4974-a4ee-ed99a0e4ea1f\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-jrrnh" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.145778 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zp49\" (UniqueName: \"kubernetes.io/projected/8ee93c89-5d32-4114-b134-c084359d11ec-kube-api-access-8zp49\") pod \"designate-operator-controller-manager-78b4bc895b-7f864\" (UID: \"8ee93c89-5d32-4114-b134-c084359d11ec\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-7f864" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.165717 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-xt4ph"] Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.166821 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-xt4ph" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.169715 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-h9tzm" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.174682 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt44w\" (UniqueName: \"kubernetes.io/projected/799e9803-03c5-4406-8b50-e59dedc0918d-kube-api-access-bt44w\") pod \"glance-operator-controller-manager-77987cd8cd-z4chx\" (UID: \"799e9803-03c5-4406-8b50-e59dedc0918d\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-z4chx" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.174787 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9mkh\" (UniqueName: \"kubernetes.io/projected/0fefb560-28a2-4316-9448-8361111d4837-kube-api-access-z9mkh\") pod \"cinder-operator-controller-manager-859b6ccc6-xd4c5\" (UID: \"0fefb560-28a2-4316-9448-8361111d4837\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-xd4c5" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.199698 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-wq6hj"] Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.201258 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-wq6hj" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.204330 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-z4chx" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.208781 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-jjwwl" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.213171 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82h4m\" (UniqueName: \"kubernetes.io/projected/17322e61-a1f2-4228-8784-ea6869288aaa-kube-api-access-82h4m\") pod \"heat-operator-controller-manager-5f64f6f8bb-lvn6q\" (UID: \"17322e61-a1f2-4228-8784-ea6869288aaa\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-lvn6q" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.213208 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7cjx\" (UniqueName: \"kubernetes.io/projected/7ae1ca6f-894c-4c12-ab0a-459e97fa442e-kube-api-access-w7cjx\") pod \"horizon-operator-controller-manager-68c6d99b8f-lc48g\" (UID: \"7ae1ca6f-894c-4c12-ab0a-459e97fa442e\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-lc48g" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.213234 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9z6q\" (UniqueName: \"kubernetes.io/projected/4453c00a-291d-4edb-ab56-9b0fdf3b1ea5-kube-api-access-q9z6q\") pod \"infra-operator-controller-manager-57548d458d-4zb86\" (UID: \"4453c00a-291d-4edb-ab56-9b0fdf3b1ea5\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-4zb86" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.213254 4904 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcv9s\" (UniqueName: \"kubernetes.io/projected/60eb443e-807c-41f8-8935-7bfacc9dc89b-kube-api-access-gcv9s\") pod \"ironic-operator-controller-manager-6c548fd776-9zhph\" (UID: \"60eb443e-807c-41f8-8935-7bfacc9dc89b\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-9zhph" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.213273 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc5sf\" (UniqueName: \"kubernetes.io/projected/a2b7747b-01b7-4f12-8748-2661f53078f0-kube-api-access-vc5sf\") pod \"keystone-operator-controller-manager-7765d96ddf-h9wsr\" (UID: \"a2b7747b-01b7-4f12-8748-2661f53078f0\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-h9wsr" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.213333 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4453c00a-291d-4edb-ab56-9b0fdf3b1ea5-cert\") pod \"infra-operator-controller-manager-57548d458d-4zb86\" (UID: \"4453c00a-291d-4edb-ab56-9b0fdf3b1ea5\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-4zb86" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.218652 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-7f864" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.233162 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-xt4ph"] Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.235443 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7cjx\" (UniqueName: \"kubernetes.io/projected/7ae1ca6f-894c-4c12-ab0a-459e97fa442e-kube-api-access-w7cjx\") pod \"horizon-operator-controller-manager-68c6d99b8f-lc48g\" (UID: \"7ae1ca6f-894c-4c12-ab0a-459e97fa442e\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-lc48g" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.236442 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82h4m\" (UniqueName: \"kubernetes.io/projected/17322e61-a1f2-4228-8784-ea6869288aaa-kube-api-access-82h4m\") pod \"heat-operator-controller-manager-5f64f6f8bb-lvn6q\" (UID: \"17322e61-a1f2-4228-8784-ea6869288aaa\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-lvn6q" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.239514 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-bzms8"] Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.240696 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-bzms8" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.241553 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-lc48g" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.243339 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-4cg2h" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.245856 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-wq6hj"] Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.251046 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-lvn6q" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.259116 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-xv5gb"] Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.260641 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-xv5gb" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.263792 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-tp8g9" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.291119 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-l65z8"] Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.292695 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-l65z8" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.294333 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-bzms8"] Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.297405 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-qx67h" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.310852 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-xv5gb"] Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.315459 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9z6q\" (UniqueName: \"kubernetes.io/projected/4453c00a-291d-4edb-ab56-9b0fdf3b1ea5-kube-api-access-q9z6q\") pod \"infra-operator-controller-manager-57548d458d-4zb86\" (UID: \"4453c00a-291d-4edb-ab56-9b0fdf3b1ea5\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-4zb86" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.315539 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc5sf\" (UniqueName: \"kubernetes.io/projected/a2b7747b-01b7-4f12-8748-2661f53078f0-kube-api-access-vc5sf\") pod \"keystone-operator-controller-manager-7765d96ddf-h9wsr\" (UID: \"a2b7747b-01b7-4f12-8748-2661f53078f0\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-h9wsr" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.315567 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcv9s\" (UniqueName: \"kubernetes.io/projected/60eb443e-807c-41f8-8935-7bfacc9dc89b-kube-api-access-gcv9s\") pod 
\"ironic-operator-controller-manager-6c548fd776-9zhph\" (UID: \"60eb443e-807c-41f8-8935-7bfacc9dc89b\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-9zhph" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.315647 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4453c00a-291d-4edb-ab56-9b0fdf3b1ea5-cert\") pod \"infra-operator-controller-manager-57548d458d-4zb86\" (UID: \"4453c00a-291d-4edb-ab56-9b0fdf3b1ea5\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-4zb86" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.315701 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6mls\" (UniqueName: \"kubernetes.io/projected/8c31d222-d349-476a-9aa2-cc57ec51d926-kube-api-access-r6mls\") pod \"mariadb-operator-controller-manager-56bbcc9d85-wq6hj\" (UID: \"8c31d222-d349-476a-9aa2-cc57ec51d926\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-wq6hj" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.315789 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7m22\" (UniqueName: \"kubernetes.io/projected/e7d93aaa-3c37-4793-8535-6dcce7bb79b0-kube-api-access-d7m22\") pod \"manila-operator-controller-manager-7c79b5df47-xt4ph\" (UID: \"e7d93aaa-3c37-4793-8535-6dcce7bb79b0\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-xt4ph" Dec 05 20:28:52 crc kubenswrapper[4904]: E1205 20:28:52.316609 4904 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 20:28:52 crc kubenswrapper[4904]: E1205 20:28:52.316651 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4453c00a-291d-4edb-ab56-9b0fdf3b1ea5-cert podName:4453c00a-291d-4edb-ab56-9b0fdf3b1ea5 nodeName:}" failed. No retries permitted until 2025-12-05 20:28:52.816633668 +0000 UTC m=+1031.627849777 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4453c00a-291d-4edb-ab56-9b0fdf3b1ea5-cert") pod "infra-operator-controller-manager-57548d458d-4zb86" (UID: "4453c00a-291d-4edb-ab56-9b0fdf3b1ea5") : secret "infra-operator-webhook-server-cert" not found Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.316837 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-l65z8"] Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.340281 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd49rzr9"] Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.341611 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd49rzr9" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.342111 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcv9s\" (UniqueName: \"kubernetes.io/projected/60eb443e-807c-41f8-8935-7bfacc9dc89b-kube-api-access-gcv9s\") pod \"ironic-operator-controller-manager-6c548fd776-9zhph\" (UID: \"60eb443e-807c-41f8-8935-7bfacc9dc89b\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-9zhph" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.344745 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.345098 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-54x5b" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.347129 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc5sf\" (UniqueName: \"kubernetes.io/projected/a2b7747b-01b7-4f12-8748-2661f53078f0-kube-api-access-vc5sf\") pod \"keystone-operator-controller-manager-7765d96ddf-h9wsr\" (UID: \"a2b7747b-01b7-4f12-8748-2661f53078f0\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-h9wsr" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.348362 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-9zhph" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.348738 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-f2bbb"] Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.349775 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-f2bbb" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.350850 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9z6q\" (UniqueName: \"kubernetes.io/projected/4453c00a-291d-4edb-ab56-9b0fdf3b1ea5-kube-api-access-q9z6q\") pod \"infra-operator-controller-manager-57548d458d-4zb86\" (UID: \"4453c00a-291d-4edb-ab56-9b0fdf3b1ea5\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-4zb86" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.351336 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-5lw92" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.358835 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd49rzr9"] Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.364933 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-f2bbb"] Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.369165 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-gv6jv"] Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.370431 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-gv6jv" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.377363 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-fd8rm" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.404530 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-xqdzf"] Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.405645 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-xqdzf" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.409401 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-zqnzm" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.416591 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fk7k\" (UniqueName: \"kubernetes.io/projected/2035ff23-1caf-4e9e-bc39-46caef0eb07d-kube-api-access-5fk7k\") pod \"nova-operator-controller-manager-697bc559fc-xv5gb\" (UID: \"2035ff23-1caf-4e9e-bc39-46caef0eb07d\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-xv5gb" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.416681 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6mls\" (UniqueName: \"kubernetes.io/projected/8c31d222-d349-476a-9aa2-cc57ec51d926-kube-api-access-r6mls\") pod \"mariadb-operator-controller-manager-56bbcc9d85-wq6hj\" (UID: \"8c31d222-d349-476a-9aa2-cc57ec51d926\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-wq6hj" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.416727 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ghtc\" (UniqueName: \"kubernetes.io/projected/03744ad0-4115-4d2f-bf2d-5acc45a6d05a-kube-api-access-9ghtc\") pod \"octavia-operator-controller-manager-998648c74-l65z8\" (UID: \"03744ad0-4115-4d2f-bf2d-5acc45a6d05a\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-l65z8" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.416746 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7m22\" (UniqueName: \"kubernetes.io/projected/e7d93aaa-3c37-4793-8535-6dcce7bb79b0-kube-api-access-d7m22\") pod \"manila-operator-controller-manager-7c79b5df47-xt4ph\" (UID: \"e7d93aaa-3c37-4793-8535-6dcce7bb79b0\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-xt4ph" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.416762 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkdvj\" (UniqueName: \"kubernetes.io/projected/7cbe4ebd-943a-4ffe-8cf1-5ade3e005ab1-kube-api-access-hkdvj\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-bzms8\" (UID: \"7cbe4ebd-943a-4ffe-8cf1-5ade3e005ab1\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-bzms8" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.426130 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-xqdzf"] Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.443444 4904 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-jrrnh" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.444012 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-h9wsr" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.446937 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-xd4c5" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.447724 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7m22\" (UniqueName: \"kubernetes.io/projected/e7d93aaa-3c37-4793-8535-6dcce7bb79b0-kube-api-access-d7m22\") pod \"manila-operator-controller-manager-7c79b5df47-xt4ph\" (UID: \"e7d93aaa-3c37-4793-8535-6dcce7bb79b0\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-xt4ph" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.462004 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6mls\" (UniqueName: \"kubernetes.io/projected/8c31d222-d349-476a-9aa2-cc57ec51d926-kube-api-access-r6mls\") pod \"mariadb-operator-controller-manager-56bbcc9d85-wq6hj\" (UID: \"8c31d222-d349-476a-9aa2-cc57ec51d926\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-wq6hj" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.463880 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-gv6jv"] Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.500326 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-n6d4m"] Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.507705 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-n6d4m" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.512531 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-msp64" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.519325 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsbm9\" (UniqueName: \"kubernetes.io/projected/2f0e1e69-f795-4166-b6bd-946050c1524e-kube-api-access-fsbm9\") pod \"swift-operator-controller-manager-5f8c65bbfc-xqdzf\" (UID: \"2f0e1e69-f795-4166-b6bd-946050c1524e\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-xqdzf" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.519422 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qltw\" (UniqueName: \"kubernetes.io/projected/50c0879e-d30b-4a8a-972b-10f3188ff06a-kube-api-access-2qltw\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd49rzr9\" (UID: \"50c0879e-d30b-4a8a-972b-10f3188ff06a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd49rzr9" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.519460 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ghtc\" (UniqueName: \"kubernetes.io/projected/03744ad0-4115-4d2f-bf2d-5acc45a6d05a-kube-api-access-9ghtc\") pod \"octavia-operator-controller-manager-998648c74-l65z8\" (UID: \"03744ad0-4115-4d2f-bf2d-5acc45a6d05a\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-l65z8" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.519509 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkdvj\" (UniqueName: \"kubernetes.io/projected/7cbe4ebd-943a-4ffe-8cf1-5ade3e005ab1-kube-api-access-hkdvj\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-bzms8\" (UID: \"7cbe4ebd-943a-4ffe-8cf1-5ade3e005ab1\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-bzms8" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.519557 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmb4s\" (UniqueName: \"kubernetes.io/projected/116b1af4-b71b-46ab-9977-12342c13594e-kube-api-access-zmb4s\") pod \"ovn-operator-controller-manager-b6456fdb6-f2bbb\" (UID: \"116b1af4-b71b-46ab-9977-12342c13594e\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-f2bbb" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.519593 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fk7k\" (UniqueName: \"kubernetes.io/projected/2035ff23-1caf-4e9e-bc39-46caef0eb07d-kube-api-access-5fk7k\") pod \"nova-operator-controller-manager-697bc559fc-xv5gb\" (UID: \"2035ff23-1caf-4e9e-bc39-46caef0eb07d\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-xv5gb" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.519624 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/50c0879e-d30b-4a8a-972b-10f3188ff06a-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd49rzr9\" (UID: \"50c0879e-d30b-4a8a-972b-10f3188ff06a\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd49rzr9" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.519648 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6nvq\" (UniqueName: \"kubernetes.io/projected/53b1878e-6de9-4961-9cf6-4673c09c0412-kube-api-access-l6nvq\") pod \"placement-operator-controller-manager-78f8948974-gv6jv\" (UID: \"53b1878e-6de9-4961-9cf6-4673c09c0412\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-gv6jv" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.521252 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-xt4ph" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.547706 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkdvj\" (UniqueName: \"kubernetes.io/projected/7cbe4ebd-943a-4ffe-8cf1-5ade3e005ab1-kube-api-access-hkdvj\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-bzms8\" (UID: \"7cbe4ebd-943a-4ffe-8cf1-5ade3e005ab1\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-bzms8" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.548909 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ghtc\" (UniqueName: \"kubernetes.io/projected/03744ad0-4115-4d2f-bf2d-5acc45a6d05a-kube-api-access-9ghtc\") pod \"octavia-operator-controller-manager-998648c74-l65z8\" (UID: \"03744ad0-4115-4d2f-bf2d-5acc45a6d05a\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-l65z8" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.557282 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fk7k\" (UniqueName: \"kubernetes.io/projected/2035ff23-1caf-4e9e-bc39-46caef0eb07d-kube-api-access-5fk7k\") pod \"nova-operator-controller-manager-697bc559fc-xv5gb\" (UID: \"2035ff23-1caf-4e9e-bc39-46caef0eb07d\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-xv5gb" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.564824 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-n6d4m"] Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.580979 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-wq6hj" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.611809 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-bzms8" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.622097 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-kr7mw"] Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.663615 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-kr7mw" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.665809 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-8q5lg" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.673480 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/50c0879e-d30b-4a8a-972b-10f3188ff06a-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd49rzr9\" (UID: \"50c0879e-d30b-4a8a-972b-10f3188ff06a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd49rzr9" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.673511 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6nvq\" (UniqueName: \"kubernetes.io/projected/53b1878e-6de9-4961-9cf6-4673c09c0412-kube-api-access-l6nvq\") pod \"placement-operator-controller-manager-78f8948974-gv6jv\" (UID: \"53b1878e-6de9-4961-9cf6-4673c09c0412\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-gv6jv" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.673585 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsbm9\" (UniqueName: \"kubernetes.io/projected/2f0e1e69-f795-4166-b6bd-946050c1524e-kube-api-access-fsbm9\") pod \"swift-operator-controller-manager-5f8c65bbfc-xqdzf\" (UID: \"2f0e1e69-f795-4166-b6bd-946050c1524e\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-xqdzf" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.673618 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qltw\" (UniqueName: \"kubernetes.io/projected/50c0879e-d30b-4a8a-972b-10f3188ff06a-kube-api-access-2qltw\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd49rzr9\" (UID: \"50c0879e-d30b-4a8a-972b-10f3188ff06a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd49rzr9" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.673649 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txbk2\" (UniqueName: \"kubernetes.io/projected/086dd7ac-a5cc-4433-a5a6-0cfd88a69d72-kube-api-access-txbk2\") pod \"telemetry-operator-controller-manager-76cc84c6bb-n6d4m\" (UID: \"086dd7ac-a5cc-4433-a5a6-0cfd88a69d72\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-n6d4m" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.673683 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmb4s\" (UniqueName: \"kubernetes.io/projected/116b1af4-b71b-46ab-9977-12342c13594e-kube-api-access-zmb4s\") pod \"ovn-operator-controller-manager-b6456fdb6-f2bbb\" (UID: \"116b1af4-b71b-46ab-9977-12342c13594e\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-f2bbb" Dec 05 20:28:52 crc kubenswrapper[4904]: E1205 20:28:52.674125 4904 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 20:28:52 crc kubenswrapper[4904]: E1205 20:28:52.674169 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50c0879e-d30b-4a8a-972b-10f3188ff06a-cert 
podName:50c0879e-d30b-4a8a-972b-10f3188ff06a nodeName:}" failed. No retries permitted until 2025-12-05 20:28:53.174155568 +0000 UTC m=+1031.985371677 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/50c0879e-d30b-4a8a-972b-10f3188ff06a-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd49rzr9" (UID: "50c0879e-d30b-4a8a-972b-10f3188ff06a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.674786 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-xv5gb" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.694643 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-l65z8" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.708316 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsbm9\" (UniqueName: \"kubernetes.io/projected/2f0e1e69-f795-4166-b6bd-946050c1524e-kube-api-access-fsbm9\") pod \"swift-operator-controller-manager-5f8c65bbfc-xqdzf\" (UID: \"2f0e1e69-f795-4166-b6bd-946050c1524e\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-xqdzf" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.715243 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmb4s\" (UniqueName: \"kubernetes.io/projected/116b1af4-b71b-46ab-9977-12342c13594e-kube-api-access-zmb4s\") pod \"ovn-operator-controller-manager-b6456fdb6-f2bbb\" (UID: \"116b1af4-b71b-46ab-9977-12342c13594e\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-f2bbb" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.715286 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6nvq\" (UniqueName: \"kubernetes.io/projected/53b1878e-6de9-4961-9cf6-4673c09c0412-kube-api-access-l6nvq\") pod \"placement-operator-controller-manager-78f8948974-gv6jv\" (UID: \"53b1878e-6de9-4961-9cf6-4673c09c0412\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-gv6jv" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.715976 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qltw\" (UniqueName: \"kubernetes.io/projected/50c0879e-d30b-4a8a-972b-10f3188ff06a-kube-api-access-2qltw\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd49rzr9\" (UID: \"50c0879e-d30b-4a8a-972b-10f3188ff06a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd49rzr9" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.733599 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-f2bbb" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.736511 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-kr7mw"] Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.778542 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkfrv\" (UniqueName: \"kubernetes.io/projected/54f8b0f1-58e6-445e-a7ea-ba4eb6e298a9-kube-api-access-rkfrv\") pod \"test-operator-controller-manager-5854674fcc-kr7mw\" (UID: \"54f8b0f1-58e6-445e-a7ea-ba4eb6e298a9\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-kr7mw" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.778590 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txbk2\" (UniqueName: \"kubernetes.io/projected/086dd7ac-a5cc-4433-a5a6-0cfd88a69d72-kube-api-access-txbk2\") pod \"telemetry-operator-controller-manager-76cc84c6bb-n6d4m\" (UID: \"086dd7ac-a5cc-4433-a5a6-0cfd88a69d72\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-n6d4m" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.779684 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-gv6jv" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.783680 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7f6cb9b975-pjwn4"] Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.785359 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-7f6cb9b975-pjwn4" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.789284 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-4hpc8" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.804775 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txbk2\" (UniqueName: \"kubernetes.io/projected/086dd7ac-a5cc-4433-a5a6-0cfd88a69d72-kube-api-access-txbk2\") pod \"telemetry-operator-controller-manager-76cc84c6bb-n6d4m\" (UID: \"086dd7ac-a5cc-4433-a5a6-0cfd88a69d72\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-n6d4m" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.810567 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7f6cb9b975-pjwn4"] Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.823897 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-xqdzf" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.824655 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7dc867b75-jxjt2"] Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.825951 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-jxjt2" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.827821 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-9dgfk" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.828100 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.831789 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.837668 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7dc867b75-jxjt2"] Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.845509 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4ddwg"] Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.848344 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4ddwg" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.851307 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-mkvbx" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.853760 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4ddwg"] Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.861243 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-n6d4m" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.891295 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0964d0c8-bed0-4c26-969d-c8e895793312-metrics-certs\") pod \"openstack-operator-controller-manager-7dc867b75-jxjt2\" (UID: \"0964d0c8-bed0-4c26-969d-c8e895793312\") " pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-jxjt2" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.891373 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0964d0c8-bed0-4c26-969d-c8e895793312-webhook-certs\") pod \"openstack-operator-controller-manager-7dc867b75-jxjt2\" (UID: \"0964d0c8-bed0-4c26-969d-c8e895793312\") " pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-jxjt2" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.891415 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcxx5\" (UniqueName: \"kubernetes.io/projected/e0155f33-dd5a-4c6f-b261-2b7026149e9c-kube-api-access-bcxx5\") pod \"watcher-operator-controller-manager-7f6cb9b975-pjwn4\" (UID: \"e0155f33-dd5a-4c6f-b261-2b7026149e9c\") " pod="openstack-operators/watcher-operator-controller-manager-7f6cb9b975-pjwn4" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.891453 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cr45\" (UniqueName: \"kubernetes.io/projected/0964d0c8-bed0-4c26-969d-c8e895793312-kube-api-access-8cr45\") pod \"openstack-operator-controller-manager-7dc867b75-jxjt2\" (UID: \"0964d0c8-bed0-4c26-969d-c8e895793312\") " pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-jxjt2" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.891499 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4453c00a-291d-4edb-ab56-9b0fdf3b1ea5-cert\") pod \"infra-operator-controller-manager-57548d458d-4zb86\" (UID: \"4453c00a-291d-4edb-ab56-9b0fdf3b1ea5\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-4zb86" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.891581 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkfrv\" (UniqueName: \"kubernetes.io/projected/54f8b0f1-58e6-445e-a7ea-ba4eb6e298a9-kube-api-access-rkfrv\") pod \"test-operator-controller-manager-5854674fcc-kr7mw\" (UID: \"54f8b0f1-58e6-445e-a7ea-ba4eb6e298a9\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-kr7mw" Dec 05 20:28:52 crc kubenswrapper[4904]: E1205 20:28:52.894280 4904 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 20:28:52 crc kubenswrapper[4904]: E1205 20:28:52.894360 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4453c00a-291d-4edb-ab56-9b0fdf3b1ea5-cert podName:4453c00a-291d-4edb-ab56-9b0fdf3b1ea5 nodeName:}" failed. No retries permitted until 2025-12-05 20:28:53.894338375 +0000 UTC m=+1032.705554484 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4453c00a-291d-4edb-ab56-9b0fdf3b1ea5-cert") pod "infra-operator-controller-manager-57548d458d-4zb86" (UID: "4453c00a-291d-4edb-ab56-9b0fdf3b1ea5") : secret "infra-operator-webhook-server-cert" not found Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.930038 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-lvn6q"] Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.935171 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkfrv\" (UniqueName: \"kubernetes.io/projected/54f8b0f1-58e6-445e-a7ea-ba4eb6e298a9-kube-api-access-rkfrv\") pod \"test-operator-controller-manager-5854674fcc-kr7mw\" (UID: \"54f8b0f1-58e6-445e-a7ea-ba4eb6e298a9\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-kr7mw" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.938789 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-lc48g"] Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.980179 4904 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.996728 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0964d0c8-bed0-4c26-969d-c8e895793312-metrics-certs\") pod \"openstack-operator-controller-manager-7dc867b75-jxjt2\" (UID: \"0964d0c8-bed0-4c26-969d-c8e895793312\") " pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-jxjt2" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.996802 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0964d0c8-bed0-4c26-969d-c8e895793312-webhook-certs\") pod \"openstack-operator-controller-manager-7dc867b75-jxjt2\" (UID: \"0964d0c8-bed0-4c26-969d-c8e895793312\") " pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-jxjt2" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.996841 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcxx5\" (UniqueName: \"kubernetes.io/projected/e0155f33-dd5a-4c6f-b261-2b7026149e9c-kube-api-access-bcxx5\") pod \"watcher-operator-controller-manager-7f6cb9b975-pjwn4\" (UID: \"e0155f33-dd5a-4c6f-b261-2b7026149e9c\") " pod="openstack-operators/watcher-operator-controller-manager-7f6cb9b975-pjwn4" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.996871 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cr45\" (UniqueName: \"kubernetes.io/projected/0964d0c8-bed0-4c26-969d-c8e895793312-kube-api-access-8cr45\") pod \"openstack-operator-controller-manager-7dc867b75-jxjt2\" (UID: \"0964d0c8-bed0-4c26-969d-c8e895793312\") " pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-jxjt2" Dec 05 20:28:52 crc kubenswrapper[4904]: I1205 20:28:52.996953 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp256\" (UniqueName: \"kubernetes.io/projected/51c461fe-c535-4a35-8409-f319c4549ecf-kube-api-access-sp256\") pod \"rabbitmq-cluster-operator-manager-668c99d594-4ddwg\" (UID: \"51c461fe-c535-4a35-8409-f319c4549ecf\") " 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4ddwg" Dec 05 20:28:52 crc kubenswrapper[4904]: E1205 20:28:52.997126 4904 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 20:28:52 crc kubenswrapper[4904]: E1205 20:28:52.997186 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0964d0c8-bed0-4c26-969d-c8e895793312-metrics-certs podName:0964d0c8-bed0-4c26-969d-c8e895793312 nodeName:}" failed. No retries permitted until 2025-12-05 20:28:53.497166712 +0000 UTC m=+1032.308382821 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0964d0c8-bed0-4c26-969d-c8e895793312-metrics-certs") pod "openstack-operator-controller-manager-7dc867b75-jxjt2" (UID: "0964d0c8-bed0-4c26-969d-c8e895793312") : secret "metrics-server-cert" not found Dec 05 20:28:53 crc kubenswrapper[4904]: E1205 20:28:52.997764 4904 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 20:28:53 crc kubenswrapper[4904]: E1205 20:28:52.997847 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0964d0c8-bed0-4c26-969d-c8e895793312-webhook-certs podName:0964d0c8-bed0-4c26-969d-c8e895793312 nodeName:}" failed. No retries permitted until 2025-12-05 20:28:53.49782655 +0000 UTC m=+1032.309042709 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0964d0c8-bed0-4c26-969d-c8e895793312-webhook-certs") pod "openstack-operator-controller-manager-7dc867b75-jxjt2" (UID: "0964d0c8-bed0-4c26-969d-c8e895793312") : secret "webhook-server-cert" not found Dec 05 20:28:53 crc kubenswrapper[4904]: W1205 20:28:53.004780 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17322e61_a1f2_4228_8784_ea6869288aaa.slice/crio-a632992774b4bb6fd84a9df8a2115f9f7324a51dc8018ecb82bd782b0c27a227 WatchSource:0}: Error finding container a632992774b4bb6fd84a9df8a2115f9f7324a51dc8018ecb82bd782b0c27a227: Status 404 returned error can't find the container with id a632992774b4bb6fd84a9df8a2115f9f7324a51dc8018ecb82bd782b0c27a227 Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.015758 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcxx5\" (UniqueName: \"kubernetes.io/projected/e0155f33-dd5a-4c6f-b261-2b7026149e9c-kube-api-access-bcxx5\") pod \"watcher-operator-controller-manager-7f6cb9b975-pjwn4\" (UID: \"e0155f33-dd5a-4c6f-b261-2b7026149e9c\") " pod="openstack-operators/watcher-operator-controller-manager-7f6cb9b975-pjwn4" Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.021960 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cr45\" (UniqueName: \"kubernetes.io/projected/0964d0c8-bed0-4c26-969d-c8e895793312-kube-api-access-8cr45\") pod \"openstack-operator-controller-manager-7dc867b75-jxjt2\" (UID: \"0964d0c8-bed0-4c26-969d-c8e895793312\") " pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-jxjt2" Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.033205 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-kr7mw" Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.098163 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp256\" (UniqueName: \"kubernetes.io/projected/51c461fe-c535-4a35-8409-f319c4549ecf-kube-api-access-sp256\") pod \"rabbitmq-cluster-operator-manager-668c99d594-4ddwg\" (UID: \"51c461fe-c535-4a35-8409-f319c4549ecf\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4ddwg" Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.116982 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp256\" (UniqueName: \"kubernetes.io/projected/51c461fe-c535-4a35-8409-f319c4549ecf-kube-api-access-sp256\") pod \"rabbitmq-cluster-operator-manager-668c99d594-4ddwg\" (UID: \"51c461fe-c535-4a35-8409-f319c4549ecf\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4ddwg" Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.162049 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-7f6cb9b975-pjwn4" Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.203880 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/50c0879e-d30b-4a8a-972b-10f3188ff06a-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd49rzr9\" (UID: \"50c0879e-d30b-4a8a-972b-10f3188ff06a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd49rzr9" Dec 05 20:28:53 crc kubenswrapper[4904]: E1205 20:28:53.204107 4904 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 20:28:53 crc kubenswrapper[4904]: E1205 20:28:53.204194 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50c0879e-d30b-4a8a-972b-10f3188ff06a-cert podName:50c0879e-d30b-4a8a-972b-10f3188ff06a nodeName:}" failed. No retries permitted until 2025-12-05 20:28:54.204176143 +0000 UTC m=+1033.015392252 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/50c0879e-d30b-4a8a-972b-10f3188ff06a-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd49rzr9" (UID: "50c0879e-d30b-4a8a-972b-10f3188ff06a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.240607 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4ddwg" Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.317106 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-7f864"] Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.327884 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-z4chx"] Dec 05 20:28:53 crc kubenswrapper[4904]: W1205 20:28:53.332223 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ee93c89_5d32_4114_b134_c084359d11ec.slice/crio-81596151bce5562bbbd68218ab7a76eaf04d5ce9c4c7683f93d4d332da7910c1 WatchSource:0}: Error finding container 81596151bce5562bbbd68218ab7a76eaf04d5ce9c4c7683f93d4d332da7910c1: Status 404 returned error can't find the container with id 81596151bce5562bbbd68218ab7a76eaf04d5ce9c4c7683f93d4d332da7910c1 Dec 05 20:28:53 crc kubenswrapper[4904]: W1205 20:28:53.335333 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod799e9803_03c5_4406_8b50_e59dedc0918d.slice/crio-b0125205bd931fb3c0b57b76188079dd47413ba293efee75ce9853fa920645dc WatchSource:0}: Error finding container b0125205bd931fb3c0b57b76188079dd47413ba293efee75ce9853fa920645dc: Status 404 returned error can't find the container with id b0125205bd931fb3c0b57b76188079dd47413ba293efee75ce9853fa920645dc Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.510148 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0964d0c8-bed0-4c26-969d-c8e895793312-metrics-certs\") pod \"openstack-operator-controller-manager-7dc867b75-jxjt2\" (UID: \"0964d0c8-bed0-4c26-969d-c8e895793312\") " pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-jxjt2" Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.510301 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0964d0c8-bed0-4c26-969d-c8e895793312-webhook-certs\") pod \"openstack-operator-controller-manager-7dc867b75-jxjt2\" (UID: \"0964d0c8-bed0-4c26-969d-c8e895793312\") " pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-jxjt2" Dec 05 20:28:53 crc kubenswrapper[4904]: E1205 20:28:53.510384 4904 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 20:28:53 crc kubenswrapper[4904]: E1205 20:28:53.510477 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0964d0c8-bed0-4c26-969d-c8e895793312-metrics-certs podName:0964d0c8-bed0-4c26-969d-c8e895793312 nodeName:}" failed. No retries permitted until 2025-12-05 20:28:54.51044933 +0000 UTC m=+1033.321665459 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0964d0c8-bed0-4c26-969d-c8e895793312-metrics-certs") pod "openstack-operator-controller-manager-7dc867b75-jxjt2" (UID: "0964d0c8-bed0-4c26-969d-c8e895793312") : secret "metrics-server-cert" not found Dec 05 20:28:53 crc kubenswrapper[4904]: E1205 20:28:53.510535 4904 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 20:28:53 crc kubenswrapper[4904]: E1205 20:28:53.510687 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0964d0c8-bed0-4c26-969d-c8e895793312-webhook-certs podName:0964d0c8-bed0-4c26-969d-c8e895793312 nodeName:}" failed. No retries permitted until 2025-12-05 20:28:54.510641805 +0000 UTC m=+1033.321857994 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0964d0c8-bed0-4c26-969d-c8e895793312-webhook-certs") pod "openstack-operator-controller-manager-7dc867b75-jxjt2" (UID: "0964d0c8-bed0-4c26-969d-c8e895793312") : secret "webhook-server-cert" not found Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.563576 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-gv6jv"] Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.624639 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-xt4ph"] Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.635308 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-wq6hj"] Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.658109 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-h9wsr"] Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.662620 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-xv5gb"] Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.712130 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-l65z8"] Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.712504 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-jrrnh"] Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.712583 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-n6d4m"] Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.717872 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-xd4c5"] Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.759982 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-bzms8"] Dec 05 20:28:53 crc kubenswrapper[4904]: W1205 20:28:53.766254 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cbe4ebd_943a_4ffe_8cf1_5ade3e005ab1.slice/crio-42c2afe8265aac11abfdd87fddc0348dcdb71535d9a095002c7034f26b003eaa WatchSource:0}: Error finding container 42c2afe8265aac11abfdd87fddc0348dcdb71535d9a095002c7034f26b003eaa: Status 404 
returned error can't find the container with id 42c2afe8265aac11abfdd87fddc0348dcdb71535d9a095002c7034f26b003eaa Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.773332 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-9zhph"] Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.791461 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-kr7mw"] Dec 05 20:28:53 crc kubenswrapper[4904]: E1205 20:28:53.793357 4904 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vc5sf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-h9wsr_openstack-operators(a2b7747b-01b7-4f12-8748-2661f53078f0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:28:53 crc kubenswrapper[4904]: E1205 20:28:53.798147 4904 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rkfrv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-kr7mw_openstack-operators(54f8b0f1-58e6-445e-a7ea-ba4eb6e298a9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:28:53 crc kubenswrapper[4904]: E1205 20:28:53.799860 4904 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vc5sf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-h9wsr_openstack-operators(a2b7747b-01b7-4f12-8748-2661f53078f0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:28:53 crc kubenswrapper[4904]: E1205 20:28:53.804582 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-h9wsr" podUID="a2b7747b-01b7-4f12-8748-2661f53078f0" Dec 05 20:28:53 crc kubenswrapper[4904]: E1205 20:28:53.804727 4904 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fsbm9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-xqdzf_openstack-operators(2f0e1e69-f795-4166-b6bd-946050c1524e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:28:53 crc kubenswrapper[4904]: E1205 20:28:53.804811 4904 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rkfrv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-kr7mw_openstack-operators(54f8b0f1-58e6-445e-a7ea-ba4eb6e298a9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:28:53 crc kubenswrapper[4904]: E1205 20:28:53.806347 4904 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.9:5001/openstack-k8s-operators/watcher-operator:9e2b1e4b7b3896a4c4f152962f74457a6de43346,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m 
DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bcxx5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-7f6cb9b975-pjwn4_openstack-operators(e0155f33-dd5a-4c6f-b261-2b7026149e9c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:28:53 crc kubenswrapper[4904]: E1205 20:28:53.806795 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-kr7mw" podUID="54f8b0f1-58e6-445e-a7ea-ba4eb6e298a9" Dec 05 20:28:53 crc kubenswrapper[4904]: E1205 20:28:53.807739 4904 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fsbm9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-xqdzf_openstack-operators(2f0e1e69-f795-4166-b6bd-946050c1524e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.808904 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-xqdzf"] Dec 05 20:28:53 crc kubenswrapper[4904]: E1205 20:28:53.808915 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-xqdzf" podUID="2f0e1e69-f795-4166-b6bd-946050c1524e" Dec 05 20:28:53 crc kubenswrapper[4904]: E1205 20:28:53.810797 4904 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bcxx5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-7f6cb9b975-pjwn4_openstack-operators(e0155f33-dd5a-4c6f-b261-2b7026149e9c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:28:53 crc kubenswrapper[4904]: E1205 20:28:53.811994 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to 
\"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/watcher-operator-controller-manager-7f6cb9b975-pjwn4" podUID="e0155f33-dd5a-4c6f-b261-2b7026149e9c" Dec 05 20:28:53 crc kubenswrapper[4904]: E1205 20:28:53.815706 4904 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zmb4s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-f2bbb_openstack-operators(116b1af4-b71b-46ab-9977-12342c13594e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.816284 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-f2bbb"] Dec 05 20:28:53 crc kubenswrapper[4904]: E1205 20:28:53.817380 4904 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: 
{{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zmb4s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-f2bbb_openstack-operators(116b1af4-b71b-46ab-9977-12342c13594e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:28:53 crc kubenswrapper[4904]: E1205 20:28:53.818579 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-f2bbb" podUID="116b1af4-b71b-46ab-9977-12342c13594e" Dec 05 20:28:53 crc kubenswrapper[4904]: W1205 20:28:53.823106 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51c461fe_c535_4a35_8409_f319c4549ecf.slice/crio-b9362f7b828c6c491015fa10857aac519b9d7c1330272d9d1858196354e79eb6 WatchSource:0}: Error finding container b9362f7b828c6c491015fa10857aac519b9d7c1330272d9d1858196354e79eb6: Status 404 returned error can't find the container with id b9362f7b828c6c491015fa10857aac519b9d7c1330272d9d1858196354e79eb6 Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.825976 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7f6cb9b975-pjwn4"] Dec 05 20:28:53 crc kubenswrapper[4904]: E1205 20:28:53.828905 4904 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sp256,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-4ddwg_openstack-operators(51c461fe-c535-4a35-8409-f319c4549ecf): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:28:53 crc kubenswrapper[4904]: E1205 20:28:53.830268 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4ddwg" podUID="51c461fe-c535-4a35-8409-f319c4549ecf" Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.835072 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4ddwg"] Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.853926 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-z4chx" event={"ID":"799e9803-03c5-4406-8b50-e59dedc0918d","Type":"ContainerStarted","Data":"b0125205bd931fb3c0b57b76188079dd47413ba293efee75ce9853fa920645dc"} Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.856725 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-xt4ph" event={"ID":"e7d93aaa-3c37-4793-8535-6dcce7bb79b0","Type":"ContainerStarted","Data":"2cb014b964dc27d8f09195643fae5eaf804f6d8870ef36aedb176064bca041f4"} Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.858476 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-lvn6q" event={"ID":"17322e61-a1f2-4228-8784-ea6869288aaa","Type":"ContainerStarted","Data":"a632992774b4bb6fd84a9df8a2115f9f7324a51dc8018ecb82bd782b0c27a227"} Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.860266 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-7f864" event={"ID":"8ee93c89-5d32-4114-b134-c084359d11ec","Type":"ContainerStarted","Data":"81596151bce5562bbbd68218ab7a76eaf04d5ce9c4c7683f93d4d332da7910c1"} Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.862819 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-gv6jv" event={"ID":"53b1878e-6de9-4961-9cf6-4673c09c0412","Type":"ContainerStarted","Data":"69a33b8b27304a8b1c7c91bff993a44450596b8434eece5a3c2fce5a21cf024a"} Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.864004 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-f2bbb" 
event={"ID":"116b1af4-b71b-46ab-9977-12342c13594e","Type":"ContainerStarted","Data":"9f1e6b6fd8f12bd227b4cb999a6d478cfcd0759433a337e08cda93c758b2a9b6"} Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.868737 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7f6cb9b975-pjwn4" event={"ID":"e0155f33-dd5a-4c6f-b261-2b7026149e9c","Type":"ContainerStarted","Data":"6ea29af362e64ea3d730f22454280dd04ed616635d27bb432ab2b2a1c37e0eb2"} Dec 05 20:28:53 crc kubenswrapper[4904]: E1205 20:28:53.869299 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-f2bbb" podUID="116b1af4-b71b-46ab-9977-12342c13594e" Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.869730 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-n6d4m" event={"ID":"086dd7ac-a5cc-4433-a5a6-0cfd88a69d72","Type":"ContainerStarted","Data":"deb1790c3b3cfe8deca245aa288d6c3a3ad06228943eac9a9078d20e7fdd8417"} Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.870667 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-bzms8" event={"ID":"7cbe4ebd-943a-4ffe-8cf1-5ade3e005ab1","Type":"ContainerStarted","Data":"42c2afe8265aac11abfdd87fddc0348dcdb71535d9a095002c7034f26b003eaa"} Dec 05 20:28:53 crc kubenswrapper[4904]: E1205 20:28:53.871207 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.9:5001/openstack-k8s-operators/watcher-operator:9e2b1e4b7b3896a4c4f152962f74457a6de43346\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-7f6cb9b975-pjwn4" podUID="e0155f33-dd5a-4c6f-b261-2b7026149e9c" Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.871790 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4ddwg" event={"ID":"51c461fe-c535-4a35-8409-f319c4549ecf","Type":"ContainerStarted","Data":"b9362f7b828c6c491015fa10857aac519b9d7c1330272d9d1858196354e79eb6"} Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.873416 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-xqdzf" event={"ID":"2f0e1e69-f795-4166-b6bd-946050c1524e","Type":"ContainerStarted","Data":"5da984c9d284819629b794746e887bd95f87e57107c86c1b4c6fff41f21c97d8"} Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.874236 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-wq6hj" event={"ID":"8c31d222-d349-476a-9aa2-cc57ec51d926","Type":"ContainerStarted","Data":"8aa632f7aa98113985cfb2699122a4977c90eaae929ae6ae9481e8c8302309c3"} Dec 05 20:28:53 crc kubenswrapper[4904]: E1205 
20:28:53.874402 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-xqdzf" podUID="2f0e1e69-f795-4166-b6bd-946050c1524e" Dec 05 20:28:53 crc kubenswrapper[4904]: E1205 20:28:53.873602 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4ddwg" podUID="51c461fe-c535-4a35-8409-f319c4549ecf" Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.875555 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-h9wsr" event={"ID":"a2b7747b-01b7-4f12-8748-2661f53078f0","Type":"ContainerStarted","Data":"77225ecbbfc2b992e1399a2b1df777fd16b218e594ad3f05d9ae4910cc01566c"} Dec 05 20:28:53 crc kubenswrapper[4904]: E1205 20:28:53.876781 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-h9wsr" podUID="a2b7747b-01b7-4f12-8748-2661f53078f0" Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.877599 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-jrrnh" event={"ID":"16525ebe-6de2-4974-a4ee-ed99a0e4ea1f","Type":"ContainerStarted","Data":"437c2ae7f3fab722024e5c8b0cd909559690444ad0ab2ce330b4347a92cc14a8"} Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.879955 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-xv5gb" event={"ID":"2035ff23-1caf-4e9e-bc39-46caef0eb07d","Type":"ContainerStarted","Data":"e645c22546e50167cf3295ebb98bdd2442f774c9fb4e111501d9475a00dcce35"} Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.884887 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-xd4c5" event={"ID":"0fefb560-28a2-4316-9448-8361111d4837","Type":"ContainerStarted","Data":"3c71fde047c76c5f26022f8314b43b8bbb1090ef0c839a6b79ef170acde9780f"} Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.885621 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-9zhph" event={"ID":"60eb443e-807c-41f8-8935-7bfacc9dc89b","Type":"ContainerStarted","Data":"9fa2defd7df5080ca5792635e8c3d41fd61a332f3646e09d67334a8dd79b5ec2"} Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.886881 4904 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-kr7mw" event={"ID":"54f8b0f1-58e6-445e-a7ea-ba4eb6e298a9","Type":"ContainerStarted","Data":"7a392c67755e3a6000fe5708246c8f9a8f6bb237f324985dde3007b0951617e1"} Dec 05 20:28:53 crc kubenswrapper[4904]: E1205 20:28:53.888388 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-kr7mw" podUID="54f8b0f1-58e6-445e-a7ea-ba4eb6e298a9" Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.888504 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-lc48g" event={"ID":"7ae1ca6f-894c-4c12-ab0a-459e97fa442e","Type":"ContainerStarted","Data":"d5ad5d452120b471d7243856f502f447a25187354fbb85ef39d52c09c2060774"} Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.889745 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-l65z8" event={"ID":"03744ad0-4115-4d2f-bf2d-5acc45a6d05a","Type":"ContainerStarted","Data":"cc76727e8fae97395c7856b21cbefe0abee5b15c3870f8b2626fbffbc0b6fad0"} Dec 05 20:28:53 crc kubenswrapper[4904]: I1205 20:28:53.918233 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4453c00a-291d-4edb-ab56-9b0fdf3b1ea5-cert\") pod \"infra-operator-controller-manager-57548d458d-4zb86\" (UID: \"4453c00a-291d-4edb-ab56-9b0fdf3b1ea5\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-4zb86" Dec 05 20:28:53 crc kubenswrapper[4904]: E1205 20:28:53.918705 4904 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 20:28:53 crc kubenswrapper[4904]: E1205 20:28:53.918819 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4453c00a-291d-4edb-ab56-9b0fdf3b1ea5-cert podName:4453c00a-291d-4edb-ab56-9b0fdf3b1ea5 nodeName:}" failed. No retries permitted until 2025-12-05 20:28:55.918801154 +0000 UTC m=+1034.730017253 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4453c00a-291d-4edb-ab56-9b0fdf3b1ea5-cert") pod "infra-operator-controller-manager-57548d458d-4zb86" (UID: "4453c00a-291d-4edb-ab56-9b0fdf3b1ea5") : secret "infra-operator-webhook-server-cert" not found Dec 05 20:28:54 crc kubenswrapper[4904]: I1205 20:28:54.225850 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/50c0879e-d30b-4a8a-972b-10f3188ff06a-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd49rzr9\" (UID: \"50c0879e-d30b-4a8a-972b-10f3188ff06a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd49rzr9" Dec 05 20:28:54 crc kubenswrapper[4904]: E1205 20:28:54.226234 4904 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 20:28:54 crc kubenswrapper[4904]: E1205 20:28:54.226367 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50c0879e-d30b-4a8a-972b-10f3188ff06a-cert podName:50c0879e-d30b-4a8a-972b-10f3188ff06a nodeName:}" failed. No retries permitted until 2025-12-05 20:28:56.226344147 +0000 UTC m=+1035.037560256 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/50c0879e-d30b-4a8a-972b-10f3188ff06a-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd49rzr9" (UID: "50c0879e-d30b-4a8a-972b-10f3188ff06a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 20:28:54 crc kubenswrapper[4904]: I1205 20:28:54.529824 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0964d0c8-bed0-4c26-969d-c8e895793312-metrics-certs\") pod \"openstack-operator-controller-manager-7dc867b75-jxjt2\" (UID: \"0964d0c8-bed0-4c26-969d-c8e895793312\") " pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-jxjt2" Dec 05 20:28:54 crc kubenswrapper[4904]: I1205 20:28:54.529880 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0964d0c8-bed0-4c26-969d-c8e895793312-webhook-certs\") pod \"openstack-operator-controller-manager-7dc867b75-jxjt2\" (UID: \"0964d0c8-bed0-4c26-969d-c8e895793312\") " pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-jxjt2" Dec 05 20:28:54 crc kubenswrapper[4904]: E1205 20:28:54.530024 4904 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 20:28:54 crc kubenswrapper[4904]: E1205 20:28:54.530113 4904 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 20:28:54 crc kubenswrapper[4904]: E1205 20:28:54.530124 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0964d0c8-bed0-4c26-969d-c8e895793312-metrics-certs podName:0964d0c8-bed0-4c26-969d-c8e895793312 nodeName:}" failed. No retries permitted until 2025-12-05 20:28:56.530104665 +0000 UTC m=+1035.341320774 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0964d0c8-bed0-4c26-969d-c8e895793312-metrics-certs") pod "openstack-operator-controller-manager-7dc867b75-jxjt2" (UID: "0964d0c8-bed0-4c26-969d-c8e895793312") : secret "metrics-server-cert" not found Dec 05 20:28:54 crc kubenswrapper[4904]: E1205 20:28:54.530240 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0964d0c8-bed0-4c26-969d-c8e895793312-webhook-certs podName:0964d0c8-bed0-4c26-969d-c8e895793312 nodeName:}" failed. No retries permitted until 2025-12-05 20:28:56.530202407 +0000 UTC m=+1035.341418506 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0964d0c8-bed0-4c26-969d-c8e895793312-webhook-certs") pod "openstack-operator-controller-manager-7dc867b75-jxjt2" (UID: "0964d0c8-bed0-4c26-969d-c8e895793312") : secret "webhook-server-cert" not found Dec 05 20:28:54 crc kubenswrapper[4904]: E1205 20:28:54.909277 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-xqdzf" podUID="2f0e1e69-f795-4166-b6bd-946050c1524e" Dec 05 20:28:54 crc kubenswrapper[4904]: E1205 20:28:54.910353 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-f2bbb" podUID="116b1af4-b71b-46ab-9977-12342c13594e" Dec 05 20:28:54 crc kubenswrapper[4904]: E1205 20:28:54.910450 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-h9wsr" podUID="a2b7747b-01b7-4f12-8748-2661f53078f0" Dec 05 20:28:54 crc kubenswrapper[4904]: E1205 20:28:54.911237 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4ddwg" podUID="51c461fe-c535-4a35-8409-f319c4549ecf" Dec 05 20:28:54 crc kubenswrapper[4904]: E1205 20:28:54.913913 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with 
ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.9:5001/openstack-k8s-operators/watcher-operator:9e2b1e4b7b3896a4c4f152962f74457a6de43346\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-7f6cb9b975-pjwn4" podUID="e0155f33-dd5a-4c6f-b261-2b7026149e9c" Dec 05 20:28:54 crc kubenswrapper[4904]: E1205 20:28:54.916230 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-kr7mw" podUID="54f8b0f1-58e6-445e-a7ea-ba4eb6e298a9" Dec 05 20:28:55 crc kubenswrapper[4904]: I1205 20:28:55.955466 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4453c00a-291d-4edb-ab56-9b0fdf3b1ea5-cert\") pod \"infra-operator-controller-manager-57548d458d-4zb86\" (UID: \"4453c00a-291d-4edb-ab56-9b0fdf3b1ea5\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-4zb86" Dec 05 20:28:55 crc kubenswrapper[4904]: E1205 20:28:55.955746 4904 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 20:28:55 crc kubenswrapper[4904]: E1205 20:28:55.955841 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4453c00a-291d-4edb-ab56-9b0fdf3b1ea5-cert podName:4453c00a-291d-4edb-ab56-9b0fdf3b1ea5 nodeName:}" failed. No retries permitted until 2025-12-05 20:28:59.955809249 +0000 UTC m=+1038.767025358 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4453c00a-291d-4edb-ab56-9b0fdf3b1ea5-cert") pod "infra-operator-controller-manager-57548d458d-4zb86" (UID: "4453c00a-291d-4edb-ab56-9b0fdf3b1ea5") : secret "infra-operator-webhook-server-cert" not found Dec 05 20:28:56 crc kubenswrapper[4904]: I1205 20:28:56.259883 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/50c0879e-d30b-4a8a-972b-10f3188ff06a-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd49rzr9\" (UID: \"50c0879e-d30b-4a8a-972b-10f3188ff06a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd49rzr9" Dec 05 20:28:56 crc kubenswrapper[4904]: E1205 20:28:56.260082 4904 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 20:28:56 crc kubenswrapper[4904]: E1205 20:28:56.260222 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50c0879e-d30b-4a8a-972b-10f3188ff06a-cert podName:50c0879e-d30b-4a8a-972b-10f3188ff06a nodeName:}" failed. No retries permitted until 2025-12-05 20:29:00.260202055 +0000 UTC m=+1039.071418164 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/50c0879e-d30b-4a8a-972b-10f3188ff06a-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd49rzr9" (UID: "50c0879e-d30b-4a8a-972b-10f3188ff06a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 20:28:56 crc kubenswrapper[4904]: I1205 20:28:56.564488 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0964d0c8-bed0-4c26-969d-c8e895793312-metrics-certs\") pod \"openstack-operator-controller-manager-7dc867b75-jxjt2\" (UID: \"0964d0c8-bed0-4c26-969d-c8e895793312\") " pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-jxjt2" Dec 05 20:28:56 crc kubenswrapper[4904]: I1205 20:28:56.564568 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0964d0c8-bed0-4c26-969d-c8e895793312-webhook-certs\") pod \"openstack-operator-controller-manager-7dc867b75-jxjt2\" (UID: \"0964d0c8-bed0-4c26-969d-c8e895793312\") " pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-jxjt2" Dec 05 20:28:56 crc kubenswrapper[4904]: E1205 20:28:56.564721 4904 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 20:28:56 crc kubenswrapper[4904]: E1205 20:28:56.564747 4904 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 20:28:56 crc kubenswrapper[4904]: E1205 20:28:56.564793 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0964d0c8-bed0-4c26-969d-c8e895793312-webhook-certs podName:0964d0c8-bed0-4c26-969d-c8e895793312 nodeName:}" failed. No retries permitted until 2025-12-05 20:29:00.564776235 +0000 UTC m=+1039.375992344 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0964d0c8-bed0-4c26-969d-c8e895793312-webhook-certs") pod "openstack-operator-controller-manager-7dc867b75-jxjt2" (UID: "0964d0c8-bed0-4c26-969d-c8e895793312") : secret "webhook-server-cert" not found Dec 05 20:28:56 crc kubenswrapper[4904]: E1205 20:28:56.564811 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0964d0c8-bed0-4c26-969d-c8e895793312-metrics-certs podName:0964d0c8-bed0-4c26-969d-c8e895793312 nodeName:}" failed. No retries permitted until 2025-12-05 20:29:00.564804205 +0000 UTC m=+1039.376020314 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0964d0c8-bed0-4c26-969d-c8e895793312-metrics-certs") pod "openstack-operator-controller-manager-7dc867b75-jxjt2" (UID: "0964d0c8-bed0-4c26-969d-c8e895793312") : secret "metrics-server-cert" not found Dec 05 20:28:59 crc kubenswrapper[4904]: I1205 20:28:59.956113 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:28:59 crc kubenswrapper[4904]: I1205 20:28:59.956943 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:29:00 crc kubenswrapper[4904]: I1205 20:29:00.029272 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4453c00a-291d-4edb-ab56-9b0fdf3b1ea5-cert\") pod \"infra-operator-controller-manager-57548d458d-4zb86\" (UID: \"4453c00a-291d-4edb-ab56-9b0fdf3b1ea5\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-4zb86" Dec 05 20:29:00 crc kubenswrapper[4904]: E1205 20:29:00.029601 4904 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 20:29:00 crc kubenswrapper[4904]: E1205 20:29:00.029658 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4453c00a-291d-4edb-ab56-9b0fdf3b1ea5-cert podName:4453c00a-291d-4edb-ab56-9b0fdf3b1ea5 nodeName:}" failed. No retries permitted until 2025-12-05 20:29:08.029641104 +0000 UTC m=+1046.840857213 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4453c00a-291d-4edb-ab56-9b0fdf3b1ea5-cert") pod "infra-operator-controller-manager-57548d458d-4zb86" (UID: "4453c00a-291d-4edb-ab56-9b0fdf3b1ea5") : secret "infra-operator-webhook-server-cert" not found Dec 05 20:29:00 crc kubenswrapper[4904]: I1205 20:29:00.336113 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/50c0879e-d30b-4a8a-972b-10f3188ff06a-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd49rzr9\" (UID: \"50c0879e-d30b-4a8a-972b-10f3188ff06a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd49rzr9" Dec 05 20:29:00 crc kubenswrapper[4904]: E1205 20:29:00.336281 4904 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 20:29:00 crc kubenswrapper[4904]: E1205 20:29:00.336344 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50c0879e-d30b-4a8a-972b-10f3188ff06a-cert podName:50c0879e-d30b-4a8a-972b-10f3188ff06a nodeName:}" failed. No retries permitted until 2025-12-05 20:29:08.336325784 +0000 UTC m=+1047.147541893 (durationBeforeRetry 8s). 
Dec 05 20:29:00 crc kubenswrapper[4904]: I1205 20:29:00.641070 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0964d0c8-bed0-4c26-969d-c8e895793312-metrics-certs\") pod \"openstack-operator-controller-manager-7dc867b75-jxjt2\" (UID: \"0964d0c8-bed0-4c26-969d-c8e895793312\") " pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-jxjt2"
Dec 05 20:29:00 crc kubenswrapper[4904]: I1205 20:29:00.641151 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0964d0c8-bed0-4c26-969d-c8e895793312-webhook-certs\") pod \"openstack-operator-controller-manager-7dc867b75-jxjt2\" (UID: \"0964d0c8-bed0-4c26-969d-c8e895793312\") " pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-jxjt2"
Dec 05 20:29:00 crc kubenswrapper[4904]: E1205 20:29:00.641282 4904 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Dec 05 20:29:00 crc kubenswrapper[4904]: E1205 20:29:00.641327 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0964d0c8-bed0-4c26-969d-c8e895793312-webhook-certs podName:0964d0c8-bed0-4c26-969d-c8e895793312 nodeName:}" failed. No retries permitted until 2025-12-05 20:29:08.641314067 +0000 UTC m=+1047.452530176 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0964d0c8-bed0-4c26-969d-c8e895793312-webhook-certs") pod "openstack-operator-controller-manager-7dc867b75-jxjt2" (UID: "0964d0c8-bed0-4c26-969d-c8e895793312") : secret "webhook-server-cert" not found
Dec 05 20:29:00 crc kubenswrapper[4904]: E1205 20:29:00.641644 4904 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Dec 05 20:29:00 crc kubenswrapper[4904]: E1205 20:29:00.641671 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0964d0c8-bed0-4c26-969d-c8e895793312-metrics-certs podName:0964d0c8-bed0-4c26-969d-c8e895793312 nodeName:}" failed. No retries permitted until 2025-12-05 20:29:08.641664856 +0000 UTC m=+1047.452880955 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0964d0c8-bed0-4c26-969d-c8e895793312-metrics-certs") pod "openstack-operator-controller-manager-7dc867b75-jxjt2" (UID: "0964d0c8-bed0-4c26-969d-c8e895793312") : secret "metrics-server-cert" not found Dec 05 20:29:07 crc kubenswrapper[4904]: E1205 20:29:07.055101 4904 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:1d60701214b39cdb0fa70bbe5710f9b131139a9f4b482c2db4058a04daefb801" Dec 05 20:29:07 crc kubenswrapper[4904]: E1205 20:29:07.055674 4904 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:1d60701214b39cdb0fa70bbe5710f9b131139a9f4b482c2db4058a04daefb801,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z9mkh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-859b6ccc6-xd4c5_openstack-operators(0fefb560-28a2-4316-9448-8361111d4837): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 20:29:07 crc kubenswrapper[4904]: E1205 20:29:07.513531 4904 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670" Dec 05 20:29:07 crc 
kubenswrapper[4904]: E1205 20:29:07.513728 4904 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5fk7k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-xv5gb_openstack-operators(2035ff23-1caf-4e9e-bc39-46caef0eb07d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 20:29:07 crc kubenswrapper[4904]: E1205 20:29:07.953451 4904 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-txbk2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-n6d4m_openstack-operators(086dd7ac-a5cc-4433-a5a6-0cfd88a69d72): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:29:07 crc kubenswrapper[4904]: E1205 20:29:07.956416 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-n6d4m" podUID="086dd7ac-a5cc-4433-a5a6-0cfd88a69d72" Dec 05 20:29:07 crc kubenswrapper[4904]: E1205 20:29:07.974456 4904 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bt44w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-77987cd8cd-z4chx_openstack-operators(799e9803-03c5-4406-8b50-e59dedc0918d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:29:07 crc kubenswrapper[4904]: E1205 20:29:07.975110 4904 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true 
--v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8zp49,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-78b4bc895b-7f864_openstack-operators(8ee93c89-5d32-4114-b134-c084359d11ec): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:29:07 crc kubenswrapper[4904]: E1205 20:29:07.976240 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-7f864" podUID="8ee93c89-5d32-4114-b134-c084359d11ec" Dec 05 20:29:07 crc kubenswrapper[4904]: E1205 20:29:07.976282 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-z4chx" podUID="799e9803-03c5-4406-8b50-e59dedc0918d" Dec 05 20:29:07 crc kubenswrapper[4904]: E1205 20:29:07.987713 4904 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9ghtc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-l65z8_openstack-operators(03744ad0-4115-4d2f-bf2d-5acc45a6d05a): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 05 20:29:07 crc kubenswrapper[4904]: E1205 20:29:07.989441 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-998648c74-l65z8" podUID="03744ad0-4115-4d2f-bf2d-5acc45a6d05a"
Dec 05 20:29:08 crc kubenswrapper[4904]: I1205 20:29:08.010950 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-lvn6q" event={"ID":"17322e61-a1f2-4228-8784-ea6869288aaa","Type":"ContainerStarted","Data":"3d5b71a76cd33504aa9c5de4279b9eaae66f5985f9f9b5d9d4fd934002e13520"}
Dec 05 20:29:08 crc kubenswrapper[4904]: I1205 20:29:08.012463 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-z4chx" event={"ID":"799e9803-03c5-4406-8b50-e59dedc0918d","Type":"ContainerStarted","Data":"63a45342b9250e561dd4b5d00f8c5144104921ccc837862e7c4bfdb77750879b"}
Dec 05 20:29:08 crc kubenswrapper[4904]: I1205 20:29:08.013264 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-z4chx"
Dec 05 20:29:08 crc kubenswrapper[4904]: E1205 20:29:08.021015 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-z4chx" podUID="799e9803-03c5-4406-8b50-e59dedc0918d"
Dec 05 20:29:08 crc kubenswrapper[4904]: I1205 20:29:08.027783 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-n6d4m" event={"ID":"086dd7ac-a5cc-4433-a5a6-0cfd88a69d72","Type":"ContainerStarted","Data":"4e7faccfdec6fd1b3d812159444287d86f7e7547f509d3fc114a686b506b62bc"}
Dec 05 20:29:08 crc kubenswrapper[4904]: I1205 20:29:08.028610 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-n6d4m"
Dec 05 20:29:08 crc kubenswrapper[4904]: E1205 20:29:08.029350 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-n6d4m" podUID="086dd7ac-a5cc-4433-a5a6-0cfd88a69d72"
Dec 05 20:29:08 crc kubenswrapper[4904]: I1205 20:29:08.037176 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-bzms8" event={"ID":"7cbe4ebd-943a-4ffe-8cf1-5ade3e005ab1","Type":"ContainerStarted","Data":"00d9b9b92bb93b5cc012bb3a998307b60326076f2a4b96386deafa2ee9a44ee9"}
Dec 05 20:29:08 crc kubenswrapper[4904]: I1205 20:29:08.043788 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-lc48g" event={"ID":"7ae1ca6f-894c-4c12-ab0a-459e97fa442e","Type":"ContainerStarted","Data":"b907ec52dacf48e02dd050943ef78858ccb4ca838afb6d2b059f447a0e2a9e8a"}
Dec 05 20:29:08 crc kubenswrapper[4904]: I1205 20:29:08.046898 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-l65z8" event={"ID":"03744ad0-4115-4d2f-bf2d-5acc45a6d05a","Type":"ContainerStarted","Data":"0222d4635ba2f6b8b2473cdc64d475805f8e28ebaf926f88003b88ed2bd4423e"}
Dec 05 20:29:08 crc kubenswrapper[4904]: I1205 20:29:08.047553 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-l65z8"
Dec 05 20:29:08 crc kubenswrapper[4904]: I1205 20:29:08.049590 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4453c00a-291d-4edb-ab56-9b0fdf3b1ea5-cert\") pod \"infra-operator-controller-manager-57548d458d-4zb86\" (UID: \"4453c00a-291d-4edb-ab56-9b0fdf3b1ea5\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-4zb86"
Dec 05 20:29:08 crc kubenswrapper[4904]: E1205 20:29:08.050192 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-998648c74-l65z8" podUID="03744ad0-4115-4d2f-bf2d-5acc45a6d05a"
Dec 05 20:29:08 crc kubenswrapper[4904]: I1205 20:29:08.055677 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4453c00a-291d-4edb-ab56-9b0fdf3b1ea5-cert\") pod \"infra-operator-controller-manager-57548d458d-4zb86\" (UID: \"4453c00a-291d-4edb-ab56-9b0fdf3b1ea5\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-4zb86"
Dec 05 20:29:08 crc kubenswrapper[4904]: I1205 20:29:08.073905 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-7f864" event={"ID":"8ee93c89-5d32-4114-b134-c084359d11ec","Type":"ContainerStarted","Data":"6369bfb7a95702d0109871535fe64abf75b4cd908f58edf82ec09d4d3c45a5e4"}
Dec 05 20:29:08 crc kubenswrapper[4904]: I1205 20:29:08.074655 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-7f864"
Dec 05 20:29:08 crc kubenswrapper[4904]: E1205 20:29:08.075152 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-7f864" podUID="8ee93c89-5d32-4114-b134-c084359d11ec"
Dec 05 20:29:08 crc kubenswrapper[4904]: I1205 20:29:08.076575 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-gv6jv" event={"ID":"53b1878e-6de9-4961-9cf6-4673c09c0412","Type":"ContainerStarted","Data":"f6e2365b4de5b617d0971cbf58d5d62f1f20b5ac6711b7dc62d5fde80f076c87"}
Dec 05 20:29:08 crc kubenswrapper[4904]: I1205 20:29:08.078077 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-jrrnh" event={"ID":"16525ebe-6de2-4974-a4ee-ed99a0e4ea1f","Type":"ContainerStarted","Data":"7e57756df9d2e4baa9adf20e0aad0c66496c2779ff59092228d7848d301dd400"}
Dec 05 20:29:08 crc kubenswrapper[4904]: I1205 20:29:08.111613 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-9zhph" event={"ID":"60eb443e-807c-41f8-8935-7bfacc9dc89b","Type":"ContainerStarted","Data":"250d22c06a0b8838a8e0e0101dd26aa00d65b78c3795539312c4265efd19b3a1"}
Dec 05 20:29:08 crc kubenswrapper[4904]: I1205 20:29:08.176102 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-xt4ph" event={"ID":"e7d93aaa-3c37-4793-8535-6dcce7bb79b0","Type":"ContainerStarted","Data":"3a880a52e7412d374debd37130178cb23a59d3659c15a202f7d27e5e50068e2b"}
Dec 05 20:29:08 crc kubenswrapper[4904]: I1205 20:29:08.191036 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-wq6hj" event={"ID":"8c31d222-d349-476a-9aa2-cc57ec51d926","Type":"ContainerStarted","Data":"ed3f1c6ff238ff54cd5de8f8f97597dcd13c0aabc4c86bdfa558397a530a2b44"}
Dec 05 20:29:08 crc kubenswrapper[4904]: I1205 20:29:08.256466 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-4zb86"
Dec 05 20:29:08 crc kubenswrapper[4904]: I1205 20:29:08.353601 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/50c0879e-d30b-4a8a-972b-10f3188ff06a-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd49rzr9\" (UID: \"50c0879e-d30b-4a8a-972b-10f3188ff06a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd49rzr9"
Dec 05 20:29:08 crc kubenswrapper[4904]: I1205 20:29:08.358792 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/50c0879e-d30b-4a8a-972b-10f3188ff06a-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd49rzr9\" (UID: \"50c0879e-d30b-4a8a-972b-10f3188ff06a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd49rzr9"
Dec 05 20:29:08 crc kubenswrapper[4904]: I1205 20:29:08.605539 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd49rzr9"
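The mount failures stop here because the missing Secrets finally exist: operationExecutor reports MountVolume.SetUp succeeded for the same cert volumes it had been failing on since 20:28:56, and sandbox creation follows. Below is a hedged client-go sketch of the condition the kubelet was effectively waiting on, polling until a Secret exists; the namespace and Secret name are taken from the entries above, the kubeconfig path and 2s/5m timings are assumptions, and the helper is illustrative rather than kubelet code.

```go
// Poll until a Secret exists (illustrative; the kubelet achieves this
// implicitly by retrying MountVolume.SetUp with backoff).
package main

import (
	"context"
	"fmt"
	"time"

	apierrors "k8s.io/apimachinery/pkg/api/errors"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig") // assumption: adjust path
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}

	// Namespace and name from the log entries above.
	ns, name := "openstack-operators", "openstack-baremetal-operator-webhook-server-cert"

	ctx, cancel := context.WithTimeout(context.Background(), 5*time.Minute)
	defer cancel()
	for {
		_, err := cs.CoreV1().Secrets(ns).Get(ctx, name, metav1.GetOptions{})
		if err == nil {
			fmt.Println("secret exists; cert volume can be mounted")
			return
		}
		if !apierrors.IsNotFound(err) {
			panic(err) // a real API error, not just "not found"
		}
		select {
		case <-ctx.Done():
			panic("timed out waiting for secret")
		case <-time.After(2 * time.Second):
		}
	}
}
```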
Dec 05 20:29:08 crc kubenswrapper[4904]: I1205 20:29:08.656961 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0964d0c8-bed0-4c26-969d-c8e895793312-metrics-certs\") pod \"openstack-operator-controller-manager-7dc867b75-jxjt2\" (UID: \"0964d0c8-bed0-4c26-969d-c8e895793312\") " pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-jxjt2"
Dec 05 20:29:08 crc kubenswrapper[4904]: I1205 20:29:08.657006 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0964d0c8-bed0-4c26-969d-c8e895793312-webhook-certs\") pod \"openstack-operator-controller-manager-7dc867b75-jxjt2\" (UID: \"0964d0c8-bed0-4c26-969d-c8e895793312\") " pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-jxjt2"
Dec 05 20:29:08 crc kubenswrapper[4904]: I1205 20:29:08.660791 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0964d0c8-bed0-4c26-969d-c8e895793312-metrics-certs\") pod \"openstack-operator-controller-manager-7dc867b75-jxjt2\" (UID: \"0964d0c8-bed0-4c26-969d-c8e895793312\") " pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-jxjt2"
Dec 05 20:29:08 crc kubenswrapper[4904]: I1205 20:29:08.664272 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0964d0c8-bed0-4c26-969d-c8e895793312-webhook-certs\") pod \"openstack-operator-controller-manager-7dc867b75-jxjt2\" (UID: \"0964d0c8-bed0-4c26-969d-c8e895793312\") " pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-jxjt2"
Dec 05 20:29:08 crc kubenswrapper[4904]: I1205 20:29:08.777002 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-jxjt2"
Dec 05 20:29:09 crc kubenswrapper[4904]: E1205 20:29:09.227231 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-n6d4m" podUID="086dd7ac-a5cc-4433-a5a6-0cfd88a69d72"
Dec 05 20:29:09 crc kubenswrapper[4904]: E1205 20:29:09.227255 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-998648c74-l65z8" podUID="03744ad0-4115-4d2f-bf2d-5acc45a6d05a"
Dec 05 20:29:09 crc kubenswrapper[4904]: E1205 20:29:09.227567 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-7f864" podUID="8ee93c89-5d32-4114-b134-c084359d11ec"
Dec 05 20:29:09 crc kubenswrapper[4904]: E1205 20:29:09.235178 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-z4chx" podUID="799e9803-03c5-4406-8b50-e59dedc0918d"
Dec 05 20:29:09 crc kubenswrapper[4904]: I1205 20:29:09.976528 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7dc867b75-jxjt2"]
Dec 05 20:29:10 crc kubenswrapper[4904]: W1205 20:29:10.083411 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0964d0c8_bed0_4c26_969d_c8e895793312.slice/crio-e714e22985d060c47c7b68146f1fe2012db63e130b5bc36235a1dda43969171d WatchSource:0}: Error finding container e714e22985d060c47c7b68146f1fe2012db63e130b5bc36235a1dda43969171d: Status 404 returned error can't find the container with id e714e22985d060c47c7b68146f1fe2012db63e130b5bc36235a1dda43969171d
Dec 05 20:29:10 crc kubenswrapper[4904]: I1205 20:29:10.234478 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd49rzr9"]
Dec 05 20:29:10 crc kubenswrapper[4904]: I1205 20:29:10.240200 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-jxjt2" event={"ID":"0964d0c8-bed0-4c26-969d-c8e895793312","Type":"ContainerStarted","Data":"e714e22985d060c47c7b68146f1fe2012db63e130b5bc36235a1dda43969171d"}
Dec 05 20:29:10 crc kubenswrapper[4904]: I1205 20:29:10.289207 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-4zb86"]
Dec 05 20:29:11 crc kubenswrapper[4904]: W1205 20:29:11.000931 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50c0879e_d30b_4a8a_972b_10f3188ff06a.slice/crio-55ef3c782ddd61037e6efcc41d49f8bb13ff91c36feb0a650575633a0ad06d1b WatchSource:0}: Error finding container 55ef3c782ddd61037e6efcc41d49f8bb13ff91c36feb0a650575633a0ad06d1b: Status 404 returned error can't find the container with id 55ef3c782ddd61037e6efcc41d49f8bb13ff91c36feb0a650575633a0ad06d1b
Dec 05 20:29:11 crc kubenswrapper[4904]: W1205 20:29:11.007208 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4453c00a_291d_4edb_ab56_9b0fdf3b1ea5.slice/crio-2fd245e1e0943bc1577d560ca6666e70dbf3d7fcfdac7564c7ee37566edcf9d7 WatchSource:0}: Error finding container 2fd245e1e0943bc1577d560ca6666e70dbf3d7fcfdac7564c7ee37566edcf9d7: Status 404 returned error can't find the container with id 2fd245e1e0943bc1577d560ca6666e70dbf3d7fcfdac7564c7ee37566edcf9d7
Dec 05 20:29:11 crc kubenswrapper[4904]: I1205 20:29:11.258485 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd49rzr9" event={"ID":"50c0879e-d30b-4a8a-972b-10f3188ff06a","Type":"ContainerStarted","Data":"55ef3c782ddd61037e6efcc41d49f8bb13ff91c36feb0a650575633a0ad06d1b"}
Dec 05 20:29:11 crc kubenswrapper[4904]: I1205 20:29:11.259334 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-4zb86" event={"ID":"4453c00a-291d-4edb-ab56-9b0fdf3b1ea5","Type":"ContainerStarted","Data":"2fd245e1e0943bc1577d560ca6666e70dbf3d7fcfdac7564c7ee37566edcf9d7"}
Dec 05 20:29:12 crc kubenswrapper[4904]: I1205 20:29:12.213328 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-z4chx"
Dec 05 20:29:12 crc kubenswrapper[4904]: E1205 20:29:12.215297 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-z4chx" podUID="799e9803-03c5-4406-8b50-e59dedc0918d"
Dec 05 20:29:12 crc kubenswrapper[4904]: I1205 20:29:12.222674 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-7f864"
Dec 05 20:29:12 crc kubenswrapper[4904]: E1205 20:29:12.224627 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-7f864" podUID="8ee93c89-5d32-4114-b134-c084359d11ec"
Dec 05 20:29:12 crc kubenswrapper[4904]: I1205 20:29:12.707670 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-l65z8"
Dec 05 20:29:12 crc kubenswrapper[4904]: E1205 20:29:12.710368 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-998648c74-l65z8" podUID="03744ad0-4115-4d2f-bf2d-5acc45a6d05a"
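Two distinct image-pull failure modes are interleaved in this stretch of the log. "pull QPS exceeded" is kubelet-side throttling: image pulls are metered by a token bucket (the registryPullQPS and registryBurst kubelet settings), and a pull that finds the bucket empty fails immediately rather than queuing. Each such failure then feeds ImagePullBackOff, the kubelet's per-image grow-on-every-retry delay, which is why the kube-rbac-proxy back-off keeps resurfacing for minutes even after the pods' other containers report ready. A minimal model of the throttling half, using golang.org/x/time/rate and the documented defaults (QPS 5, burst 10) as assumptions:

```go
// Token-bucket model of "pull QPS exceeded" (illustrative, not kubelet's
// actual implementation).
package main

import (
	"fmt"

	"golang.org/x/time/rate"
)

func main() {
	// Assumed defaults: refill 5 tokens/second, bucket holds 10.
	pulls := rate.NewLimiter(rate.Limit(5), 10)

	// Many operator pods requesting images at nearly the same instant:
	// roughly the first 10 drain the burst, the rest are rejected.
	for i := 1; i <= 14; i++ {
		if pulls.Allow() {
			fmt.Printf("pull %2d: admitted\n", i)
		} else {
			fmt.Printf("pull %2d: ErrImagePull: pull QPS exceeded\n", i)
		}
	}
}
```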
Dec 05 20:29:12 crc kubenswrapper[4904]: I1205 20:29:12.864745 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-n6d4m"
Dec 05 20:29:12 crc kubenswrapper[4904]: E1205 20:29:12.866675 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-n6d4m" podUID="086dd7ac-a5cc-4433-a5a6-0cfd88a69d72"
Dec 05 20:29:22 crc kubenswrapper[4904]: I1205 20:29:22.347511 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-jxjt2" event={"ID":"0964d0c8-bed0-4c26-969d-c8e895793312","Type":"ContainerStarted","Data":"167f40ee490619f1503150dcb571b22ce91e2b95749c32506899efdcc4a717eb"}
Dec 05 20:29:22 crc kubenswrapper[4904]: I1205 20:29:22.348084 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-jxjt2"
Dec 05 20:29:22 crc kubenswrapper[4904]: I1205 20:29:22.372509 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-jxjt2" podStartSLOduration=30.372489928 podStartE2EDuration="30.372489928s" podCreationTimestamp="2025-12-05 20:28:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:29:22.370024049 +0000 UTC m=+1061.181240178" watchObservedRunningTime="2025-12-05 20:29:22.372489928 +0000 UTC m=+1061.183706047"
Dec 05 20:29:23 crc kubenswrapper[4904]: I1205 20:29:23.355133 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-h9wsr" event={"ID":"a2b7747b-01b7-4f12-8748-2661f53078f0","Type":"ContainerStarted","Data":"c821631568e41fcee94582dfdbf39651047f6c750121d6776fb339b64437f4dd"}
Dec 05 20:29:23 crc kubenswrapper[4904]: I1205 20:29:23.356562 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7f6cb9b975-pjwn4" event={"ID":"e0155f33-dd5a-4c6f-b261-2b7026149e9c","Type":"ContainerStarted","Data":"931758db3b0afa176408ffd5fd298a9d35bdb4d520c492370accffdbb93d479f"}
Dec 05 20:29:23 crc kubenswrapper[4904]: I1205 20:29:23.358140 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4ddwg" event={"ID":"51c461fe-c535-4a35-8409-f319c4549ecf","Type":"ContainerStarted","Data":"0c3807f5cb83fcdd65168dfd6b7bb0fc13f1fdfbf9e4e12b45d56820e6275fad"}
Dec 05 20:29:23 crc kubenswrapper[4904]: I1205 20:29:23.360128 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-4zb86" event={"ID":"4453c00a-291d-4edb-ab56-9b0fdf3b1ea5","Type":"ContainerStarted","Data":"6c8e532d27dc7b30f6ada5924957bdee70e0751678437159b021f70e3a8c75d0"}
Dec 05 20:29:23 crc kubenswrapper[4904]: I1205 20:29:23.361496 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-xd4c5" event={"ID":"0fefb560-28a2-4316-9448-8361111d4837","Type":"ContainerStarted","Data":"9c5c2046040f80d5fe00516c8271938228d45044b833f3f2eff088e47c5e9745"}
Dec 05 20:29:23 crc kubenswrapper[4904]: I1205 20:29:23.364491 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-bzms8" event={"ID":"7cbe4ebd-943a-4ffe-8cf1-5ade3e005ab1","Type":"ContainerStarted","Data":"2a6d79236b4fed92f6a189c29de246e961bff116ce3771df239db511716b8266"}
Dec 05 20:29:23 crc kubenswrapper[4904]: I1205 20:29:23.365758 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-xqdzf" event={"ID":"2f0e1e69-f795-4166-b6bd-946050c1524e","Type":"ContainerStarted","Data":"945511ef65ccccaca266bdaa3c828fe54aa978ea158f4df8f723409185ead615"}
Dec 05 20:29:23 crc kubenswrapper[4904]: I1205 20:29:23.366959 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd49rzr9" event={"ID":"50c0879e-d30b-4a8a-972b-10f3188ff06a","Type":"ContainerStarted","Data":"481d4f4551b874614b108cf6bd423fccf54bbb285a0ed5712d700964bd9cf550"}
Dec 05 20:29:23 crc kubenswrapper[4904]: I1205 20:29:23.367913 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-f2bbb" event={"ID":"116b1af4-b71b-46ab-9977-12342c13594e","Type":"ContainerStarted","Data":"98f9fcb0d2ba522153c79fa7259ee891fed28925becc5c0239a202c691d26923"}
Dec 05 20:29:23 crc kubenswrapper[4904]: I1205 20:29:23.369629 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-kr7mw" event={"ID":"54f8b0f1-58e6-445e-a7ea-ba4eb6e298a9","Type":"ContainerStarted","Data":"18b7538721956eb012d4696d30eb41de7bcf638b7dddb14e6329f194cd3e78d6"}
Dec 05 20:29:23 crc kubenswrapper[4904]: I1205 20:29:23.379194 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4ddwg" podStartSLOduration=3.158011097 podStartE2EDuration="31.379177512s" podCreationTimestamp="2025-12-05 20:28:52 +0000 UTC" firstStartedPulling="2025-12-05 20:28:53.828805644 +0000 UTC m=+1032.640021753" lastFinishedPulling="2025-12-05 20:29:22.049972059 +0000 UTC m=+1060.861188168" observedRunningTime="2025-12-05 20:29:23.374813451 +0000 UTC m=+1062.186029560" watchObservedRunningTime="2025-12-05 20:29:23.379177512 +0000 UTC m=+1062.190393621"
Dec 05 20:29:25 crc kubenswrapper[4904]: I1205 20:29:25.382738 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-wq6hj" event={"ID":"8c31d222-d349-476a-9aa2-cc57ec51d926","Type":"ContainerStarted","Data":"7d11d6adf2ac65bbcb6b735363eca7e483fd9a1c241f93ca9b458bc8d00d4ed2"}
Dec 05 20:29:28 crc kubenswrapper[4904]: I1205 20:29:28.784877 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-jxjt2"
Dec 05 20:29:29 crc kubenswrapper[4904]: I1205 20:29:29.955364 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 20:29:29 crc kubenswrapper[4904]: I1205 20:29:29.955697 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 20:29:29 crc kubenswrapper[4904]: I1205 20:29:29.955746 4904 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h"
Dec 05 20:29:29 crc kubenswrapper[4904]: I1205 20:29:29.956334 4904 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3a6fee22a5899dc7491b94f9ec1bfdefcfd49d1d911077fc82ba25da5a725a66"} pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 05 20:29:29 crc kubenswrapper[4904]: I1205 20:29:29.956402 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" containerID="cri-o://3a6fee22a5899dc7491b94f9ec1bfdefcfd49d1d911077fc82ba25da5a725a66" gracePeriod=600
Dec 05 20:29:30 crc kubenswrapper[4904]: I1205 20:29:30.420231 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-jrrnh" event={"ID":"16525ebe-6de2-4974-a4ee-ed99a0e4ea1f","Type":"ContainerStarted","Data":"59d7f6948396e1a7f72bd0fe401abc8d266d193727dea0a6e48732b5b0a4aa91"}
Dec 05 20:29:30 crc kubenswrapper[4904]: E1205 20:29:30.781145 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-xd4c5" podUID="0fefb560-28a2-4316-9448-8361111d4837"
Dec 05 20:29:31 crc kubenswrapper[4904]: I1205 20:29:31.429123 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-9zhph" event={"ID":"60eb443e-807c-41f8-8935-7bfacc9dc89b","Type":"ContainerStarted","Data":"1ca05883fad74f6fd219e7561aca7132222688848b7cd59fa183fb926ecde699"}
Dec 05 20:29:31 crc kubenswrapper[4904]: I1205 20:29:31.431126 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-lvn6q" event={"ID":"17322e61-a1f2-4228-8784-ea6869288aaa","Type":"ContainerStarted","Data":"2c2204e881b239384227c30ec5d689d15c39808d919a398cd38439a5bb1376d8"}
Dec 05 20:29:31 crc kubenswrapper[4904]: I1205 20:29:31.432725 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-xv5gb" event={"ID":"2035ff23-1caf-4e9e-bc39-46caef0eb07d","Type":"ContainerStarted","Data":"afb86fc81604391cda9fc2b56b0f9a8311eba3b56dbad9160b55c03a89853770"}
Dec 05 20:29:31 crc kubenswrapper[4904]: I1205 20:29:31.433112 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-bzms8"
Dec 05 20:29:31 crc kubenswrapper[4904]: I1205 20:29:31.435150 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-bzms8"
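The machine-config-daemon entries above show a complete liveness cycle: repeated probe failures against http://127.0.0.1:8798/health, the "failed liveness probe, will be restarted" message, a kill with gracePeriod=600, and, further below, ContainerDied followed by a fresh ContainerStarted. A stdlib-only sketch of such a probe loop follows; the periodSeconds and failureThreshold values are common defaults, assumed rather than read from this pod's spec.

```go
// HTTP liveness probe loop (illustrative sketch, not kubelet's prober).
package main

import (
	"fmt"
	"net/http"
	"time"
)

func main() {
	const (
		url              = "http://127.0.0.1:8798/health" // target taken from the log
		periodSeconds    = 10                             // assumed default
		failureThreshold = 3                              // assumed default
	)
	client := &http.Client{Timeout: time.Second}

	failures := 0
	for range time.Tick(periodSeconds * time.Second) {
		resp, err := client.Get(url)
		ok := err == nil && resp.StatusCode >= 200 && resp.StatusCode < 400
		if resp != nil {
			resp.Body.Close()
		}
		if ok {
			failures = 0 // any success resets the counter
			continue
		}
		failures++
		fmt.Printf("liveness probe failed (%d/%d): %v\n", failures, failureThreshold, err)
		if failures >= failureThreshold {
			fmt.Println("threshold reached: kill container (gracePeriod=600s) and restart")
			failures = 0 // a restarted container gets fresh probe state
		}
	}
}
```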
Dec 05 20:29:31 crc kubenswrapper[4904]: I1205 20:29:31.474539 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-bzms8" podStartSLOduration=11.128468565 podStartE2EDuration="39.474519679s" podCreationTimestamp="2025-12-05 20:28:52 +0000 UTC" firstStartedPulling="2025-12-05 20:28:53.793163894 +0000 UTC m=+1032.604380003" lastFinishedPulling="2025-12-05 20:29:22.139214998 +0000 UTC m=+1060.950431117" observedRunningTime="2025-12-05 20:29:31.466335632 +0000 UTC m=+1070.277551751" watchObservedRunningTime="2025-12-05 20:29:31.474519679 +0000 UTC m=+1070.285735798"
Dec 05 20:29:32 crc kubenswrapper[4904]: E1205 20:29:32.417128 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-xv5gb" podUID="2035ff23-1caf-4e9e-bc39-46caef0eb07d"
Dec 05 20:29:32 crc kubenswrapper[4904]: I1205 20:29:32.457403 4904 generic.go:334] "Generic (PLEG): container finished" podID="1cc24b64-e25f-4b55-9123-295388685e7a" containerID="3a6fee22a5899dc7491b94f9ec1bfdefcfd49d1d911077fc82ba25da5a725a66" exitCode=0
Dec 05 20:29:32 crc kubenswrapper[4904]: I1205 20:29:32.457481 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" event={"ID":"1cc24b64-e25f-4b55-9123-295388685e7a","Type":"ContainerDied","Data":"3a6fee22a5899dc7491b94f9ec1bfdefcfd49d1d911077fc82ba25da5a725a66"}
Dec 05 20:29:32 crc kubenswrapper[4904]: I1205 20:29:32.457531 4904 scope.go:117] "RemoveContainer" containerID="8714f5a82d22d92591b2585f5d959b3d2c6d45de1c2f0d4263e0b6b75477d050"
Dec 05 20:29:32 crc kubenswrapper[4904]: I1205 20:29:32.466723 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-xqdzf" event={"ID":"2f0e1e69-f795-4166-b6bd-946050c1524e","Type":"ContainerStarted","Data":"605873cb9b6b303604275cae6f160069d4f1fab126a34db8722b4e9ee8cb33f6"}
Dec 05 20:29:32 crc kubenswrapper[4904]: I1205 20:29:32.468821 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd49rzr9" event={"ID":"50c0879e-d30b-4a8a-972b-10f3188ff06a","Type":"ContainerStarted","Data":"9e8a3ca20d1c4ae120510cc5a272e70fa364b612783d64e89f73f60bc9342358"}
Dec 05 20:29:32 crc kubenswrapper[4904]: I1205 20:29:32.493393 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-gv6jv" event={"ID":"53b1878e-6de9-4961-9cf6-4673c09c0412","Type":"ContainerStarted","Data":"d75e94665ae7b872cf11e2d59db82d116797c494399ee055f954a74281a9604d"}
Dec 05 20:29:32 crc kubenswrapper[4904]: I1205 20:29:32.495596 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7f6cb9b975-pjwn4" event={"ID":"e0155f33-dd5a-4c6f-b261-2b7026149e9c","Type":"ContainerStarted","Data":"63689a2991258272e196fb1b407be1eea19ef04f07b1e92407f75750470a8d70"}
Dec 05 20:29:32 crc kubenswrapper[4904]: I1205 20:29:32.497443 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-f2bbb" event={"ID":"116b1af4-b71b-46ab-9977-12342c13594e","Type":"ContainerStarted","Data":"178d0eb019b5203ecf1d89c80827c25a144da2adc0fdf5ed7c38785492a73eb7"}
Dec 05 20:29:32 crc kubenswrapper[4904]: I1205 20:29:32.502461 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-lvn6q"
Dec 05 20:29:32 crc kubenswrapper[4904]: I1205 20:29:32.504738 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-lvn6q"
Dec 05 20:29:32 crc kubenswrapper[4904]: I1205 20:29:32.529179 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-lvn6q" podStartSLOduration=12.414323446 podStartE2EDuration="41.529158415s" podCreationTimestamp="2025-12-05 20:28:51 +0000 UTC" firstStartedPulling="2025-12-05 20:28:53.01436758 +0000 UTC m=+1031.825583689" lastFinishedPulling="2025-12-05 20:29:22.129202549 +0000 UTC m=+1060.940418658" observedRunningTime="2025-12-05 20:29:32.52067523 +0000 UTC m=+1071.331891369" watchObservedRunningTime="2025-12-05 20:29:32.529158415 +0000 UTC m=+1071.340374534"
Dec 05 20:29:32 crc kubenswrapper[4904]: I1205 20:29:32.542780 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-9zhph" podStartSLOduration=13.223534584 podStartE2EDuration="41.542760513s" podCreationTimestamp="2025-12-05 20:28:51 +0000 UTC" firstStartedPulling="2025-12-05 20:28:53.792692941 +0000 UTC m=+1032.603909050" lastFinishedPulling="2025-12-05 20:29:22.11191886 +0000 UTC m=+1060.923134979" observedRunningTime="2025-12-05 20:29:32.540794499 +0000 UTC m=+1071.352010628" watchObservedRunningTime="2025-12-05 20:29:32.542760513 +0000 UTC m=+1071.353976622"
Dec 05 20:29:32 crc kubenswrapper[4904]: I1205 20:29:32.581407 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-wq6hj"
Dec 05 20:29:32 crc kubenswrapper[4904]: I1205 20:29:32.588476 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-wq6hj"
Dec 05 20:29:32 crc kubenswrapper[4904]: I1205 20:29:32.625527 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-jrrnh" podStartSLOduration=13.274607143 podStartE2EDuration="41.625510912s" podCreationTimestamp="2025-12-05 20:28:51 +0000 UTC" firstStartedPulling="2025-12-05 20:28:53.792887326 +0000 UTC m=+1032.604103435" lastFinishedPulling="2025-12-05 20:29:22.143791095 +0000 UTC m=+1060.955007204" observedRunningTime="2025-12-05 20:29:32.623431124 +0000 UTC m=+1071.434647233" watchObservedRunningTime="2025-12-05 20:29:32.625510912 +0000 UTC m=+1071.436727021"
Dec 05 20:29:32 crc kubenswrapper[4904]: I1205 20:29:32.652159 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-wq6hj" podStartSLOduration=12.337572832 podStartE2EDuration="40.652141462s" podCreationTimestamp="2025-12-05 20:28:52 +0000 UTC" firstStartedPulling="2025-12-05 20:28:53.792500065 +0000 UTC m=+1032.603716174" lastFinishedPulling="2025-12-05 20:29:22.107068695 +0000 UTC m=+1060.918284804" observedRunningTime="2025-12-05 20:29:32.65136134 +0000 UTC m=+1071.462577469" watchObservedRunningTime="2025-12-05 20:29:32.652141462 +0000 UTC m=+1071.463357571"
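The pod_startup_latency_tracker entries carry enough data to recompute their own durations: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration is that figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling). Checking with the mariadb-operator values above, 40.652141462s minus 28.31456863s of pulling gives exactly 12.337572832s. A small worked version of that arithmetic (illustrative, not the tracker's code):

```go
// Recompute podStartE2EDuration and podStartSLOduration for the
// mariadb-operator pod from the timestamps logged above.
package main

import (
	"fmt"
	"time"
)

const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

func mustParse(s string) time.Time {
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2025-12-05 20:28:52 +0000 UTC")
	firstPull := mustParse("2025-12-05 20:28:53.792500065 +0000 UTC")
	lastPull := mustParse("2025-12-05 20:29:22.107068695 +0000 UTC")
	running := mustParse("2025-12-05 20:29:32.652141462 +0000 UTC") // watchObservedRunningTime

	e2e := running.Sub(created)          // podStartE2EDuration: 40.652141462s
	slo := e2e - lastPull.Sub(firstPull) // podStartSLOduration: 12.337572832s

	fmt.Println("podStartE2EDuration:", e2e)
	fmt.Println("podStartSLOduration:", slo)
}
```

For pods that never pulled (firstStartedPulling and lastFinishedPulling left at the zero time, as in the openstack-operator entry), the two durations coincide.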
Dec 05 20:29:33 crc kubenswrapper[4904]: I1205 20:29:33.519290 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-4zb86" event={"ID":"4453c00a-291d-4edb-ab56-9b0fdf3b1ea5","Type":"ContainerStarted","Data":"7116ccaa5aa45279f6a265a3daab3b46635527e2d07cc6353758e132f473f368"}
Dec 05 20:29:33 crc kubenswrapper[4904]: I1205 20:29:33.521872 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-lc48g" event={"ID":"7ae1ca6f-894c-4c12-ab0a-459e97fa442e","Type":"ContainerStarted","Data":"a823d15b7f72487feb025b25f2f24ce785c047c60d21a79d34afdf8181651cd1"}
Dec 05 20:29:33 crc kubenswrapper[4904]: I1205 20:29:33.523033 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-l65z8" event={"ID":"03744ad0-4115-4d2f-bf2d-5acc45a6d05a","Type":"ContainerStarted","Data":"345a92799aa6718f2020cac73c9513dac3529c509b88c4c73a7496d93a711c79"}
Dec 05 20:29:33 crc kubenswrapper[4904]: I1205 20:29:33.523998 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-xt4ph" event={"ID":"e7d93aaa-3c37-4793-8535-6dcce7bb79b0","Type":"ContainerStarted","Data":"dcbe326096443dec65f93c22ba4868c4b978a95f2f049f02cbf7bb3355a2a09d"}
Dec 05 20:29:33 crc kubenswrapper[4904]: I1205 20:29:33.530117 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-h9wsr" event={"ID":"a2b7747b-01b7-4f12-8748-2661f53078f0","Type":"ContainerStarted","Data":"40cb4e72504fdc0e945b37ef8d1f16204299feda04bd64d50269d923131cba72"}
Dec 05 20:29:33 crc kubenswrapper[4904]: I1205 20:29:33.535593 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-7f864" event={"ID":"8ee93c89-5d32-4114-b134-c084359d11ec","Type":"ContainerStarted","Data":"c6a0abadf287a0d633d1077e5f053c51e563bf61b3e41c2c5a56c4a0e7a20244"}
Dec 05 20:29:33 crc kubenswrapper[4904]: I1205 20:29:33.536996 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-z4chx" event={"ID":"799e9803-03c5-4406-8b50-e59dedc0918d","Type":"ContainerStarted","Data":"1431cee8b200ddf27172a4429717caeed88366818b8745caf01c9e0ec816579a"}
Dec 05 20:29:33 crc kubenswrapper[4904]: I1205 20:29:33.538267 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-n6d4m" event={"ID":"086dd7ac-a5cc-4433-a5a6-0cfd88a69d72","Type":"ContainerStarted","Data":"e8920fef698dd4f390750f029f0300694b3ccf2d3e167df8f49b3379755e922a"}
Dec 05 20:29:33 crc kubenswrapper[4904]: I1205 20:29:33.541571 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-kr7mw" event={"ID":"54f8b0f1-58e6-445e-a7ea-ba4eb6e298a9","Type":"ContainerStarted","Data":"5a851eac5431230c0f8b48bf624fc4edea4a952b828b9ce942cce866003b88bc"}
Dec 05 20:29:33 crc kubenswrapper[4904]: I1205 20:29:33.543227 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd49rzr9"
Dec 05 20:29:33 crc kubenswrapper[4904]: I1205 20:29:33.548533 4904 kubelet.go:2542] "SyncLoop (probe)"
probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd49rzr9" Dec 05 20:29:33 crc kubenswrapper[4904]: I1205 20:29:33.582660 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd49rzr9" podStartSLOduration=30.475345174 podStartE2EDuration="41.582642329s" podCreationTimestamp="2025-12-05 20:28:52 +0000 UTC" firstStartedPulling="2025-12-05 20:29:11.004413179 +0000 UTC m=+1049.815629288" lastFinishedPulling="2025-12-05 20:29:22.111710334 +0000 UTC m=+1060.922926443" observedRunningTime="2025-12-05 20:29:33.580076869 +0000 UTC m=+1072.391292998" watchObservedRunningTime="2025-12-05 20:29:33.582642329 +0000 UTC m=+1072.393858438" Dec 05 20:29:34 crc kubenswrapper[4904]: I1205 20:29:34.550937 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" event={"ID":"1cc24b64-e25f-4b55-9123-295388685e7a","Type":"ContainerStarted","Data":"528bac05d1388e0a638d4e21540a1e20bd15f6ec7b5238ce57fc2bc9737af1ba"} Dec 05 20:29:34 crc kubenswrapper[4904]: I1205 20:29:34.554344 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-xv5gb" event={"ID":"2035ff23-1caf-4e9e-bc39-46caef0eb07d","Type":"ContainerStarted","Data":"84e02cba7264942c435761c5a1367c2b312ecf15965ac53c2166e44ac43d557d"} Dec 05 20:29:34 crc kubenswrapper[4904]: I1205 20:29:34.555036 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-xv5gb" Dec 05 20:29:34 crc kubenswrapper[4904]: I1205 20:29:34.557002 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-xd4c5" event={"ID":"0fefb560-28a2-4316-9448-8361111d4837","Type":"ContainerStarted","Data":"8baf046d1d2b66f4223bf0a36e8767293eb1ec1e0cc56cb28ce06f3dd79b858d"} Dec 05 20:29:34 crc kubenswrapper[4904]: I1205 20:29:34.830326 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-f2bbb" podStartSLOduration=14.692918892 podStartE2EDuration="42.830309719s" podCreationTimestamp="2025-12-05 20:28:52 +0000 UTC" firstStartedPulling="2025-12-05 20:28:53.815591987 +0000 UTC m=+1032.626808096" lastFinishedPulling="2025-12-05 20:29:21.952982824 +0000 UTC m=+1060.764198923" observedRunningTime="2025-12-05 20:29:34.827887531 +0000 UTC m=+1073.639103660" watchObservedRunningTime="2025-12-05 20:29:34.830309719 +0000 UTC m=+1073.641525838" Dec 05 20:29:34 crc kubenswrapper[4904]: I1205 20:29:34.834414 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-z4chx" podStartSLOduration=29.690663618 podStartE2EDuration="43.834399652s" podCreationTimestamp="2025-12-05 20:28:51 +0000 UTC" firstStartedPulling="2025-12-05 20:28:53.339593684 +0000 UTC m=+1032.150809793" lastFinishedPulling="2025-12-05 20:29:07.483329718 +0000 UTC m=+1046.294545827" observedRunningTime="2025-12-05 20:29:34.594096966 +0000 UTC m=+1073.405313105" watchObservedRunningTime="2025-12-05 20:29:34.834399652 +0000 UTC m=+1073.645615761" Dec 05 20:29:34 crc kubenswrapper[4904]: I1205 20:29:34.846328 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-xqdzf" podStartSLOduration=14.813157051 podStartE2EDuration="42.846314193s" podCreationTimestamp="2025-12-05 20:28:52 +0000 UTC" firstStartedPulling="2025-12-05 20:28:53.804628522 +0000 UTC m=+1032.615844631" lastFinishedPulling="2025-12-05 20:29:21.837785664 +0000 UTC m=+1060.649001773" observedRunningTime="2025-12-05 20:29:34.842219219 +0000 UTC m=+1073.653435338" watchObservedRunningTime="2025-12-05 20:29:34.846314193 +0000 UTC m=+1073.657530302" Dec 05 20:29:34 crc kubenswrapper[4904]: I1205 20:29:34.868510 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-lc48g" podStartSLOduration=14.675859968 podStartE2EDuration="43.868478228s" podCreationTimestamp="2025-12-05 20:28:51 +0000 UTC" firstStartedPulling="2025-12-05 20:28:52.979930523 +0000 UTC m=+1031.791146632" lastFinishedPulling="2025-12-05 20:29:22.172548783 +0000 UTC m=+1060.983764892" observedRunningTime="2025-12-05 20:29:34.858920643 +0000 UTC m=+1073.670136762" watchObservedRunningTime="2025-12-05 20:29:34.868478228 +0000 UTC m=+1073.679694347" Dec 05 20:29:34 crc kubenswrapper[4904]: I1205 20:29:34.880177 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-kr7mw" podStartSLOduration=15.746170597999999 podStartE2EDuration="42.880156883s" podCreationTimestamp="2025-12-05 20:28:52 +0000 UTC" firstStartedPulling="2025-12-05 20:28:53.798009608 +0000 UTC m=+1032.609225717" lastFinishedPulling="2025-12-05 20:29:20.931995893 +0000 UTC m=+1059.743212002" observedRunningTime="2025-12-05 20:29:34.874464875 +0000 UTC m=+1073.685681004" watchObservedRunningTime="2025-12-05 20:29:34.880156883 +0000 UTC m=+1073.691372992" Dec 05 20:29:34 crc kubenswrapper[4904]: I1205 20:29:34.894706 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-xd4c5" podStartSLOduration=3.893747116 podStartE2EDuration="43.894684427s" podCreationTimestamp="2025-12-05 20:28:51 +0000 UTC" firstStartedPulling="2025-12-05 20:28:53.79304732 +0000 UTC m=+1032.604263429" lastFinishedPulling="2025-12-05 20:29:33.793984631 +0000 UTC m=+1072.605200740" observedRunningTime="2025-12-05 20:29:34.889819891 +0000 UTC m=+1073.701036020" watchObservedRunningTime="2025-12-05 20:29:34.894684427 +0000 UTC m=+1073.705900536" Dec 05 20:29:34 crc kubenswrapper[4904]: I1205 20:29:34.910413 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-gv6jv" podStartSLOduration=14.369908348 podStartE2EDuration="42.910392903s" podCreationTimestamp="2025-12-05 20:28:52 +0000 UTC" firstStartedPulling="2025-12-05 20:28:53.568160454 +0000 UTC m=+1032.379376563" lastFinishedPulling="2025-12-05 20:29:22.108645009 +0000 UTC m=+1060.919861118" observedRunningTime="2025-12-05 20:29:34.903412879 +0000 UTC m=+1073.714629008" watchObservedRunningTime="2025-12-05 20:29:34.910392903 +0000 UTC m=+1073.721609012" Dec 05 20:29:34 crc kubenswrapper[4904]: I1205 20:29:34.926585 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-xt4ph" podStartSLOduration=15.533668586 podStartE2EDuration="43.926567902s" podCreationTimestamp="2025-12-05 20:28:51 +0000 UTC" firstStartedPulling="2025-12-05 
20:28:53.73294532 +0000 UTC m=+1032.544161429" lastFinishedPulling="2025-12-05 20:29:22.125844636 +0000 UTC m=+1060.937060745" observedRunningTime="2025-12-05 20:29:34.924036502 +0000 UTC m=+1073.735252631" watchObservedRunningTime="2025-12-05 20:29:34.926567902 +0000 UTC m=+1073.737784021" Dec 05 20:29:34 crc kubenswrapper[4904]: I1205 20:29:34.942361 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-xv5gb" podStartSLOduration=2.669815286 podStartE2EDuration="42.942343891s" podCreationTimestamp="2025-12-05 20:28:52 +0000 UTC" firstStartedPulling="2025-12-05 20:28:53.733352932 +0000 UTC m=+1032.544569041" lastFinishedPulling="2025-12-05 20:29:34.005881517 +0000 UTC m=+1072.817097646" observedRunningTime="2025-12-05 20:29:34.939133662 +0000 UTC m=+1073.750349801" watchObservedRunningTime="2025-12-05 20:29:34.942343891 +0000 UTC m=+1073.753559990" Dec 05 20:29:34 crc kubenswrapper[4904]: I1205 20:29:34.955077 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-l65z8" podStartSLOduration=29.247868847 podStartE2EDuration="42.955043503s" podCreationTimestamp="2025-12-05 20:28:52 +0000 UTC" firstStartedPulling="2025-12-05 20:28:53.77719206 +0000 UTC m=+1032.588408169" lastFinishedPulling="2025-12-05 20:29:07.484366716 +0000 UTC m=+1046.295582825" observedRunningTime="2025-12-05 20:29:34.953606553 +0000 UTC m=+1073.764822692" watchObservedRunningTime="2025-12-05 20:29:34.955043503 +0000 UTC m=+1073.766259612" Dec 05 20:29:34 crc kubenswrapper[4904]: I1205 20:29:34.986587 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-n6d4m" podStartSLOduration=29.293276928 podStartE2EDuration="42.986559189s" podCreationTimestamp="2025-12-05 20:28:52 +0000 UTC" firstStartedPulling="2025-12-05 20:28:53.792192127 +0000 UTC m=+1032.603408226" lastFinishedPulling="2025-12-05 20:29:07.485474378 +0000 UTC m=+1046.296690487" observedRunningTime="2025-12-05 20:29:34.983817523 +0000 UTC m=+1073.795033652" watchObservedRunningTime="2025-12-05 20:29:34.986559189 +0000 UTC m=+1073.797775298" Dec 05 20:29:35 crc kubenswrapper[4904]: I1205 20:29:35.006984 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-7f6cb9b975-pjwn4" podStartSLOduration=14.763220844 podStartE2EDuration="43.006959166s" podCreationTimestamp="2025-12-05 20:28:52 +0000 UTC" firstStartedPulling="2025-12-05 20:28:53.806244097 +0000 UTC m=+1032.617460206" lastFinishedPulling="2025-12-05 20:29:22.049982419 +0000 UTC m=+1060.861198528" observedRunningTime="2025-12-05 20:29:34.999631762 +0000 UTC m=+1073.810847891" watchObservedRunningTime="2025-12-05 20:29:35.006959166 +0000 UTC m=+1073.818175275" Dec 05 20:29:35 crc kubenswrapper[4904]: I1205 20:29:35.018128 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-7f864" podStartSLOduration=29.869330990999998 podStartE2EDuration="44.018110135s" podCreationTimestamp="2025-12-05 20:28:51 +0000 UTC" firstStartedPulling="2025-12-05 20:28:53.336899619 +0000 UTC m=+1032.148115738" lastFinishedPulling="2025-12-05 20:29:07.485678783 +0000 UTC m=+1046.296894882" observedRunningTime="2025-12-05 20:29:35.014403213 +0000 UTC m=+1073.825619322" 
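
These entries tabulate easily. A sketch (regex keyed to the field layout visible above; adjust if your kubelet version formats the fields differently) that extracts per-pod startup durations and sorts them slowest-first:

```python
import re

# pull (pod, SLO seconds, E2E seconds) out of pod_startup_latency_tracker entries
PAT = re.compile(
    r'pod="(?P<pod>[^"]+)"\s+podStartSLOduration=(?P<slo>[\d.]+)'
    r'\s+podStartE2EDuration="(?P<e2e>[\d.]+)s"'
)

def startup_report(lines):
    rows = []
    for line in lines:
        if "Observed pod startup duration" not in line:
            continue
        m = PAT.search(line)
        if m:
            rows.append((m["pod"], float(m["slo"]), float(m["e2e"])))
    return sorted(rows, key=lambda r: -r[2])  # slowest end-to-end first

sample = ('... "Observed pod startup duration" pod="openstack-operators/'
          'cinder-operator-controller-manager-859b6ccc6-xd4c5" '
          'podStartSLOduration=3.893747116 podStartE2EDuration="43.894684427s" ...')
print(startup_report([sample]))
```
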
watchObservedRunningTime="2025-12-05 20:29:35.018110135 +0000 UTC m=+1073.829326244" Dec 05 20:29:35 crc kubenswrapper[4904]: I1205 20:29:35.038231 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-4zb86" podStartSLOduration=32.925357383 podStartE2EDuration="44.038208393s" podCreationTimestamp="2025-12-05 20:28:51 +0000 UTC" firstStartedPulling="2025-12-05 20:29:11.01348405 +0000 UTC m=+1049.824700159" lastFinishedPulling="2025-12-05 20:29:22.12633507 +0000 UTC m=+1060.937551169" observedRunningTime="2025-12-05 20:29:35.032920176 +0000 UTC m=+1073.844136315" watchObservedRunningTime="2025-12-05 20:29:35.038208393 +0000 UTC m=+1073.849424502" Dec 05 20:29:35 crc kubenswrapper[4904]: I1205 20:29:35.057337 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-h9wsr" podStartSLOduration=15.783316143 podStartE2EDuration="44.057320385s" podCreationTimestamp="2025-12-05 20:28:51 +0000 UTC" firstStartedPulling="2025-12-05 20:28:53.793166894 +0000 UTC m=+1032.604383003" lastFinishedPulling="2025-12-05 20:29:22.067171136 +0000 UTC m=+1060.878387245" observedRunningTime="2025-12-05 20:29:35.051625016 +0000 UTC m=+1073.862841135" watchObservedRunningTime="2025-12-05 20:29:35.057320385 +0000 UTC m=+1073.868536494" Dec 05 20:29:38 crc kubenswrapper[4904]: I1205 20:29:38.257925 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-4zb86" Dec 05 20:29:38 crc kubenswrapper[4904]: I1205 20:29:38.266760 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-4zb86" Dec 05 20:29:42 crc kubenswrapper[4904]: I1205 20:29:42.242884 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-lc48g" Dec 05 20:29:42 crc kubenswrapper[4904]: I1205 20:29:42.245573 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-lc48g" Dec 05 20:29:42 crc kubenswrapper[4904]: I1205 20:29:42.349273 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-9zhph" Dec 05 20:29:42 crc kubenswrapper[4904]: I1205 20:29:42.351940 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-9zhph" Dec 05 20:29:42 crc kubenswrapper[4904]: I1205 20:29:42.444362 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-jrrnh" Dec 05 20:29:42 crc kubenswrapper[4904]: I1205 20:29:42.445133 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-h9wsr" Dec 05 20:29:42 crc kubenswrapper[4904]: I1205 20:29:42.447405 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-xd4c5" Dec 05 20:29:42 crc kubenswrapper[4904]: I1205 20:29:42.449318 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-jrrnh" Dec 05 20:29:42 crc 
kubenswrapper[4904]: I1205 20:29:42.449636 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-h9wsr" Dec 05 20:29:42 crc kubenswrapper[4904]: I1205 20:29:42.456728 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-xd4c5" Dec 05 20:29:42 crc kubenswrapper[4904]: I1205 20:29:42.522629 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-xt4ph" Dec 05 20:29:42 crc kubenswrapper[4904]: I1205 20:29:42.529693 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-xt4ph" Dec 05 20:29:42 crc kubenswrapper[4904]: I1205 20:29:42.677038 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-xv5gb" Dec 05 20:29:42 crc kubenswrapper[4904]: I1205 20:29:42.736343 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-f2bbb" Dec 05 20:29:42 crc kubenswrapper[4904]: I1205 20:29:42.739075 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-f2bbb" Dec 05 20:29:42 crc kubenswrapper[4904]: I1205 20:29:42.782118 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-gv6jv" Dec 05 20:29:42 crc kubenswrapper[4904]: I1205 20:29:42.787275 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-gv6jv" Dec 05 20:29:42 crc kubenswrapper[4904]: I1205 20:29:42.825884 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-xqdzf" Dec 05 20:29:42 crc kubenswrapper[4904]: I1205 20:29:42.827668 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-xqdzf" Dec 05 20:29:43 crc kubenswrapper[4904]: I1205 20:29:43.034675 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-kr7mw" Dec 05 20:29:43 crc kubenswrapper[4904]: I1205 20:29:43.036598 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-kr7mw" Dec 05 20:29:43 crc kubenswrapper[4904]: I1205 20:29:43.163099 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-7f6cb9b975-pjwn4" Dec 05 20:29:43 crc kubenswrapper[4904]: I1205 20:29:43.164914 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-7f6cb9b975-pjwn4" Dec 05 20:30:00 crc kubenswrapper[4904]: I1205 20:30:00.153147 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416110-8hdb5"] Dec 05 20:30:00 crc kubenswrapper[4904]: I1205 20:30:00.155433 4904 util.go:30] "No sandbox for pod can be found. 
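
Note: the Job name suffix 29416110 encodes the CronJob's scheduled run time in minutes since the Unix epoch, which is why the collect-profiles pod appears at exactly 20:30:00:

```python
from datetime import datetime, timezone

suffix = 29416110  # from collect-profiles-29416110-8hdb5
print(datetime.fromtimestamp(suffix * 60, tz=timezone.utc))
# 2025-12-05 20:30:00+00:00, matching the SyncLoop ADD timestamp above
```
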
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-8hdb5" Dec 05 20:30:00 crc kubenswrapper[4904]: I1205 20:30:00.157722 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 20:30:00 crc kubenswrapper[4904]: I1205 20:30:00.157849 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 20:30:00 crc kubenswrapper[4904]: I1205 20:30:00.167449 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416110-8hdb5"] Dec 05 20:30:00 crc kubenswrapper[4904]: I1205 20:30:00.312176 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l96rq\" (UniqueName: \"kubernetes.io/projected/f1593c4e-1330-4e15-a120-f132191a52ab-kube-api-access-l96rq\") pod \"collect-profiles-29416110-8hdb5\" (UID: \"f1593c4e-1330-4e15-a120-f132191a52ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-8hdb5" Dec 05 20:30:00 crc kubenswrapper[4904]: I1205 20:30:00.312262 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f1593c4e-1330-4e15-a120-f132191a52ab-config-volume\") pod \"collect-profiles-29416110-8hdb5\" (UID: \"f1593c4e-1330-4e15-a120-f132191a52ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-8hdb5" Dec 05 20:30:00 crc kubenswrapper[4904]: I1205 20:30:00.312304 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f1593c4e-1330-4e15-a120-f132191a52ab-secret-volume\") pod \"collect-profiles-29416110-8hdb5\" (UID: \"f1593c4e-1330-4e15-a120-f132191a52ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-8hdb5" Dec 05 20:30:00 crc kubenswrapper[4904]: I1205 20:30:00.413531 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l96rq\" (UniqueName: \"kubernetes.io/projected/f1593c4e-1330-4e15-a120-f132191a52ab-kube-api-access-l96rq\") pod \"collect-profiles-29416110-8hdb5\" (UID: \"f1593c4e-1330-4e15-a120-f132191a52ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-8hdb5" Dec 05 20:30:00 crc kubenswrapper[4904]: I1205 20:30:00.413628 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f1593c4e-1330-4e15-a120-f132191a52ab-config-volume\") pod \"collect-profiles-29416110-8hdb5\" (UID: \"f1593c4e-1330-4e15-a120-f132191a52ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-8hdb5" Dec 05 20:30:00 crc kubenswrapper[4904]: I1205 20:30:00.413687 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f1593c4e-1330-4e15-a120-f132191a52ab-secret-volume\") pod \"collect-profiles-29416110-8hdb5\" (UID: \"f1593c4e-1330-4e15-a120-f132191a52ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-8hdb5" Dec 05 20:30:00 crc kubenswrapper[4904]: I1205 20:30:00.414937 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f1593c4e-1330-4e15-a120-f132191a52ab-config-volume\") pod 
\"collect-profiles-29416110-8hdb5\" (UID: \"f1593c4e-1330-4e15-a120-f132191a52ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-8hdb5" Dec 05 20:30:00 crc kubenswrapper[4904]: I1205 20:30:00.419309 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f1593c4e-1330-4e15-a120-f132191a52ab-secret-volume\") pod \"collect-profiles-29416110-8hdb5\" (UID: \"f1593c4e-1330-4e15-a120-f132191a52ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-8hdb5" Dec 05 20:30:00 crc kubenswrapper[4904]: I1205 20:30:00.445876 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l96rq\" (UniqueName: \"kubernetes.io/projected/f1593c4e-1330-4e15-a120-f132191a52ab-kube-api-access-l96rq\") pod \"collect-profiles-29416110-8hdb5\" (UID: \"f1593c4e-1330-4e15-a120-f132191a52ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-8hdb5" Dec 05 20:30:00 crc kubenswrapper[4904]: I1205 20:30:00.491095 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-8hdb5" Dec 05 20:30:01 crc kubenswrapper[4904]: I1205 20:30:01.218178 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416110-8hdb5"] Dec 05 20:30:01 crc kubenswrapper[4904]: I1205 20:30:01.831343 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-8hdb5" event={"ID":"f1593c4e-1330-4e15-a120-f132191a52ab","Type":"ContainerStarted","Data":"8150edccd0c7bb34cc52733696495fd735b6bb3cac23ebe074ebf870d0feebab"} Dec 05 20:30:02 crc kubenswrapper[4904]: I1205 20:30:02.841869 4904 generic.go:334] "Generic (PLEG): container finished" podID="f1593c4e-1330-4e15-a120-f132191a52ab" containerID="2757697cb82e3ebb38bbfaca5779caec129016b4d520b956c2a36179d5e0f826" exitCode=0 Dec 05 20:30:02 crc kubenswrapper[4904]: I1205 20:30:02.842034 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-8hdb5" event={"ID":"f1593c4e-1330-4e15-a120-f132191a52ab","Type":"ContainerDied","Data":"2757697cb82e3ebb38bbfaca5779caec129016b4d520b956c2a36179d5e0f826"} Dec 05 20:30:04 crc kubenswrapper[4904]: I1205 20:30:04.154940 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-8hdb5" Dec 05 20:30:04 crc kubenswrapper[4904]: I1205 20:30:04.269562 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f1593c4e-1330-4e15-a120-f132191a52ab-config-volume\") pod \"f1593c4e-1330-4e15-a120-f132191a52ab\" (UID: \"f1593c4e-1330-4e15-a120-f132191a52ab\") " Dec 05 20:30:04 crc kubenswrapper[4904]: I1205 20:30:04.269778 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l96rq\" (UniqueName: \"kubernetes.io/projected/f1593c4e-1330-4e15-a120-f132191a52ab-kube-api-access-l96rq\") pod \"f1593c4e-1330-4e15-a120-f132191a52ab\" (UID: \"f1593c4e-1330-4e15-a120-f132191a52ab\") " Dec 05 20:30:04 crc kubenswrapper[4904]: I1205 20:30:04.269862 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f1593c4e-1330-4e15-a120-f132191a52ab-secret-volume\") pod \"f1593c4e-1330-4e15-a120-f132191a52ab\" (UID: \"f1593c4e-1330-4e15-a120-f132191a52ab\") " Dec 05 20:30:04 crc kubenswrapper[4904]: I1205 20:30:04.270565 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1593c4e-1330-4e15-a120-f132191a52ab-config-volume" (OuterVolumeSpecName: "config-volume") pod "f1593c4e-1330-4e15-a120-f132191a52ab" (UID: "f1593c4e-1330-4e15-a120-f132191a52ab"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:30:04 crc kubenswrapper[4904]: I1205 20:30:04.276831 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1593c4e-1330-4e15-a120-f132191a52ab-kube-api-access-l96rq" (OuterVolumeSpecName: "kube-api-access-l96rq") pod "f1593c4e-1330-4e15-a120-f132191a52ab" (UID: "f1593c4e-1330-4e15-a120-f132191a52ab"). InnerVolumeSpecName "kube-api-access-l96rq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:30:04 crc kubenswrapper[4904]: I1205 20:30:04.277528 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1593c4e-1330-4e15-a120-f132191a52ab-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f1593c4e-1330-4e15-a120-f132191a52ab" (UID: "f1593c4e-1330-4e15-a120-f132191a52ab"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:30:04 crc kubenswrapper[4904]: I1205 20:30:04.372092 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l96rq\" (UniqueName: \"kubernetes.io/projected/f1593c4e-1330-4e15-a120-f132191a52ab-kube-api-access-l96rq\") on node \"crc\" DevicePath \"\"" Dec 05 20:30:04 crc kubenswrapper[4904]: I1205 20:30:04.372133 4904 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f1593c4e-1330-4e15-a120-f132191a52ab-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 20:30:04 crc kubenswrapper[4904]: I1205 20:30:04.372145 4904 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f1593c4e-1330-4e15-a120-f132191a52ab-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 20:30:04 crc kubenswrapper[4904]: I1205 20:30:04.858531 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-8hdb5" event={"ID":"f1593c4e-1330-4e15-a120-f132191a52ab","Type":"ContainerDied","Data":"8150edccd0c7bb34cc52733696495fd735b6bb3cac23ebe074ebf870d0feebab"} Dec 05 20:30:04 crc kubenswrapper[4904]: I1205 20:30:04.858593 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8150edccd0c7bb34cc52733696495fd735b6bb3cac23ebe074ebf870d0feebab" Dec 05 20:30:04 crc kubenswrapper[4904]: I1205 20:30:04.858676 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-8hdb5" Dec 05 20:30:05 crc kubenswrapper[4904]: I1205 20:30:05.206288 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75f87779c-99qqw"] Dec 05 20:30:05 crc kubenswrapper[4904]: E1205 20:30:05.206664 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1593c4e-1330-4e15-a120-f132191a52ab" containerName="collect-profiles" Dec 05 20:30:05 crc kubenswrapper[4904]: I1205 20:30:05.206679 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1593c4e-1330-4e15-a120-f132191a52ab" containerName="collect-profiles" Dec 05 20:30:05 crc kubenswrapper[4904]: I1205 20:30:05.206879 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1593c4e-1330-4e15-a120-f132191a52ab" containerName="collect-profiles" Dec 05 20:30:05 crc kubenswrapper[4904]: I1205 20:30:05.207852 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75f87779c-99qqw" Dec 05 20:30:05 crc kubenswrapper[4904]: I1205 20:30:05.209863 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 05 20:30:05 crc kubenswrapper[4904]: I1205 20:30:05.210235 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-kmt5v" Dec 05 20:30:05 crc kubenswrapper[4904]: I1205 20:30:05.210283 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 05 20:30:05 crc kubenswrapper[4904]: I1205 20:30:05.210392 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 05 20:30:05 crc kubenswrapper[4904]: I1205 20:30:05.225974 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75f87779c-99qqw"] Dec 05 20:30:05 crc kubenswrapper[4904]: I1205 20:30:05.286079 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c4ba532-e24b-42d1-8084-ff87bf54e167-config\") pod \"dnsmasq-dns-75f87779c-99qqw\" (UID: \"8c4ba532-e24b-42d1-8084-ff87bf54e167\") " pod="openstack/dnsmasq-dns-75f87779c-99qqw" Dec 05 20:30:05 crc kubenswrapper[4904]: I1205 20:30:05.286237 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qngrb\" (UniqueName: \"kubernetes.io/projected/8c4ba532-e24b-42d1-8084-ff87bf54e167-kube-api-access-qngrb\") pod \"dnsmasq-dns-75f87779c-99qqw\" (UID: \"8c4ba532-e24b-42d1-8084-ff87bf54e167\") " pod="openstack/dnsmasq-dns-75f87779c-99qqw" Dec 05 20:30:05 crc kubenswrapper[4904]: I1205 20:30:05.298895 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-586ffd88f7-d6cls"] Dec 05 20:30:05 crc kubenswrapper[4904]: I1205 20:30:05.300010 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-586ffd88f7-d6cls" Dec 05 20:30:05 crc kubenswrapper[4904]: I1205 20:30:05.302133 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 05 20:30:05 crc kubenswrapper[4904]: I1205 20:30:05.317116 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-586ffd88f7-d6cls"] Dec 05 20:30:05 crc kubenswrapper[4904]: I1205 20:30:05.387571 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qngrb\" (UniqueName: \"kubernetes.io/projected/8c4ba532-e24b-42d1-8084-ff87bf54e167-kube-api-access-qngrb\") pod \"dnsmasq-dns-75f87779c-99qqw\" (UID: \"8c4ba532-e24b-42d1-8084-ff87bf54e167\") " pod="openstack/dnsmasq-dns-75f87779c-99qqw" Dec 05 20:30:05 crc kubenswrapper[4904]: I1205 20:30:05.387636 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11cdf31c-28af-478e-8230-dc069aa40be5-dns-svc\") pod \"dnsmasq-dns-586ffd88f7-d6cls\" (UID: \"11cdf31c-28af-478e-8230-dc069aa40be5\") " pod="openstack/dnsmasq-dns-586ffd88f7-d6cls" Dec 05 20:30:05 crc kubenswrapper[4904]: I1205 20:30:05.387667 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c4ba532-e24b-42d1-8084-ff87bf54e167-config\") pod \"dnsmasq-dns-75f87779c-99qqw\" (UID: \"8c4ba532-e24b-42d1-8084-ff87bf54e167\") " pod="openstack/dnsmasq-dns-75f87779c-99qqw" Dec 05 20:30:05 crc kubenswrapper[4904]: I1205 20:30:05.387691 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7mm2\" (UniqueName: \"kubernetes.io/projected/11cdf31c-28af-478e-8230-dc069aa40be5-kube-api-access-n7mm2\") pod \"dnsmasq-dns-586ffd88f7-d6cls\" (UID: \"11cdf31c-28af-478e-8230-dc069aa40be5\") " pod="openstack/dnsmasq-dns-586ffd88f7-d6cls" Dec 05 20:30:05 crc kubenswrapper[4904]: I1205 20:30:05.387717 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11cdf31c-28af-478e-8230-dc069aa40be5-config\") pod \"dnsmasq-dns-586ffd88f7-d6cls\" (UID: \"11cdf31c-28af-478e-8230-dc069aa40be5\") " pod="openstack/dnsmasq-dns-586ffd88f7-d6cls" Dec 05 20:30:05 crc kubenswrapper[4904]: I1205 20:30:05.388847 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c4ba532-e24b-42d1-8084-ff87bf54e167-config\") pod \"dnsmasq-dns-75f87779c-99qqw\" (UID: \"8c4ba532-e24b-42d1-8084-ff87bf54e167\") " pod="openstack/dnsmasq-dns-75f87779c-99qqw" Dec 05 20:30:05 crc kubenswrapper[4904]: I1205 20:30:05.422943 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qngrb\" (UniqueName: \"kubernetes.io/projected/8c4ba532-e24b-42d1-8084-ff87bf54e167-kube-api-access-qngrb\") pod \"dnsmasq-dns-75f87779c-99qqw\" (UID: \"8c4ba532-e24b-42d1-8084-ff87bf54e167\") " pod="openstack/dnsmasq-dns-75f87779c-99qqw" Dec 05 20:30:05 crc kubenswrapper[4904]: I1205 20:30:05.489291 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11cdf31c-28af-478e-8230-dc069aa40be5-dns-svc\") pod \"dnsmasq-dns-586ffd88f7-d6cls\" (UID: \"11cdf31c-28af-478e-8230-dc069aa40be5\") " pod="openstack/dnsmasq-dns-586ffd88f7-d6cls" Dec 05 20:30:05 crc kubenswrapper[4904]: I1205 
20:30:05.489355 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7mm2\" (UniqueName: \"kubernetes.io/projected/11cdf31c-28af-478e-8230-dc069aa40be5-kube-api-access-n7mm2\") pod \"dnsmasq-dns-586ffd88f7-d6cls\" (UID: \"11cdf31c-28af-478e-8230-dc069aa40be5\") " pod="openstack/dnsmasq-dns-586ffd88f7-d6cls" Dec 05 20:30:05 crc kubenswrapper[4904]: I1205 20:30:05.489387 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11cdf31c-28af-478e-8230-dc069aa40be5-config\") pod \"dnsmasq-dns-586ffd88f7-d6cls\" (UID: \"11cdf31c-28af-478e-8230-dc069aa40be5\") " pod="openstack/dnsmasq-dns-586ffd88f7-d6cls" Dec 05 20:30:05 crc kubenswrapper[4904]: I1205 20:30:05.490467 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11cdf31c-28af-478e-8230-dc069aa40be5-config\") pod \"dnsmasq-dns-586ffd88f7-d6cls\" (UID: \"11cdf31c-28af-478e-8230-dc069aa40be5\") " pod="openstack/dnsmasq-dns-586ffd88f7-d6cls" Dec 05 20:30:05 crc kubenswrapper[4904]: I1205 20:30:05.490995 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11cdf31c-28af-478e-8230-dc069aa40be5-dns-svc\") pod \"dnsmasq-dns-586ffd88f7-d6cls\" (UID: \"11cdf31c-28af-478e-8230-dc069aa40be5\") " pod="openstack/dnsmasq-dns-586ffd88f7-d6cls" Dec 05 20:30:05 crc kubenswrapper[4904]: I1205 20:30:05.515908 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7mm2\" (UniqueName: \"kubernetes.io/projected/11cdf31c-28af-478e-8230-dc069aa40be5-kube-api-access-n7mm2\") pod \"dnsmasq-dns-586ffd88f7-d6cls\" (UID: \"11cdf31c-28af-478e-8230-dc069aa40be5\") " pod="openstack/dnsmasq-dns-586ffd88f7-d6cls" Dec 05 20:30:05 crc kubenswrapper[4904]: I1205 20:30:05.528798 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75f87779c-99qqw" Dec 05 20:30:05 crc kubenswrapper[4904]: I1205 20:30:05.621936 4904 util.go:30] "No sandbox for pod can be found. 
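
Every volume above goes through the same reconciler sequence: VerifyControllerAttachedVolume, then MountVolume started, then MountVolume.SetUp succeeded. A sketch that folds a log stream into a per-volume phase timeline (patterns match the quoting style seen in these entries):

```python
import re

# reconciler phases each volume passes through, in order of appearance
PHASES = {
    "VerifyControllerAttachedVolume started": "attach-verified",
    "operationExecutor.MountVolume started": "mount-started",
    "MountVolume.SetUp succeeded": "setup-succeeded",
}
VOL = re.compile(r'for volume \\?"(?P<v>[\w.-]+)')

def volume_timeline(lines):
    """Map volume name -> ordered list of reconciler phases observed for it."""
    seen = {}
    for line in lines:
        for marker, phase in PHASES.items():
            if marker in line:
                m = VOL.search(line)
                if m:
                    seen.setdefault(m["v"], []).append(phase)
    return seen

# e.g. volume_timeline(open("kubelet.log")) should yield entries like
# {"config": ["attach-verified", "mount-started", "setup-succeeded"], ...}
```
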
Need to start a new one" pod="openstack/dnsmasq-dns-586ffd88f7-d6cls" Dec 05 20:30:06 crc kubenswrapper[4904]: W1205 20:30:06.140945 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c4ba532_e24b_42d1_8084_ff87bf54e167.slice/crio-94609c9d61e0309bd13df4fbc1d436827b4529403857337be5bce29a9f711555 WatchSource:0}: Error finding container 94609c9d61e0309bd13df4fbc1d436827b4529403857337be5bce29a9f711555: Status 404 returned error can't find the container with id 94609c9d61e0309bd13df4fbc1d436827b4529403857337be5bce29a9f711555 Dec 05 20:30:06 crc kubenswrapper[4904]: I1205 20:30:06.142031 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75f87779c-99qqw"] Dec 05 20:30:06 crc kubenswrapper[4904]: I1205 20:30:06.211351 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-586ffd88f7-d6cls"] Dec 05 20:30:06 crc kubenswrapper[4904]: I1205 20:30:06.879917 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75f87779c-99qqw" event={"ID":"8c4ba532-e24b-42d1-8084-ff87bf54e167","Type":"ContainerStarted","Data":"94609c9d61e0309bd13df4fbc1d436827b4529403857337be5bce29a9f711555"} Dec 05 20:30:06 crc kubenswrapper[4904]: I1205 20:30:06.882673 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586ffd88f7-d6cls" event={"ID":"11cdf31c-28af-478e-8230-dc069aa40be5","Type":"ContainerStarted","Data":"69fb41eaaf1f3fa5d524d9b937a4e01cc65186f447d1f9602050b1ec8eddf111"} Dec 05 20:30:09 crc kubenswrapper[4904]: I1205 20:30:09.122857 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-586ffd88f7-d6cls"] Dec 05 20:30:09 crc kubenswrapper[4904]: I1205 20:30:09.140469 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bccbb886f-2pd5r"] Dec 05 20:30:09 crc kubenswrapper[4904]: I1205 20:30:09.141716 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bccbb886f-2pd5r" Dec 05 20:30:09 crc kubenswrapper[4904]: I1205 20:30:09.144226 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbz5d\" (UniqueName: \"kubernetes.io/projected/d51f02d6-e4c4-4357-8d5d-6478f452a4e1-kube-api-access-cbz5d\") pod \"dnsmasq-dns-6bccbb886f-2pd5r\" (UID: \"d51f02d6-e4c4-4357-8d5d-6478f452a4e1\") " pod="openstack/dnsmasq-dns-6bccbb886f-2pd5r" Dec 05 20:30:09 crc kubenswrapper[4904]: I1205 20:30:09.144279 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d51f02d6-e4c4-4357-8d5d-6478f452a4e1-dns-svc\") pod \"dnsmasq-dns-6bccbb886f-2pd5r\" (UID: \"d51f02d6-e4c4-4357-8d5d-6478f452a4e1\") " pod="openstack/dnsmasq-dns-6bccbb886f-2pd5r" Dec 05 20:30:09 crc kubenswrapper[4904]: I1205 20:30:09.144357 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d51f02d6-e4c4-4357-8d5d-6478f452a4e1-config\") pod \"dnsmasq-dns-6bccbb886f-2pd5r\" (UID: \"d51f02d6-e4c4-4357-8d5d-6478f452a4e1\") " pod="openstack/dnsmasq-dns-6bccbb886f-2pd5r" Dec 05 20:30:09 crc kubenswrapper[4904]: I1205 20:30:09.157461 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bccbb886f-2pd5r"] Dec 05 20:30:09 crc kubenswrapper[4904]: I1205 20:30:09.245358 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbz5d\" (UniqueName: \"kubernetes.io/projected/d51f02d6-e4c4-4357-8d5d-6478f452a4e1-kube-api-access-cbz5d\") pod \"dnsmasq-dns-6bccbb886f-2pd5r\" (UID: \"d51f02d6-e4c4-4357-8d5d-6478f452a4e1\") " pod="openstack/dnsmasq-dns-6bccbb886f-2pd5r" Dec 05 20:30:09 crc kubenswrapper[4904]: I1205 20:30:09.245400 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d51f02d6-e4c4-4357-8d5d-6478f452a4e1-dns-svc\") pod \"dnsmasq-dns-6bccbb886f-2pd5r\" (UID: \"d51f02d6-e4c4-4357-8d5d-6478f452a4e1\") " pod="openstack/dnsmasq-dns-6bccbb886f-2pd5r" Dec 05 20:30:09 crc kubenswrapper[4904]: I1205 20:30:09.245458 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d51f02d6-e4c4-4357-8d5d-6478f452a4e1-config\") pod \"dnsmasq-dns-6bccbb886f-2pd5r\" (UID: \"d51f02d6-e4c4-4357-8d5d-6478f452a4e1\") " pod="openstack/dnsmasq-dns-6bccbb886f-2pd5r" Dec 05 20:30:09 crc kubenswrapper[4904]: I1205 20:30:09.246534 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d51f02d6-e4c4-4357-8d5d-6478f452a4e1-config\") pod \"dnsmasq-dns-6bccbb886f-2pd5r\" (UID: \"d51f02d6-e4c4-4357-8d5d-6478f452a4e1\") " pod="openstack/dnsmasq-dns-6bccbb886f-2pd5r" Dec 05 20:30:09 crc kubenswrapper[4904]: I1205 20:30:09.246562 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d51f02d6-e4c4-4357-8d5d-6478f452a4e1-dns-svc\") pod \"dnsmasq-dns-6bccbb886f-2pd5r\" (UID: \"d51f02d6-e4c4-4357-8d5d-6478f452a4e1\") " pod="openstack/dnsmasq-dns-6bccbb886f-2pd5r" Dec 05 20:30:09 crc kubenswrapper[4904]: I1205 20:30:09.271180 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbz5d\" (UniqueName: 
\"kubernetes.io/projected/d51f02d6-e4c4-4357-8d5d-6478f452a4e1-kube-api-access-cbz5d\") pod \"dnsmasq-dns-6bccbb886f-2pd5r\" (UID: \"d51f02d6-e4c4-4357-8d5d-6478f452a4e1\") " pod="openstack/dnsmasq-dns-6bccbb886f-2pd5r" Dec 05 20:30:09 crc kubenswrapper[4904]: I1205 20:30:09.459612 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75f87779c-99qqw"] Dec 05 20:30:09 crc kubenswrapper[4904]: I1205 20:30:09.464278 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bccbb886f-2pd5r" Dec 05 20:30:09 crc kubenswrapper[4904]: I1205 20:30:09.484842 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-559648544f-dznst"] Dec 05 20:30:09 crc kubenswrapper[4904]: I1205 20:30:09.486170 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-559648544f-dznst" Dec 05 20:30:09 crc kubenswrapper[4904]: I1205 20:30:09.504807 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-559648544f-dznst"] Dec 05 20:30:09 crc kubenswrapper[4904]: I1205 20:30:09.549857 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c395630-7b9e-4495-8357-f8cf91879288-config\") pod \"dnsmasq-dns-559648544f-dznst\" (UID: \"9c395630-7b9e-4495-8357-f8cf91879288\") " pod="openstack/dnsmasq-dns-559648544f-dznst" Dec 05 20:30:09 crc kubenswrapper[4904]: I1205 20:30:09.549920 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7cqb\" (UniqueName: \"kubernetes.io/projected/9c395630-7b9e-4495-8357-f8cf91879288-kube-api-access-m7cqb\") pod \"dnsmasq-dns-559648544f-dznst\" (UID: \"9c395630-7b9e-4495-8357-f8cf91879288\") " pod="openstack/dnsmasq-dns-559648544f-dznst" Dec 05 20:30:09 crc kubenswrapper[4904]: I1205 20:30:09.549967 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c395630-7b9e-4495-8357-f8cf91879288-dns-svc\") pod \"dnsmasq-dns-559648544f-dznst\" (UID: \"9c395630-7b9e-4495-8357-f8cf91879288\") " pod="openstack/dnsmasq-dns-559648544f-dznst" Dec 05 20:30:09 crc kubenswrapper[4904]: I1205 20:30:09.650308 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c395630-7b9e-4495-8357-f8cf91879288-dns-svc\") pod \"dnsmasq-dns-559648544f-dznst\" (UID: \"9c395630-7b9e-4495-8357-f8cf91879288\") " pod="openstack/dnsmasq-dns-559648544f-dznst" Dec 05 20:30:09 crc kubenswrapper[4904]: I1205 20:30:09.650589 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c395630-7b9e-4495-8357-f8cf91879288-config\") pod \"dnsmasq-dns-559648544f-dznst\" (UID: \"9c395630-7b9e-4495-8357-f8cf91879288\") " pod="openstack/dnsmasq-dns-559648544f-dznst" Dec 05 20:30:09 crc kubenswrapper[4904]: I1205 20:30:09.650636 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7cqb\" (UniqueName: \"kubernetes.io/projected/9c395630-7b9e-4495-8357-f8cf91879288-kube-api-access-m7cqb\") pod \"dnsmasq-dns-559648544f-dznst\" (UID: \"9c395630-7b9e-4495-8357-f8cf91879288\") " pod="openstack/dnsmasq-dns-559648544f-dznst" Dec 05 20:30:09 crc kubenswrapper[4904]: I1205 20:30:09.651450 4904 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c395630-7b9e-4495-8357-f8cf91879288-dns-svc\") pod \"dnsmasq-dns-559648544f-dznst\" (UID: \"9c395630-7b9e-4495-8357-f8cf91879288\") " pod="openstack/dnsmasq-dns-559648544f-dznst" Dec 05 20:30:09 crc kubenswrapper[4904]: I1205 20:30:09.652786 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c395630-7b9e-4495-8357-f8cf91879288-config\") pod \"dnsmasq-dns-559648544f-dznst\" (UID: \"9c395630-7b9e-4495-8357-f8cf91879288\") " pod="openstack/dnsmasq-dns-559648544f-dznst" Dec 05 20:30:09 crc kubenswrapper[4904]: I1205 20:30:09.673245 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7cqb\" (UniqueName: \"kubernetes.io/projected/9c395630-7b9e-4495-8357-f8cf91879288-kube-api-access-m7cqb\") pod \"dnsmasq-dns-559648544f-dznst\" (UID: \"9c395630-7b9e-4495-8357-f8cf91879288\") " pod="openstack/dnsmasq-dns-559648544f-dznst" Dec 05 20:30:09 crc kubenswrapper[4904]: I1205 20:30:09.854078 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-559648544f-dznst" Dec 05 20:30:09 crc kubenswrapper[4904]: I1205 20:30:09.886990 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-559648544f-dznst"] Dec 05 20:30:09 crc kubenswrapper[4904]: I1205 20:30:09.915286 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d9656c78f-lhs4v"] Dec 05 20:30:09 crc kubenswrapper[4904]: I1205 20:30:09.918317 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d9656c78f-lhs4v" Dec 05 20:30:09 crc kubenswrapper[4904]: I1205 20:30:09.934793 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d9656c78f-lhs4v"] Dec 05 20:30:09 crc kubenswrapper[4904]: I1205 20:30:09.998139 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9t85\" (UniqueName: \"kubernetes.io/projected/a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca-kube-api-access-n9t85\") pod \"dnsmasq-dns-6d9656c78f-lhs4v\" (UID: \"a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca\") " pod="openstack/dnsmasq-dns-6d9656c78f-lhs4v" Dec 05 20:30:09 crc kubenswrapper[4904]: I1205 20:30:09.998244 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca-dns-svc\") pod \"dnsmasq-dns-6d9656c78f-lhs4v\" (UID: \"a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca\") " pod="openstack/dnsmasq-dns-6d9656c78f-lhs4v" Dec 05 20:30:09 crc kubenswrapper[4904]: I1205 20:30:09.998287 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca-config\") pod \"dnsmasq-dns-6d9656c78f-lhs4v\" (UID: \"a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca\") " pod="openstack/dnsmasq-dns-6d9656c78f-lhs4v" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.102184 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca-config\") pod \"dnsmasq-dns-6d9656c78f-lhs4v\" (UID: \"a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca\") " pod="openstack/dnsmasq-dns-6d9656c78f-lhs4v" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.102487 4904 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-n9t85\" (UniqueName: \"kubernetes.io/projected/a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca-kube-api-access-n9t85\") pod \"dnsmasq-dns-6d9656c78f-lhs4v\" (UID: \"a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca\") " pod="openstack/dnsmasq-dns-6d9656c78f-lhs4v" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.102550 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca-dns-svc\") pod \"dnsmasq-dns-6d9656c78f-lhs4v\" (UID: \"a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca\") " pod="openstack/dnsmasq-dns-6d9656c78f-lhs4v" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.103585 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca-dns-svc\") pod \"dnsmasq-dns-6d9656c78f-lhs4v\" (UID: \"a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca\") " pod="openstack/dnsmasq-dns-6d9656c78f-lhs4v" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.104285 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca-config\") pod \"dnsmasq-dns-6d9656c78f-lhs4v\" (UID: \"a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca\") " pod="openstack/dnsmasq-dns-6d9656c78f-lhs4v" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.135795 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9t85\" (UniqueName: \"kubernetes.io/projected/a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca-kube-api-access-n9t85\") pod \"dnsmasq-dns-6d9656c78f-lhs4v\" (UID: \"a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca\") " pod="openstack/dnsmasq-dns-6d9656c78f-lhs4v" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.190318 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bccbb886f-2pd5r"] Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.259356 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d9656c78f-lhs4v" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.307066 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.308280 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.312450 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.312702 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.316450 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.316719 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.317270 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.317401 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.319391 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-9glvh" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.350033 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.404761 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-559648544f-dznst"] Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.421957 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\") " pod="openstack/rabbitmq-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.421992 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\") " pod="openstack/rabbitmq-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.422014 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\") " pod="openstack/rabbitmq-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.422071 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\") " pod="openstack/rabbitmq-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.422093 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\") " pod="openstack/rabbitmq-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 
20:30:10.422118 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbsl2\" (UniqueName: \"kubernetes.io/projected/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-kube-api-access-cbsl2\") pod \"rabbitmq-server-0\" (UID: \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\") " pod="openstack/rabbitmq-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.422150 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\") " pod="openstack/rabbitmq-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.422169 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\") " pod="openstack/rabbitmq-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.422183 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-config-data\") pod \"rabbitmq-server-0\" (UID: \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\") " pod="openstack/rabbitmq-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.422203 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\") " pod="openstack/rabbitmq-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.422224 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\") " pod="openstack/rabbitmq-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: W1205 20:30:10.428938 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c395630_7b9e_4495_8357_f8cf91879288.slice/crio-7bcff980c0ad20a1eb7fbaa36a9672cb2fcd8c7af6b549d1175ebaeef39eed39 WatchSource:0}: Error finding container 7bcff980c0ad20a1eb7fbaa36a9672cb2fcd8c7af6b549d1175ebaeef39eed39: Status 404 returned error can't find the container with id 7bcff980c0ad20a1eb7fbaa36a9672cb2fcd8c7af6b549d1175ebaeef39eed39 Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.523276 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\") " pod="openstack/rabbitmq-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.523321 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\") " pod="openstack/rabbitmq-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 
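
rabbitmq-server-0 attaches ten volumes spanning six plugin types (configmap, secret, projected, empty-dir, downward-api, local-volume). The local PV local-storage02-crc additionally gets a MountVolume.MountDevice global mount at /mnt/openstack/pv02 before its per-pod SetUp, as the entries just below show. A sketch mapping volume name to plugin type from the UniqueName field of these entries:

```python
import re

VOL  = re.compile(r'for volume \\?"(?P<name>[\w.-]+)')
UNIQ = re.compile(r'UniqueName: \\?"kubernetes\.io/(?P<plugin>[\w-]+)/')

def plugin_inventory(lines):
    """Map volume name -> volume plugin type, from reconciler log entries."""
    out = {}
    for line in lines:
        v, u = VOL.search(line), UNIQ.search(line)
        if v and u:
            out.setdefault(v["name"], u["plugin"])
    return out

# expected here: {'rabbitmq-erlang-cookie': 'empty-dir', 'rabbitmq-confd': 'projected',
#  ..., 'pod-info': 'downward-api', 'local-storage02-crc': 'local-volume'}
```
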
Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.523343 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-config-data\") pod \"rabbitmq-server-0\" (UID: \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\") " pod="openstack/rabbitmq-server-0"
Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.523364 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\") " pod="openstack/rabbitmq-server-0"
Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.523389 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\") " pod="openstack/rabbitmq-server-0"
Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.523409 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\") " pod="openstack/rabbitmq-server-0"
Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.523423 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\") " pod="openstack/rabbitmq-server-0"
Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.523441 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\") " pod="openstack/rabbitmq-server-0"
Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.523480 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\") " pod="openstack/rabbitmq-server-0"
Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.523500 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\") " pod="openstack/rabbitmq-server-0"
Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.523527 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbsl2\" (UniqueName: \"kubernetes.io/projected/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-kube-api-access-cbsl2\") pod \"rabbitmq-server-0\" (UID: \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\") " pod="openstack/rabbitmq-server-0"
Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.526310 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\") " pod="openstack/rabbitmq-server-0"
Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.526555 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\") " pod="openstack/rabbitmq-server-0"
Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.527336 4904 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0"
Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.527729 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-config-data\") pod \"rabbitmq-server-0\" (UID: \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\") " pod="openstack/rabbitmq-server-0"
Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.528400 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\") " pod="openstack/rabbitmq-server-0"
Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.528638 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\") " pod="openstack/rabbitmq-server-0"
Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.535813 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\") " pod="openstack/rabbitmq-server-0"
Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.537177 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\") " pod="openstack/rabbitmq-server-0"
Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.551238 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\") " pod="openstack/rabbitmq-server-0"
Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.553306 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\") " pod="openstack/rabbitmq-server-0"
Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.557038 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbsl2\" (UniqueName: \"kubernetes.io/projected/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-kube-api-access-cbsl2\") pod \"rabbitmq-server-0\" (UID: \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\") " pod="openstack/rabbitmq-server-0"
Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.567176 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\") " pod="openstack/rabbitmq-server-0"
Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.643589 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.644949 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.654442 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.663465 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.663647 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.663672 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.663750 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.663852 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.665160 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.665911 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-c25p2"
Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.667607 4904 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.727326 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ad24986-23a3-4010-8dcf-6778339691c8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.727387 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0ad24986-23a3-4010-8dcf-6778339691c8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ad24986-23a3-4010-8dcf-6778339691c8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.727408 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ad24986-23a3-4010-8dcf-6778339691c8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ad24986-23a3-4010-8dcf-6778339691c8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.727426 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0ad24986-23a3-4010-8dcf-6778339691c8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ad24986-23a3-4010-8dcf-6778339691c8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.727446 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0ad24986-23a3-4010-8dcf-6778339691c8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ad24986-23a3-4010-8dcf-6778339691c8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.727462 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0ad24986-23a3-4010-8dcf-6778339691c8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ad24986-23a3-4010-8dcf-6778339691c8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.727478 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-269gx\" (UniqueName: \"kubernetes.io/projected/0ad24986-23a3-4010-8dcf-6778339691c8-kube-api-access-269gx\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ad24986-23a3-4010-8dcf-6778339691c8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.727504 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0ad24986-23a3-4010-8dcf-6778339691c8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ad24986-23a3-4010-8dcf-6778339691c8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.727523 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0ad24986-23a3-4010-8dcf-6778339691c8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"0ad24986-23a3-4010-8dcf-6778339691c8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.727551 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0ad24986-23a3-4010-8dcf-6778339691c8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ad24986-23a3-4010-8dcf-6778339691c8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.727570 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0ad24986-23a3-4010-8dcf-6778339691c8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ad24986-23a3-4010-8dcf-6778339691c8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.828960 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0ad24986-23a3-4010-8dcf-6778339691c8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ad24986-23a3-4010-8dcf-6778339691c8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.829032 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0ad24986-23a3-4010-8dcf-6778339691c8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ad24986-23a3-4010-8dcf-6778339691c8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.829125 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0ad24986-23a3-4010-8dcf-6778339691c8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ad24986-23a3-4010-8dcf-6778339691c8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.829153 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0ad24986-23a3-4010-8dcf-6778339691c8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ad24986-23a3-4010-8dcf-6778339691c8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.829199 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ad24986-23a3-4010-8dcf-6778339691c8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.829258 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0ad24986-23a3-4010-8dcf-6778339691c8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ad24986-23a3-4010-8dcf-6778339691c8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.829284 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ad24986-23a3-4010-8dcf-6778339691c8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ad24986-23a3-4010-8dcf-6778339691c8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:30:10 
crc kubenswrapper[4904]: I1205 20:30:10.829310 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0ad24986-23a3-4010-8dcf-6778339691c8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ad24986-23a3-4010-8dcf-6778339691c8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.829338 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0ad24986-23a3-4010-8dcf-6778339691c8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ad24986-23a3-4010-8dcf-6778339691c8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.829364 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0ad24986-23a3-4010-8dcf-6778339691c8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ad24986-23a3-4010-8dcf-6778339691c8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.829385 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-269gx\" (UniqueName: \"kubernetes.io/projected/0ad24986-23a3-4010-8dcf-6778339691c8-kube-api-access-269gx\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ad24986-23a3-4010-8dcf-6778339691c8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.830458 4904 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ad24986-23a3-4010-8dcf-6778339691c8\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.834640 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ad24986-23a3-4010-8dcf-6778339691c8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ad24986-23a3-4010-8dcf-6778339691c8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.835855 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0ad24986-23a3-4010-8dcf-6778339691c8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ad24986-23a3-4010-8dcf-6778339691c8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.837162 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0ad24986-23a3-4010-8dcf-6778339691c8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ad24986-23a3-4010-8dcf-6778339691c8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.837241 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0ad24986-23a3-4010-8dcf-6778339691c8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ad24986-23a3-4010-8dcf-6778339691c8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.837858 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/0ad24986-23a3-4010-8dcf-6778339691c8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ad24986-23a3-4010-8dcf-6778339691c8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.839229 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0ad24986-23a3-4010-8dcf-6778339691c8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ad24986-23a3-4010-8dcf-6778339691c8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.839240 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0ad24986-23a3-4010-8dcf-6778339691c8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ad24986-23a3-4010-8dcf-6778339691c8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.839776 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0ad24986-23a3-4010-8dcf-6778339691c8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ad24986-23a3-4010-8dcf-6778339691c8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.842711 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0ad24986-23a3-4010-8dcf-6778339691c8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ad24986-23a3-4010-8dcf-6778339691c8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.846308 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-269gx\" (UniqueName: \"kubernetes.io/projected/0ad24986-23a3-4010-8dcf-6778339691c8-kube-api-access-269gx\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ad24986-23a3-4010-8dcf-6778339691c8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.876611 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ad24986-23a3-4010-8dcf-6778339691c8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.901698 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d9656c78f-lhs4v"] Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.926959 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d9656c78f-lhs4v" event={"ID":"a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca","Type":"ContainerStarted","Data":"d32dd19148976f7b6767803164ee2ccd3ef21cb083f0ba5f62355241f224b868"} Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.928233 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bccbb886f-2pd5r" event={"ID":"d51f02d6-e4c4-4357-8d5d-6478f452a4e1","Type":"ContainerStarted","Data":"2e85a43bf6af16e73c78b877d07c2abc7d99cf1c71cb815bf6d1b21c2e57321a"} Dec 05 20:30:10 crc kubenswrapper[4904]: I1205 20:30:10.930994 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-559648544f-dznst" event={"ID":"9c395630-7b9e-4495-8357-f8cf91879288","Type":"ContainerStarted","Data":"7bcff980c0ad20a1eb7fbaa36a9672cb2fcd8c7af6b549d1175ebaeef39eed39"} Dec 05 20:30:10 crc 
kubenswrapper[4904]: I1205 20:30:10.979953 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.041257 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.042901 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-notifications-server-0" Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.046301 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-plugins-conf" Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.046428 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-default-user" Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.046503 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-server-conf" Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.046529 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-config-data" Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.046831 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-server-dockercfg-89wjg" Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.047550 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-notifications-svc" Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.048269 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-erlang-cookie" Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.054593 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.138253 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcklp\" (UniqueName: \"kubernetes.io/projected/0caaad94-d02e-43da-bf3b-087a5ec8d2f8-kube-api-access-dcklp\") pod \"rabbitmq-notifications-server-0\" (UID: \"0caaad94-d02e-43da-bf3b-087a5ec8d2f8\") " pod="openstack/rabbitmq-notifications-server-0" Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.138288 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0caaad94-d02e-43da-bf3b-087a5ec8d2f8-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"0caaad94-d02e-43da-bf3b-087a5ec8d2f8\") " pod="openstack/rabbitmq-notifications-server-0" Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.138334 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0caaad94-d02e-43da-bf3b-087a5ec8d2f8-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"0caaad94-d02e-43da-bf3b-087a5ec8d2f8\") " pod="openstack/rabbitmq-notifications-server-0" Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.138366 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0caaad94-d02e-43da-bf3b-087a5ec8d2f8-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: 
\"0caaad94-d02e-43da-bf3b-087a5ec8d2f8\") " pod="openstack/rabbitmq-notifications-server-0" Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.138396 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0caaad94-d02e-43da-bf3b-087a5ec8d2f8-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"0caaad94-d02e-43da-bf3b-087a5ec8d2f8\") " pod="openstack/rabbitmq-notifications-server-0" Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.138413 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0caaad94-d02e-43da-bf3b-087a5ec8d2f8-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"0caaad94-d02e-43da-bf3b-087a5ec8d2f8\") " pod="openstack/rabbitmq-notifications-server-0" Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.138438 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0caaad94-d02e-43da-bf3b-087a5ec8d2f8-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"0caaad94-d02e-43da-bf3b-087a5ec8d2f8\") " pod="openstack/rabbitmq-notifications-server-0" Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.138507 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0caaad94-d02e-43da-bf3b-087a5ec8d2f8-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"0caaad94-d02e-43da-bf3b-087a5ec8d2f8\") " pod="openstack/rabbitmq-notifications-server-0" Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.138535 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0caaad94-d02e-43da-bf3b-087a5ec8d2f8-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"0caaad94-d02e-43da-bf3b-087a5ec8d2f8\") " pod="openstack/rabbitmq-notifications-server-0" Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.138567 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0caaad94-d02e-43da-bf3b-087a5ec8d2f8-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"0caaad94-d02e-43da-bf3b-087a5ec8d2f8\") " pod="openstack/rabbitmq-notifications-server-0" Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.138585 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"0caaad94-d02e-43da-bf3b-087a5ec8d2f8\") " pod="openstack/rabbitmq-notifications-server-0" Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.233874 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.240995 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0caaad94-d02e-43da-bf3b-087a5ec8d2f8-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"0caaad94-d02e-43da-bf3b-087a5ec8d2f8\") " pod="openstack/rabbitmq-notifications-server-0" Dec 05 20:30:11 crc kubenswrapper[4904]: 
I1205 20:30:11.241041 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0caaad94-d02e-43da-bf3b-087a5ec8d2f8-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"0caaad94-d02e-43da-bf3b-087a5ec8d2f8\") " pod="openstack/rabbitmq-notifications-server-0" Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.241094 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0caaad94-d02e-43da-bf3b-087a5ec8d2f8-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"0caaad94-d02e-43da-bf3b-087a5ec8d2f8\") " pod="openstack/rabbitmq-notifications-server-0" Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.241120 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0caaad94-d02e-43da-bf3b-087a5ec8d2f8-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"0caaad94-d02e-43da-bf3b-087a5ec8d2f8\") " pod="openstack/rabbitmq-notifications-server-0" Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.241153 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0caaad94-d02e-43da-bf3b-087a5ec8d2f8-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"0caaad94-d02e-43da-bf3b-087a5ec8d2f8\") " pod="openstack/rabbitmq-notifications-server-0" Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.241173 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"0caaad94-d02e-43da-bf3b-087a5ec8d2f8\") " pod="openstack/rabbitmq-notifications-server-0" Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.241198 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0caaad94-d02e-43da-bf3b-087a5ec8d2f8-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"0caaad94-d02e-43da-bf3b-087a5ec8d2f8\") " pod="openstack/rabbitmq-notifications-server-0" Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.241284 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcklp\" (UniqueName: \"kubernetes.io/projected/0caaad94-d02e-43da-bf3b-087a5ec8d2f8-kube-api-access-dcklp\") pod \"rabbitmq-notifications-server-0\" (UID: \"0caaad94-d02e-43da-bf3b-087a5ec8d2f8\") " pod="openstack/rabbitmq-notifications-server-0" Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.241317 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0caaad94-d02e-43da-bf3b-087a5ec8d2f8-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"0caaad94-d02e-43da-bf3b-087a5ec8d2f8\") " pod="openstack/rabbitmq-notifications-server-0" Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.241347 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0caaad94-d02e-43da-bf3b-087a5ec8d2f8-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"0caaad94-d02e-43da-bf3b-087a5ec8d2f8\") " pod="openstack/rabbitmq-notifications-server-0" Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 
20:30:11.241362 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0caaad94-d02e-43da-bf3b-087a5ec8d2f8-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"0caaad94-d02e-43da-bf3b-087a5ec8d2f8\") " pod="openstack/rabbitmq-notifications-server-0" Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.241940 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0caaad94-d02e-43da-bf3b-087a5ec8d2f8-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"0caaad94-d02e-43da-bf3b-087a5ec8d2f8\") " pod="openstack/rabbitmq-notifications-server-0" Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.242125 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0caaad94-d02e-43da-bf3b-087a5ec8d2f8-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"0caaad94-d02e-43da-bf3b-087a5ec8d2f8\") " pod="openstack/rabbitmq-notifications-server-0" Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.242136 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0caaad94-d02e-43da-bf3b-087a5ec8d2f8-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"0caaad94-d02e-43da-bf3b-087a5ec8d2f8\") " pod="openstack/rabbitmq-notifications-server-0" Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.242683 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0caaad94-d02e-43da-bf3b-087a5ec8d2f8-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"0caaad94-d02e-43da-bf3b-087a5ec8d2f8\") " pod="openstack/rabbitmq-notifications-server-0" Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.243156 4904 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"0caaad94-d02e-43da-bf3b-087a5ec8d2f8\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-notifications-server-0" Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.247532 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0caaad94-d02e-43da-bf3b-087a5ec8d2f8-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"0caaad94-d02e-43da-bf3b-087a5ec8d2f8\") " pod="openstack/rabbitmq-notifications-server-0" Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.249791 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0caaad94-d02e-43da-bf3b-087a5ec8d2f8-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"0caaad94-d02e-43da-bf3b-087a5ec8d2f8\") " pod="openstack/rabbitmq-notifications-server-0" Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.249854 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0caaad94-d02e-43da-bf3b-087a5ec8d2f8-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"0caaad94-d02e-43da-bf3b-087a5ec8d2f8\") " pod="openstack/rabbitmq-notifications-server-0" Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.250448 4904 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0caaad94-d02e-43da-bf3b-087a5ec8d2f8-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"0caaad94-d02e-43da-bf3b-087a5ec8d2f8\") " pod="openstack/rabbitmq-notifications-server-0" Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.256507 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0caaad94-d02e-43da-bf3b-087a5ec8d2f8-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"0caaad94-d02e-43da-bf3b-087a5ec8d2f8\") " pod="openstack/rabbitmq-notifications-server-0" Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.262835 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcklp\" (UniqueName: \"kubernetes.io/projected/0caaad94-d02e-43da-bf3b-087a5ec8d2f8-kube-api-access-dcklp\") pod \"rabbitmq-notifications-server-0\" (UID: \"0caaad94-d02e-43da-bf3b-087a5ec8d2f8\") " pod="openstack/rabbitmq-notifications-server-0" Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.291286 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"0caaad94-d02e-43da-bf3b-087a5ec8d2f8\") " pod="openstack/rabbitmq-notifications-server-0" Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.367984 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-notifications-server-0" Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.612188 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.910476 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.965909 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0ad24986-23a3-4010-8dcf-6778339691c8","Type":"ContainerStarted","Data":"a658b6fddf2b66d38dd44fc30b45f3a866ca1f7b23a859f0ae3683115ca85024"} Dec 05 20:30:11 crc kubenswrapper[4904]: I1205 20:30:11.967765 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"921f0ddc-4d15-4bfa-9560-7a01eaa3461f","Type":"ContainerStarted","Data":"38861b6e79d90e746aa66c2f536a6eb4525ce0b4c93b922e4ad66d0d0ae9261e"} Dec 05 20:30:11 crc kubenswrapper[4904]: W1205 20:30:11.986672 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0caaad94_d02e_43da_bf3b_087a5ec8d2f8.slice/crio-fdb1f3c04127a40e62fcc9b2940dde06638927c4e3fe08b0581f5bf64c369b54 WatchSource:0}: Error finding container fdb1f3c04127a40e62fcc9b2940dde06638927c4e3fe08b0581f5bf64c369b54: Status 404 returned error can't find the container with id fdb1f3c04127a40e62fcc9b2940dde06638927c4e3fe08b0581f5bf64c369b54 Dec 05 20:30:12 crc kubenswrapper[4904]: I1205 20:30:12.447441 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 05 20:30:12 crc kubenswrapper[4904]: I1205 20:30:12.449033 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 05 20:30:12 crc kubenswrapper[4904]: I1205 20:30:12.455154 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 05 20:30:12 crc kubenswrapper[4904]: I1205 20:30:12.455346 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-bfrhg" Dec 05 20:30:12 crc kubenswrapper[4904]: I1205 20:30:12.455561 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 05 20:30:12 crc kubenswrapper[4904]: I1205 20:30:12.455615 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 05 20:30:12 crc kubenswrapper[4904]: I1205 20:30:12.458840 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 05 20:30:12 crc kubenswrapper[4904]: I1205 20:30:12.463083 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 05 20:30:12 crc kubenswrapper[4904]: I1205 20:30:12.558828 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e08506d4-1ca7-4932-b73f-21020cb20578-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e08506d4-1ca7-4932-b73f-21020cb20578\") " pod="openstack/openstack-galera-0" Dec 05 20:30:12 crc kubenswrapper[4904]: I1205 20:30:12.558911 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e08506d4-1ca7-4932-b73f-21020cb20578-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e08506d4-1ca7-4932-b73f-21020cb20578\") " pod="openstack/openstack-galera-0" Dec 05 20:30:12 crc kubenswrapper[4904]: I1205 20:30:12.558990 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e08506d4-1ca7-4932-b73f-21020cb20578-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e08506d4-1ca7-4932-b73f-21020cb20578\") " pod="openstack/openstack-galera-0" Dec 05 20:30:12 crc kubenswrapper[4904]: I1205 20:30:12.559047 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"e08506d4-1ca7-4932-b73f-21020cb20578\") " pod="openstack/openstack-galera-0" Dec 05 20:30:12 crc kubenswrapper[4904]: I1205 20:30:12.559156 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e08506d4-1ca7-4932-b73f-21020cb20578-config-data-default\") pod \"openstack-galera-0\" (UID: \"e08506d4-1ca7-4932-b73f-21020cb20578\") " pod="openstack/openstack-galera-0" Dec 05 20:30:12 crc kubenswrapper[4904]: I1205 20:30:12.559179 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e08506d4-1ca7-4932-b73f-21020cb20578-kolla-config\") pod \"openstack-galera-0\" (UID: \"e08506d4-1ca7-4932-b73f-21020cb20578\") " pod="openstack/openstack-galera-0" Dec 05 20:30:12 crc kubenswrapper[4904]: I1205 20:30:12.559243 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-2jn6p\" (UniqueName: \"kubernetes.io/projected/e08506d4-1ca7-4932-b73f-21020cb20578-kube-api-access-2jn6p\") pod \"openstack-galera-0\" (UID: \"e08506d4-1ca7-4932-b73f-21020cb20578\") " pod="openstack/openstack-galera-0" Dec 05 20:30:12 crc kubenswrapper[4904]: I1205 20:30:12.559303 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e08506d4-1ca7-4932-b73f-21020cb20578-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e08506d4-1ca7-4932-b73f-21020cb20578\") " pod="openstack/openstack-galera-0" Dec 05 20:30:12 crc kubenswrapper[4904]: I1205 20:30:12.660396 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e08506d4-1ca7-4932-b73f-21020cb20578-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e08506d4-1ca7-4932-b73f-21020cb20578\") " pod="openstack/openstack-galera-0" Dec 05 20:30:12 crc kubenswrapper[4904]: I1205 20:30:12.660451 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e08506d4-1ca7-4932-b73f-21020cb20578-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e08506d4-1ca7-4932-b73f-21020cb20578\") " pod="openstack/openstack-galera-0" Dec 05 20:30:12 crc kubenswrapper[4904]: I1205 20:30:12.660498 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e08506d4-1ca7-4932-b73f-21020cb20578-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e08506d4-1ca7-4932-b73f-21020cb20578\") " pod="openstack/openstack-galera-0" Dec 05 20:30:12 crc kubenswrapper[4904]: I1205 20:30:12.660516 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"e08506d4-1ca7-4932-b73f-21020cb20578\") " pod="openstack/openstack-galera-0" Dec 05 20:30:12 crc kubenswrapper[4904]: I1205 20:30:12.660549 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e08506d4-1ca7-4932-b73f-21020cb20578-config-data-default\") pod \"openstack-galera-0\" (UID: \"e08506d4-1ca7-4932-b73f-21020cb20578\") " pod="openstack/openstack-galera-0" Dec 05 20:30:12 crc kubenswrapper[4904]: I1205 20:30:12.660568 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e08506d4-1ca7-4932-b73f-21020cb20578-kolla-config\") pod \"openstack-galera-0\" (UID: \"e08506d4-1ca7-4932-b73f-21020cb20578\") " pod="openstack/openstack-galera-0" Dec 05 20:30:12 crc kubenswrapper[4904]: I1205 20:30:12.660583 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jn6p\" (UniqueName: \"kubernetes.io/projected/e08506d4-1ca7-4932-b73f-21020cb20578-kube-api-access-2jn6p\") pod \"openstack-galera-0\" (UID: \"e08506d4-1ca7-4932-b73f-21020cb20578\") " pod="openstack/openstack-galera-0" Dec 05 20:30:12 crc kubenswrapper[4904]: I1205 20:30:12.660611 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e08506d4-1ca7-4932-b73f-21020cb20578-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: 
\"e08506d4-1ca7-4932-b73f-21020cb20578\") " pod="openstack/openstack-galera-0" Dec 05 20:30:12 crc kubenswrapper[4904]: I1205 20:30:12.661641 4904 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"e08506d4-1ca7-4932-b73f-21020cb20578\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/openstack-galera-0" Dec 05 20:30:12 crc kubenswrapper[4904]: I1205 20:30:12.661830 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e08506d4-1ca7-4932-b73f-21020cb20578-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e08506d4-1ca7-4932-b73f-21020cb20578\") " pod="openstack/openstack-galera-0" Dec 05 20:30:12 crc kubenswrapper[4904]: I1205 20:30:12.662828 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e08506d4-1ca7-4932-b73f-21020cb20578-kolla-config\") pod \"openstack-galera-0\" (UID: \"e08506d4-1ca7-4932-b73f-21020cb20578\") " pod="openstack/openstack-galera-0" Dec 05 20:30:12 crc kubenswrapper[4904]: I1205 20:30:12.662865 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e08506d4-1ca7-4932-b73f-21020cb20578-config-data-default\") pod \"openstack-galera-0\" (UID: \"e08506d4-1ca7-4932-b73f-21020cb20578\") " pod="openstack/openstack-galera-0" Dec 05 20:30:12 crc kubenswrapper[4904]: I1205 20:30:12.666303 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e08506d4-1ca7-4932-b73f-21020cb20578-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e08506d4-1ca7-4932-b73f-21020cb20578\") " pod="openstack/openstack-galera-0" Dec 05 20:30:12 crc kubenswrapper[4904]: I1205 20:30:12.673974 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e08506d4-1ca7-4932-b73f-21020cb20578-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e08506d4-1ca7-4932-b73f-21020cb20578\") " pod="openstack/openstack-galera-0" Dec 05 20:30:12 crc kubenswrapper[4904]: I1205 20:30:12.678158 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e08506d4-1ca7-4932-b73f-21020cb20578-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e08506d4-1ca7-4932-b73f-21020cb20578\") " pod="openstack/openstack-galera-0" Dec 05 20:30:12 crc kubenswrapper[4904]: I1205 20:30:12.689412 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"e08506d4-1ca7-4932-b73f-21020cb20578\") " pod="openstack/openstack-galera-0" Dec 05 20:30:12 crc kubenswrapper[4904]: I1205 20:30:12.693588 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jn6p\" (UniqueName: \"kubernetes.io/projected/e08506d4-1ca7-4932-b73f-21020cb20578-kube-api-access-2jn6p\") pod \"openstack-galera-0\" (UID: \"e08506d4-1ca7-4932-b73f-21020cb20578\") " pod="openstack/openstack-galera-0" Dec 05 20:30:12 crc kubenswrapper[4904]: I1205 20:30:12.781841 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 05 20:30:12 crc kubenswrapper[4904]: I1205 20:30:12.998879 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"0caaad94-d02e-43da-bf3b-087a5ec8d2f8","Type":"ContainerStarted","Data":"fdb1f3c04127a40e62fcc9b2940dde06638927c4e3fe08b0581f5bf64c369b54"} Dec 05 20:30:13 crc kubenswrapper[4904]: W1205 20:30:13.316266 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode08506d4_1ca7_4932_b73f_21020cb20578.slice/crio-e3771fefe5f1c1bbdbb60f5c5ccb588a422486f19368914ee04695a465ef9b67 WatchSource:0}: Error finding container e3771fefe5f1c1bbdbb60f5c5ccb588a422486f19368914ee04695a465ef9b67: Status 404 returned error can't find the container with id e3771fefe5f1c1bbdbb60f5c5ccb588a422486f19368914ee04695a465ef9b67 Dec 05 20:30:13 crc kubenswrapper[4904]: I1205 20:30:13.326389 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 05 20:30:13 crc kubenswrapper[4904]: I1205 20:30:13.831458 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 05 20:30:13 crc kubenswrapper[4904]: I1205 20:30:13.832885 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 05 20:30:13 crc kubenswrapper[4904]: I1205 20:30:13.837621 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 05 20:30:13 crc kubenswrapper[4904]: I1205 20:30:13.837861 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 05 20:30:13 crc kubenswrapper[4904]: I1205 20:30:13.838152 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-b5cr7" Dec 05 20:30:13 crc kubenswrapper[4904]: I1205 20:30:13.838323 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 05 20:30:13 crc kubenswrapper[4904]: I1205 20:30:13.858414 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 05 20:30:13 crc kubenswrapper[4904]: I1205 20:30:13.982245 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3437a493-1ffe-49dc-a789-3451b2f87204-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"3437a493-1ffe-49dc-a789-3451b2f87204\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:30:13 crc kubenswrapper[4904]: I1205 20:30:13.982576 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3437a493-1ffe-49dc-a789-3451b2f87204-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"3437a493-1ffe-49dc-a789-3451b2f87204\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:30:13 crc kubenswrapper[4904]: I1205 20:30:13.982599 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3437a493-1ffe-49dc-a789-3451b2f87204-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"3437a493-1ffe-49dc-a789-3451b2f87204\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:30:13 crc kubenswrapper[4904]: I1205 20:30:13.982624 
4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3437a493-1ffe-49dc-a789-3451b2f87204-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"3437a493-1ffe-49dc-a789-3451b2f87204\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:30:13 crc kubenswrapper[4904]: I1205 20:30:13.982642 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3437a493-1ffe-49dc-a789-3451b2f87204-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"3437a493-1ffe-49dc-a789-3451b2f87204\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:30:13 crc kubenswrapper[4904]: I1205 20:30:13.982660 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl5v2\" (UniqueName: \"kubernetes.io/projected/3437a493-1ffe-49dc-a789-3451b2f87204-kube-api-access-fl5v2\") pod \"openstack-cell1-galera-0\" (UID: \"3437a493-1ffe-49dc-a789-3451b2f87204\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:30:13 crc kubenswrapper[4904]: I1205 20:30:13.982699 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"3437a493-1ffe-49dc-a789-3451b2f87204\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:30:13 crc kubenswrapper[4904]: I1205 20:30:13.982738 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3437a493-1ffe-49dc-a789-3451b2f87204-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"3437a493-1ffe-49dc-a789-3451b2f87204\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:30:14 crc kubenswrapper[4904]: I1205 20:30:14.030788 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 05 20:30:14 crc kubenswrapper[4904]: I1205 20:30:14.032337 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 05 20:30:14 crc kubenswrapper[4904]: I1205 20:30:14.040803 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 05 20:30:14 crc kubenswrapper[4904]: I1205 20:30:14.041318 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-248rq" Dec 05 20:30:14 crc kubenswrapper[4904]: I1205 20:30:14.041432 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 05 20:30:14 crc kubenswrapper[4904]: I1205 20:30:14.056240 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 05 20:30:14 crc kubenswrapper[4904]: I1205 20:30:14.072104 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e08506d4-1ca7-4932-b73f-21020cb20578","Type":"ContainerStarted","Data":"e3771fefe5f1c1bbdbb60f5c5ccb588a422486f19368914ee04695a465ef9b67"} Dec 05 20:30:14 crc kubenswrapper[4904]: I1205 20:30:14.084725 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"3437a493-1ffe-49dc-a789-3451b2f87204\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:30:14 crc kubenswrapper[4904]: I1205 20:30:14.084810 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3437a493-1ffe-49dc-a789-3451b2f87204-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"3437a493-1ffe-49dc-a789-3451b2f87204\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:30:14 crc kubenswrapper[4904]: I1205 20:30:14.084894 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3437a493-1ffe-49dc-a789-3451b2f87204-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"3437a493-1ffe-49dc-a789-3451b2f87204\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:30:14 crc kubenswrapper[4904]: I1205 20:30:14.084920 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3437a493-1ffe-49dc-a789-3451b2f87204-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"3437a493-1ffe-49dc-a789-3451b2f87204\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:30:14 crc kubenswrapper[4904]: I1205 20:30:14.084941 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3437a493-1ffe-49dc-a789-3451b2f87204-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"3437a493-1ffe-49dc-a789-3451b2f87204\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:30:14 crc kubenswrapper[4904]: I1205 20:30:14.084973 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3437a493-1ffe-49dc-a789-3451b2f87204-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"3437a493-1ffe-49dc-a789-3451b2f87204\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:30:14 crc kubenswrapper[4904]: I1205 20:30:14.084998 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3437a493-1ffe-49dc-a789-3451b2f87204-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"3437a493-1ffe-49dc-a789-3451b2f87204\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:30:14 crc kubenswrapper[4904]: I1205 20:30:14.085024 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl5v2\" (UniqueName: \"kubernetes.io/projected/3437a493-1ffe-49dc-a789-3451b2f87204-kube-api-access-fl5v2\") pod \"openstack-cell1-galera-0\" (UID: \"3437a493-1ffe-49dc-a789-3451b2f87204\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:30:14 crc kubenswrapper[4904]: I1205 20:30:14.085924 4904 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"3437a493-1ffe-49dc-a789-3451b2f87204\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-cell1-galera-0" Dec 05 20:30:14 crc kubenswrapper[4904]: I1205 20:30:14.090731 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3437a493-1ffe-49dc-a789-3451b2f87204-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"3437a493-1ffe-49dc-a789-3451b2f87204\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:30:14 crc kubenswrapper[4904]: I1205 20:30:14.091173 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3437a493-1ffe-49dc-a789-3451b2f87204-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"3437a493-1ffe-49dc-a789-3451b2f87204\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:30:14 crc kubenswrapper[4904]: I1205 20:30:14.091498 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3437a493-1ffe-49dc-a789-3451b2f87204-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"3437a493-1ffe-49dc-a789-3451b2f87204\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:30:14 crc kubenswrapper[4904]: I1205 20:30:14.091806 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3437a493-1ffe-49dc-a789-3451b2f87204-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"3437a493-1ffe-49dc-a789-3451b2f87204\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:30:14 crc kubenswrapper[4904]: I1205 20:30:14.097348 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3437a493-1ffe-49dc-a789-3451b2f87204-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"3437a493-1ffe-49dc-a789-3451b2f87204\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:30:14 crc kubenswrapper[4904]: I1205 20:30:14.113877 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl5v2\" (UniqueName: \"kubernetes.io/projected/3437a493-1ffe-49dc-a789-3451b2f87204-kube-api-access-fl5v2\") pod \"openstack-cell1-galera-0\" (UID: \"3437a493-1ffe-49dc-a789-3451b2f87204\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:30:14 crc kubenswrapper[4904]: I1205 20:30:14.122803 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3437a493-1ffe-49dc-a789-3451b2f87204-galera-tls-certs\") pod 
\"openstack-cell1-galera-0\" (UID: \"3437a493-1ffe-49dc-a789-3451b2f87204\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:30:14 crc kubenswrapper[4904]: I1205 20:30:14.144161 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"3437a493-1ffe-49dc-a789-3451b2f87204\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:30:14 crc kubenswrapper[4904]: I1205 20:30:14.176707 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 05 20:30:14 crc kubenswrapper[4904]: I1205 20:30:14.187124 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/980ec67d-9dc9-4cae-8169-9890a40d65c3-kolla-config\") pod \"memcached-0\" (UID: \"980ec67d-9dc9-4cae-8169-9890a40d65c3\") " pod="openstack/memcached-0" Dec 05 20:30:14 crc kubenswrapper[4904]: I1205 20:30:14.187189 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/980ec67d-9dc9-4cae-8169-9890a40d65c3-config-data\") pod \"memcached-0\" (UID: \"980ec67d-9dc9-4cae-8169-9890a40d65c3\") " pod="openstack/memcached-0" Dec 05 20:30:14 crc kubenswrapper[4904]: I1205 20:30:14.187219 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/980ec67d-9dc9-4cae-8169-9890a40d65c3-memcached-tls-certs\") pod \"memcached-0\" (UID: \"980ec67d-9dc9-4cae-8169-9890a40d65c3\") " pod="openstack/memcached-0" Dec 05 20:30:14 crc kubenswrapper[4904]: I1205 20:30:14.187384 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/980ec67d-9dc9-4cae-8169-9890a40d65c3-combined-ca-bundle\") pod \"memcached-0\" (UID: \"980ec67d-9dc9-4cae-8169-9890a40d65c3\") " pod="openstack/memcached-0" Dec 05 20:30:14 crc kubenswrapper[4904]: I1205 20:30:14.187408 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62lcg\" (UniqueName: \"kubernetes.io/projected/980ec67d-9dc9-4cae-8169-9890a40d65c3-kube-api-access-62lcg\") pod \"memcached-0\" (UID: \"980ec67d-9dc9-4cae-8169-9890a40d65c3\") " pod="openstack/memcached-0" Dec 05 20:30:14 crc kubenswrapper[4904]: I1205 20:30:14.291555 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/980ec67d-9dc9-4cae-8169-9890a40d65c3-config-data\") pod \"memcached-0\" (UID: \"980ec67d-9dc9-4cae-8169-9890a40d65c3\") " pod="openstack/memcached-0" Dec 05 20:30:14 crc kubenswrapper[4904]: I1205 20:30:14.291625 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/980ec67d-9dc9-4cae-8169-9890a40d65c3-memcached-tls-certs\") pod \"memcached-0\" (UID: \"980ec67d-9dc9-4cae-8169-9890a40d65c3\") " pod="openstack/memcached-0" Dec 05 20:30:14 crc kubenswrapper[4904]: I1205 20:30:14.291653 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/980ec67d-9dc9-4cae-8169-9890a40d65c3-combined-ca-bundle\") pod \"memcached-0\" (UID: 
\"980ec67d-9dc9-4cae-8169-9890a40d65c3\") " pod="openstack/memcached-0" Dec 05 20:30:14 crc kubenswrapper[4904]: I1205 20:30:14.291677 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62lcg\" (UniqueName: \"kubernetes.io/projected/980ec67d-9dc9-4cae-8169-9890a40d65c3-kube-api-access-62lcg\") pod \"memcached-0\" (UID: \"980ec67d-9dc9-4cae-8169-9890a40d65c3\") " pod="openstack/memcached-0" Dec 05 20:30:14 crc kubenswrapper[4904]: I1205 20:30:14.291779 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/980ec67d-9dc9-4cae-8169-9890a40d65c3-kolla-config\") pod \"memcached-0\" (UID: \"980ec67d-9dc9-4cae-8169-9890a40d65c3\") " pod="openstack/memcached-0" Dec 05 20:30:14 crc kubenswrapper[4904]: I1205 20:30:14.293435 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/980ec67d-9dc9-4cae-8169-9890a40d65c3-kolla-config\") pod \"memcached-0\" (UID: \"980ec67d-9dc9-4cae-8169-9890a40d65c3\") " pod="openstack/memcached-0" Dec 05 20:30:14 crc kubenswrapper[4904]: I1205 20:30:14.294225 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/980ec67d-9dc9-4cae-8169-9890a40d65c3-config-data\") pod \"memcached-0\" (UID: \"980ec67d-9dc9-4cae-8169-9890a40d65c3\") " pod="openstack/memcached-0" Dec 05 20:30:14 crc kubenswrapper[4904]: I1205 20:30:14.301609 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/980ec67d-9dc9-4cae-8169-9890a40d65c3-combined-ca-bundle\") pod \"memcached-0\" (UID: \"980ec67d-9dc9-4cae-8169-9890a40d65c3\") " pod="openstack/memcached-0" Dec 05 20:30:14 crc kubenswrapper[4904]: I1205 20:30:14.329342 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62lcg\" (UniqueName: \"kubernetes.io/projected/980ec67d-9dc9-4cae-8169-9890a40d65c3-kube-api-access-62lcg\") pod \"memcached-0\" (UID: \"980ec67d-9dc9-4cae-8169-9890a40d65c3\") " pod="openstack/memcached-0" Dec 05 20:30:14 crc kubenswrapper[4904]: I1205 20:30:14.347302 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/980ec67d-9dc9-4cae-8169-9890a40d65c3-memcached-tls-certs\") pod \"memcached-0\" (UID: \"980ec67d-9dc9-4cae-8169-9890a40d65c3\") " pod="openstack/memcached-0" Dec 05 20:30:14 crc kubenswrapper[4904]: I1205 20:30:14.365769 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 05 20:30:16 crc kubenswrapper[4904]: I1205 20:30:16.500766 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 20:30:16 crc kubenswrapper[4904]: I1205 20:30:16.501879 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 20:30:16 crc kubenswrapper[4904]: I1205 20:30:16.508469 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-775d7" Dec 05 20:30:16 crc kubenswrapper[4904]: I1205 20:30:16.523931 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 20:30:16 crc kubenswrapper[4904]: I1205 20:30:16.684517 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bq9d\" (UniqueName: \"kubernetes.io/projected/1893f1e0-90ea-4bd5-9275-c0266042485d-kube-api-access-8bq9d\") pod \"kube-state-metrics-0\" (UID: \"1893f1e0-90ea-4bd5-9275-c0266042485d\") " pod="openstack/kube-state-metrics-0" Dec 05 20:30:16 crc kubenswrapper[4904]: I1205 20:30:16.891391 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bq9d\" (UniqueName: \"kubernetes.io/projected/1893f1e0-90ea-4bd5-9275-c0266042485d-kube-api-access-8bq9d\") pod \"kube-state-metrics-0\" (UID: \"1893f1e0-90ea-4bd5-9275-c0266042485d\") " pod="openstack/kube-state-metrics-0" Dec 05 20:30:16 crc kubenswrapper[4904]: I1205 20:30:16.927114 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bq9d\" (UniqueName: \"kubernetes.io/projected/1893f1e0-90ea-4bd5-9275-c0266042485d-kube-api-access-8bq9d\") pod \"kube-state-metrics-0\" (UID: \"1893f1e0-90ea-4bd5-9275-c0266042485d\") " pod="openstack/kube-state-metrics-0" Dec 05 20:30:17 crc kubenswrapper[4904]: I1205 20:30:17.194052 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 20:30:18 crc kubenswrapper[4904]: I1205 20:30:18.010441 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 05 20:30:18 crc kubenswrapper[4904]: I1205 20:30:18.023744 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 05 20:30:18 crc kubenswrapper[4904]: I1205 20:30:18.042840 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 05 20:30:18 crc kubenswrapper[4904]: I1205 20:30:18.043215 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-c2mnb" Dec 05 20:30:18 crc kubenswrapper[4904]: I1205 20:30:18.045714 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 05 20:30:18 crc kubenswrapper[4904]: I1205 20:30:18.045938 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 05 20:30:18 crc kubenswrapper[4904]: I1205 20:30:18.045378 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 05 20:30:18 crc kubenswrapper[4904]: I1205 20:30:18.064297 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 05 20:30:18 crc kubenswrapper[4904]: I1205 20:30:18.075781 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 05 20:30:18 crc kubenswrapper[4904]: I1205 20:30:18.139793 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6qcr\" (UniqueName: \"kubernetes.io/projected/c241183e-3f0b-4dc2-88d2-ea990c6dc71d-kube-api-access-m6qcr\") pod \"prometheus-metric-storage-0\" (UID: \"c241183e-3f0b-4dc2-88d2-ea990c6dc71d\") " pod="openstack/prometheus-metric-storage-0" Dec 05 20:30:18 crc kubenswrapper[4904]: I1205 20:30:18.139843 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b90305fc-6b85-4b0e-958e-59ae1d530558\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b90305fc-6b85-4b0e-958e-59ae1d530558\") pod \"prometheus-metric-storage-0\" (UID: \"c241183e-3f0b-4dc2-88d2-ea990c6dc71d\") " pod="openstack/prometheus-metric-storage-0" Dec 05 20:30:18 crc kubenswrapper[4904]: I1205 20:30:18.139871 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c241183e-3f0b-4dc2-88d2-ea990c6dc71d-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"c241183e-3f0b-4dc2-88d2-ea990c6dc71d\") " pod="openstack/prometheus-metric-storage-0" Dec 05 20:30:18 crc kubenswrapper[4904]: I1205 20:30:18.139908 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c241183e-3f0b-4dc2-88d2-ea990c6dc71d-config\") pod \"prometheus-metric-storage-0\" (UID: \"c241183e-3f0b-4dc2-88d2-ea990c6dc71d\") " pod="openstack/prometheus-metric-storage-0" Dec 05 20:30:18 crc kubenswrapper[4904]: I1205 20:30:18.139950 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c241183e-3f0b-4dc2-88d2-ea990c6dc71d-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"c241183e-3f0b-4dc2-88d2-ea990c6dc71d\") " pod="openstack/prometheus-metric-storage-0" Dec 05 20:30:18 crc kubenswrapper[4904]: I1205 20:30:18.139987 4904 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c241183e-3f0b-4dc2-88d2-ea990c6dc71d-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"c241183e-3f0b-4dc2-88d2-ea990c6dc71d\") " pod="openstack/prometheus-metric-storage-0" Dec 05 20:30:18 crc kubenswrapper[4904]: I1205 20:30:18.140004 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c241183e-3f0b-4dc2-88d2-ea990c6dc71d-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"c241183e-3f0b-4dc2-88d2-ea990c6dc71d\") " pod="openstack/prometheus-metric-storage-0" Dec 05 20:30:18 crc kubenswrapper[4904]: I1205 20:30:18.140027 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c241183e-3f0b-4dc2-88d2-ea990c6dc71d-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"c241183e-3f0b-4dc2-88d2-ea990c6dc71d\") " pod="openstack/prometheus-metric-storage-0" Dec 05 20:30:18 crc kubenswrapper[4904]: I1205 20:30:18.245960 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c241183e-3f0b-4dc2-88d2-ea990c6dc71d-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"c241183e-3f0b-4dc2-88d2-ea990c6dc71d\") " pod="openstack/prometheus-metric-storage-0" Dec 05 20:30:18 crc kubenswrapper[4904]: I1205 20:30:18.246005 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c241183e-3f0b-4dc2-88d2-ea990c6dc71d-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"c241183e-3f0b-4dc2-88d2-ea990c6dc71d\") " pod="openstack/prometheus-metric-storage-0" Dec 05 20:30:18 crc kubenswrapper[4904]: I1205 20:30:18.246035 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c241183e-3f0b-4dc2-88d2-ea990c6dc71d-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"c241183e-3f0b-4dc2-88d2-ea990c6dc71d\") " pod="openstack/prometheus-metric-storage-0" Dec 05 20:30:18 crc kubenswrapper[4904]: I1205 20:30:18.246080 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6qcr\" (UniqueName: \"kubernetes.io/projected/c241183e-3f0b-4dc2-88d2-ea990c6dc71d-kube-api-access-m6qcr\") pod \"prometheus-metric-storage-0\" (UID: \"c241183e-3f0b-4dc2-88d2-ea990c6dc71d\") " pod="openstack/prometheus-metric-storage-0" Dec 05 20:30:18 crc kubenswrapper[4904]: I1205 20:30:18.246101 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b90305fc-6b85-4b0e-958e-59ae1d530558\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b90305fc-6b85-4b0e-958e-59ae1d530558\") pod \"prometheus-metric-storage-0\" (UID: \"c241183e-3f0b-4dc2-88d2-ea990c6dc71d\") " pod="openstack/prometheus-metric-storage-0" Dec 05 20:30:18 crc kubenswrapper[4904]: I1205 20:30:18.246126 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c241183e-3f0b-4dc2-88d2-ea990c6dc71d-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"c241183e-3f0b-4dc2-88d2-ea990c6dc71d\") " pod="openstack/prometheus-metric-storage-0" Dec 05 20:30:18 crc 
kubenswrapper[4904]: I1205 20:30:18.246160 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c241183e-3f0b-4dc2-88d2-ea990c6dc71d-config\") pod \"prometheus-metric-storage-0\" (UID: \"c241183e-3f0b-4dc2-88d2-ea990c6dc71d\") " pod="openstack/prometheus-metric-storage-0" Dec 05 20:30:18 crc kubenswrapper[4904]: I1205 20:30:18.246203 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c241183e-3f0b-4dc2-88d2-ea990c6dc71d-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"c241183e-3f0b-4dc2-88d2-ea990c6dc71d\") " pod="openstack/prometheus-metric-storage-0" Dec 05 20:30:18 crc kubenswrapper[4904]: I1205 20:30:18.247023 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c241183e-3f0b-4dc2-88d2-ea990c6dc71d-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"c241183e-3f0b-4dc2-88d2-ea990c6dc71d\") " pod="openstack/prometheus-metric-storage-0" Dec 05 20:30:18 crc kubenswrapper[4904]: I1205 20:30:18.264233 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c241183e-3f0b-4dc2-88d2-ea990c6dc71d-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"c241183e-3f0b-4dc2-88d2-ea990c6dc71d\") " pod="openstack/prometheus-metric-storage-0" Dec 05 20:30:18 crc kubenswrapper[4904]: I1205 20:30:18.264418 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c241183e-3f0b-4dc2-88d2-ea990c6dc71d-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"c241183e-3f0b-4dc2-88d2-ea990c6dc71d\") " pod="openstack/prometheus-metric-storage-0" Dec 05 20:30:18 crc kubenswrapper[4904]: I1205 20:30:18.265102 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c241183e-3f0b-4dc2-88d2-ea990c6dc71d-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"c241183e-3f0b-4dc2-88d2-ea990c6dc71d\") " pod="openstack/prometheus-metric-storage-0" Dec 05 20:30:18 crc kubenswrapper[4904]: I1205 20:30:18.278568 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c241183e-3f0b-4dc2-88d2-ea990c6dc71d-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"c241183e-3f0b-4dc2-88d2-ea990c6dc71d\") " pod="openstack/prometheus-metric-storage-0" Dec 05 20:30:18 crc kubenswrapper[4904]: I1205 20:30:18.279560 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c241183e-3f0b-4dc2-88d2-ea990c6dc71d-config\") pod \"prometheus-metric-storage-0\" (UID: \"c241183e-3f0b-4dc2-88d2-ea990c6dc71d\") " pod="openstack/prometheus-metric-storage-0" Dec 05 20:30:18 crc kubenswrapper[4904]: I1205 20:30:18.294864 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6qcr\" (UniqueName: \"kubernetes.io/projected/c241183e-3f0b-4dc2-88d2-ea990c6dc71d-kube-api-access-m6qcr\") pod \"prometheus-metric-storage-0\" (UID: \"c241183e-3f0b-4dc2-88d2-ea990c6dc71d\") " pod="openstack/prometheus-metric-storage-0" Dec 05 20:30:18 crc kubenswrapper[4904]: I1205 20:30:18.313829 4904 
csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 05 20:30:18 crc kubenswrapper[4904]: I1205 20:30:18.313875 4904 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b90305fc-6b85-4b0e-958e-59ae1d530558\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b90305fc-6b85-4b0e-958e-59ae1d530558\") pod \"prometheus-metric-storage-0\" (UID: \"c241183e-3f0b-4dc2-88d2-ea990c6dc71d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4c2522b7d526507cd9f6194376dadc5aee47822a6206b438630738242eaba537/globalmount\"" pod="openstack/prometheus-metric-storage-0" Dec 05 20:30:18 crc kubenswrapper[4904]: I1205 20:30:18.392783 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b90305fc-6b85-4b0e-958e-59ae1d530558\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b90305fc-6b85-4b0e-958e-59ae1d530558\") pod \"prometheus-metric-storage-0\" (UID: \"c241183e-3f0b-4dc2-88d2-ea990c6dc71d\") " pod="openstack/prometheus-metric-storage-0" Dec 05 20:30:18 crc kubenswrapper[4904]: I1205 20:30:18.664643 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.135653 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-hlxgq"] Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.136919 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hlxgq" Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.139647 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.139689 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-kkzjt" Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.141691 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.143031 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-h7djt"] Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.144868 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-h7djt" Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.155502 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hlxgq"] Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.155658 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-h7djt"] Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.194667 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/41ef8df6-e0e1-45a5-954d-10ce99fa26de-var-log-ovn\") pod \"ovn-controller-hlxgq\" (UID: \"41ef8df6-e0e1-45a5-954d-10ce99fa26de\") " pod="openstack/ovn-controller-hlxgq" Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.194715 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/41ef8df6-e0e1-45a5-954d-10ce99fa26de-ovn-controller-tls-certs\") pod \"ovn-controller-hlxgq\" (UID: \"41ef8df6-e0e1-45a5-954d-10ce99fa26de\") " pod="openstack/ovn-controller-hlxgq" Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.194796 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/41ef8df6-e0e1-45a5-954d-10ce99fa26de-var-run\") pod \"ovn-controller-hlxgq\" (UID: \"41ef8df6-e0e1-45a5-954d-10ce99fa26de\") " pod="openstack/ovn-controller-hlxgq" Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.194812 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/41ef8df6-e0e1-45a5-954d-10ce99fa26de-var-run-ovn\") pod \"ovn-controller-hlxgq\" (UID: \"41ef8df6-e0e1-45a5-954d-10ce99fa26de\") " pod="openstack/ovn-controller-hlxgq" Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.194835 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wdgr\" (UniqueName: \"kubernetes.io/projected/41ef8df6-e0e1-45a5-954d-10ce99fa26de-kube-api-access-6wdgr\") pod \"ovn-controller-hlxgq\" (UID: \"41ef8df6-e0e1-45a5-954d-10ce99fa26de\") " pod="openstack/ovn-controller-hlxgq" Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.194853 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41ef8df6-e0e1-45a5-954d-10ce99fa26de-combined-ca-bundle\") pod \"ovn-controller-hlxgq\" (UID: \"41ef8df6-e0e1-45a5-954d-10ce99fa26de\") " pod="openstack/ovn-controller-hlxgq" Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.194868 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/41ef8df6-e0e1-45a5-954d-10ce99fa26de-scripts\") pod \"ovn-controller-hlxgq\" (UID: \"41ef8df6-e0e1-45a5-954d-10ce99fa26de\") " pod="openstack/ovn-controller-hlxgq" Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.296747 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/41ef8df6-e0e1-45a5-954d-10ce99fa26de-ovn-controller-tls-certs\") pod \"ovn-controller-hlxgq\" (UID: \"41ef8df6-e0e1-45a5-954d-10ce99fa26de\") " pod="openstack/ovn-controller-hlxgq" Dec 05 20:30:20 crc kubenswrapper[4904]: 
I1205 20:30:20.297122 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3db65f91-e650-49d5-b372-cabc44efff3f-var-log\") pod \"ovn-controller-ovs-h7djt\" (UID: \"3db65f91-e650-49d5-b372-cabc44efff3f\") " pod="openstack/ovn-controller-ovs-h7djt" Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.297237 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3db65f91-e650-49d5-b372-cabc44efff3f-var-run\") pod \"ovn-controller-ovs-h7djt\" (UID: \"3db65f91-e650-49d5-b372-cabc44efff3f\") " pod="openstack/ovn-controller-ovs-h7djt" Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.297275 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/3db65f91-e650-49d5-b372-cabc44efff3f-var-lib\") pod \"ovn-controller-ovs-h7djt\" (UID: \"3db65f91-e650-49d5-b372-cabc44efff3f\") " pod="openstack/ovn-controller-ovs-h7djt" Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.297295 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3db65f91-e650-49d5-b372-cabc44efff3f-scripts\") pod \"ovn-controller-ovs-h7djt\" (UID: \"3db65f91-e650-49d5-b372-cabc44efff3f\") " pod="openstack/ovn-controller-ovs-h7djt" Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.297369 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq7sr\" (UniqueName: \"kubernetes.io/projected/3db65f91-e650-49d5-b372-cabc44efff3f-kube-api-access-qq7sr\") pod \"ovn-controller-ovs-h7djt\" (UID: \"3db65f91-e650-49d5-b372-cabc44efff3f\") " pod="openstack/ovn-controller-ovs-h7djt" Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.297391 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/41ef8df6-e0e1-45a5-954d-10ce99fa26de-var-run\") pod \"ovn-controller-hlxgq\" (UID: \"41ef8df6-e0e1-45a5-954d-10ce99fa26de\") " pod="openstack/ovn-controller-hlxgq" Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.297408 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/41ef8df6-e0e1-45a5-954d-10ce99fa26de-var-run-ovn\") pod \"ovn-controller-hlxgq\" (UID: \"41ef8df6-e0e1-45a5-954d-10ce99fa26de\") " pod="openstack/ovn-controller-hlxgq" Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.297453 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/3db65f91-e650-49d5-b372-cabc44efff3f-etc-ovs\") pod \"ovn-controller-ovs-h7djt\" (UID: \"3db65f91-e650-49d5-b372-cabc44efff3f\") " pod="openstack/ovn-controller-ovs-h7djt" Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.297471 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wdgr\" (UniqueName: \"kubernetes.io/projected/41ef8df6-e0e1-45a5-954d-10ce99fa26de-kube-api-access-6wdgr\") pod \"ovn-controller-hlxgq\" (UID: \"41ef8df6-e0e1-45a5-954d-10ce99fa26de\") " pod="openstack/ovn-controller-hlxgq" Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.297536 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41ef8df6-e0e1-45a5-954d-10ce99fa26de-combined-ca-bundle\") pod \"ovn-controller-hlxgq\" (UID: \"41ef8df6-e0e1-45a5-954d-10ce99fa26de\") " pod="openstack/ovn-controller-hlxgq" Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.297554 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/41ef8df6-e0e1-45a5-954d-10ce99fa26de-scripts\") pod \"ovn-controller-hlxgq\" (UID: \"41ef8df6-e0e1-45a5-954d-10ce99fa26de\") " pod="openstack/ovn-controller-hlxgq" Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.297601 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/41ef8df6-e0e1-45a5-954d-10ce99fa26de-var-log-ovn\") pod \"ovn-controller-hlxgq\" (UID: \"41ef8df6-e0e1-45a5-954d-10ce99fa26de\") " pod="openstack/ovn-controller-hlxgq" Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.298109 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/41ef8df6-e0e1-45a5-954d-10ce99fa26de-var-run\") pod \"ovn-controller-hlxgq\" (UID: \"41ef8df6-e0e1-45a5-954d-10ce99fa26de\") " pod="openstack/ovn-controller-hlxgq" Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.298708 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/41ef8df6-e0e1-45a5-954d-10ce99fa26de-var-log-ovn\") pod \"ovn-controller-hlxgq\" (UID: \"41ef8df6-e0e1-45a5-954d-10ce99fa26de\") " pod="openstack/ovn-controller-hlxgq" Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.299478 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/41ef8df6-e0e1-45a5-954d-10ce99fa26de-var-run-ovn\") pod \"ovn-controller-hlxgq\" (UID: \"41ef8df6-e0e1-45a5-954d-10ce99fa26de\") " pod="openstack/ovn-controller-hlxgq" Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.306980 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/41ef8df6-e0e1-45a5-954d-10ce99fa26de-ovn-controller-tls-certs\") pod \"ovn-controller-hlxgq\" (UID: \"41ef8df6-e0e1-45a5-954d-10ce99fa26de\") " pod="openstack/ovn-controller-hlxgq" Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.307323 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41ef8df6-e0e1-45a5-954d-10ce99fa26de-combined-ca-bundle\") pod \"ovn-controller-hlxgq\" (UID: \"41ef8df6-e0e1-45a5-954d-10ce99fa26de\") " pod="openstack/ovn-controller-hlxgq" Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.321015 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/41ef8df6-e0e1-45a5-954d-10ce99fa26de-scripts\") pod \"ovn-controller-hlxgq\" (UID: \"41ef8df6-e0e1-45a5-954d-10ce99fa26de\") " pod="openstack/ovn-controller-hlxgq" Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.325920 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wdgr\" (UniqueName: \"kubernetes.io/projected/41ef8df6-e0e1-45a5-954d-10ce99fa26de-kube-api-access-6wdgr\") pod \"ovn-controller-hlxgq\" (UID: \"41ef8df6-e0e1-45a5-954d-10ce99fa26de\") " pod="openstack/ovn-controller-hlxgq" Dec 05 20:30:20 crc 
kubenswrapper[4904]: I1205 20:30:20.401888 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/3db65f91-e650-49d5-b372-cabc44efff3f-var-lib\") pod \"ovn-controller-ovs-h7djt\" (UID: \"3db65f91-e650-49d5-b372-cabc44efff3f\") " pod="openstack/ovn-controller-ovs-h7djt" Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.401927 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3db65f91-e650-49d5-b372-cabc44efff3f-scripts\") pod \"ovn-controller-ovs-h7djt\" (UID: \"3db65f91-e650-49d5-b372-cabc44efff3f\") " pod="openstack/ovn-controller-ovs-h7djt" Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.401999 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq7sr\" (UniqueName: \"kubernetes.io/projected/3db65f91-e650-49d5-b372-cabc44efff3f-kube-api-access-qq7sr\") pod \"ovn-controller-ovs-h7djt\" (UID: \"3db65f91-e650-49d5-b372-cabc44efff3f\") " pod="openstack/ovn-controller-ovs-h7djt" Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.402050 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/3db65f91-e650-49d5-b372-cabc44efff3f-etc-ovs\") pod \"ovn-controller-ovs-h7djt\" (UID: \"3db65f91-e650-49d5-b372-cabc44efff3f\") " pod="openstack/ovn-controller-ovs-h7djt" Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.402203 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/3db65f91-e650-49d5-b372-cabc44efff3f-var-lib\") pod \"ovn-controller-ovs-h7djt\" (UID: \"3db65f91-e650-49d5-b372-cabc44efff3f\") " pod="openstack/ovn-controller-ovs-h7djt" Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.402233 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3db65f91-e650-49d5-b372-cabc44efff3f-var-log\") pod \"ovn-controller-ovs-h7djt\" (UID: \"3db65f91-e650-49d5-b372-cabc44efff3f\") " pod="openstack/ovn-controller-ovs-h7djt" Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.402294 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3db65f91-e650-49d5-b372-cabc44efff3f-var-run\") pod \"ovn-controller-ovs-h7djt\" (UID: \"3db65f91-e650-49d5-b372-cabc44efff3f\") " pod="openstack/ovn-controller-ovs-h7djt" Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.402438 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3db65f91-e650-49d5-b372-cabc44efff3f-var-run\") pod \"ovn-controller-ovs-h7djt\" (UID: \"3db65f91-e650-49d5-b372-cabc44efff3f\") " pod="openstack/ovn-controller-ovs-h7djt" Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.402822 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/3db65f91-e650-49d5-b372-cabc44efff3f-etc-ovs\") pod \"ovn-controller-ovs-h7djt\" (UID: \"3db65f91-e650-49d5-b372-cabc44efff3f\") " pod="openstack/ovn-controller-ovs-h7djt" Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.402955 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3db65f91-e650-49d5-b372-cabc44efff3f-var-log\") pod \"ovn-controller-ovs-h7djt\" 
(UID: \"3db65f91-e650-49d5-b372-cabc44efff3f\") " pod="openstack/ovn-controller-ovs-h7djt" Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.407162 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3db65f91-e650-49d5-b372-cabc44efff3f-scripts\") pod \"ovn-controller-ovs-h7djt\" (UID: \"3db65f91-e650-49d5-b372-cabc44efff3f\") " pod="openstack/ovn-controller-ovs-h7djt" Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.419479 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq7sr\" (UniqueName: \"kubernetes.io/projected/3db65f91-e650-49d5-b372-cabc44efff3f-kube-api-access-qq7sr\") pod \"ovn-controller-ovs-h7djt\" (UID: \"3db65f91-e650-49d5-b372-cabc44efff3f\") " pod="openstack/ovn-controller-ovs-h7djt" Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.489332 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hlxgq" Dec 05 20:30:20 crc kubenswrapper[4904]: I1205 20:30:20.498544 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-h7djt" Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.322964 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.325995 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.335251 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.335436 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.335600 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-d9hpd" Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.335771 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.336231 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.336433 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.454976 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9f105e3-0b5a-435f-bc00-fcfd7eceaafd-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c9f105e3-0b5a-435f-bc00-fcfd7eceaafd\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.455081 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c9f105e3-0b5a-435f-bc00-fcfd7eceaafd-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c9f105e3-0b5a-435f-bc00-fcfd7eceaafd\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.455129 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c9f105e3-0b5a-435f-bc00-fcfd7eceaafd-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c9f105e3-0b5a-435f-bc00-fcfd7eceaafd\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.455155 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-882lx\" (UniqueName: \"kubernetes.io/projected/c9f105e3-0b5a-435f-bc00-fcfd7eceaafd-kube-api-access-882lx\") pod \"ovsdbserver-sb-0\" (UID: \"c9f105e3-0b5a-435f-bc00-fcfd7eceaafd\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.455344 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9f105e3-0b5a-435f-bc00-fcfd7eceaafd-config\") pod \"ovsdbserver-sb-0\" (UID: \"c9f105e3-0b5a-435f-bc00-fcfd7eceaafd\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.455392 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9f105e3-0b5a-435f-bc00-fcfd7eceaafd-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c9f105e3-0b5a-435f-bc00-fcfd7eceaafd\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.455460 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"c9f105e3-0b5a-435f-bc00-fcfd7eceaafd\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.455536 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c9f105e3-0b5a-435f-bc00-fcfd7eceaafd-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c9f105e3-0b5a-435f-bc00-fcfd7eceaafd\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.510136 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.511449 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.516291 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-ft777" Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.516354 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.516446 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.516442 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.532521 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.556537 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9f105e3-0b5a-435f-bc00-fcfd7eceaafd-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c9f105e3-0b5a-435f-bc00-fcfd7eceaafd\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.556597 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c9f105e3-0b5a-435f-bc00-fcfd7eceaafd-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c9f105e3-0b5a-435f-bc00-fcfd7eceaafd\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.556630 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9f105e3-0b5a-435f-bc00-fcfd7eceaafd-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c9f105e3-0b5a-435f-bc00-fcfd7eceaafd\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.556649 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-882lx\" (UniqueName: \"kubernetes.io/projected/c9f105e3-0b5a-435f-bc00-fcfd7eceaafd-kube-api-access-882lx\") pod \"ovsdbserver-sb-0\" (UID: \"c9f105e3-0b5a-435f-bc00-fcfd7eceaafd\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.556690 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9f105e3-0b5a-435f-bc00-fcfd7eceaafd-config\") pod \"ovsdbserver-sb-0\" (UID: \"c9f105e3-0b5a-435f-bc00-fcfd7eceaafd\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.556705 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9f105e3-0b5a-435f-bc00-fcfd7eceaafd-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c9f105e3-0b5a-435f-bc00-fcfd7eceaafd\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.556729 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"c9f105e3-0b5a-435f-bc00-fcfd7eceaafd\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.556756 4904 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c9f105e3-0b5a-435f-bc00-fcfd7eceaafd-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c9f105e3-0b5a-435f-bc00-fcfd7eceaafd\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.557956 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c9f105e3-0b5a-435f-bc00-fcfd7eceaafd-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c9f105e3-0b5a-435f-bc00-fcfd7eceaafd\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.558212 4904 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"c9f105e3-0b5a-435f-bc00-fcfd7eceaafd\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/ovsdbserver-sb-0" Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.558496 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c9f105e3-0b5a-435f-bc00-fcfd7eceaafd-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c9f105e3-0b5a-435f-bc00-fcfd7eceaafd\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.563500 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9f105e3-0b5a-435f-bc00-fcfd7eceaafd-config\") pod \"ovsdbserver-sb-0\" (UID: \"c9f105e3-0b5a-435f-bc00-fcfd7eceaafd\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.566486 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9f105e3-0b5a-435f-bc00-fcfd7eceaafd-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c9f105e3-0b5a-435f-bc00-fcfd7eceaafd\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.566571 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9f105e3-0b5a-435f-bc00-fcfd7eceaafd-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c9f105e3-0b5a-435f-bc00-fcfd7eceaafd\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.576721 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9f105e3-0b5a-435f-bc00-fcfd7eceaafd-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c9f105e3-0b5a-435f-bc00-fcfd7eceaafd\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.601991 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-882lx\" (UniqueName: \"kubernetes.io/projected/c9f105e3-0b5a-435f-bc00-fcfd7eceaafd-kube-api-access-882lx\") pod \"ovsdbserver-sb-0\" (UID: \"c9f105e3-0b5a-435f-bc00-fcfd7eceaafd\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.617891 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"c9f105e3-0b5a-435f-bc00-fcfd7eceaafd\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:30:23 crc kubenswrapper[4904]: 
I1205 20:30:23.657967 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0\") " pod="openstack/ovsdbserver-nb-0"
Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.658011 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0\") " pod="openstack/ovsdbserver-nb-0"
Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.658031 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0-config\") pod \"ovsdbserver-nb-0\" (UID: \"9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0\") " pod="openstack/ovsdbserver-nb-0"
Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.658052 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0\") " pod="openstack/ovsdbserver-nb-0"
Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.658081 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0\") " pod="openstack/ovsdbserver-nb-0"
Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.658109 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0\") " pod="openstack/ovsdbserver-nb-0"
Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.658139 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5q79\" (UniqueName: \"kubernetes.io/projected/9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0-kube-api-access-c5q79\") pod \"ovsdbserver-nb-0\" (UID: \"9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0\") " pod="openstack/ovsdbserver-nb-0"
Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.658156 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0\") " pod="openstack/ovsdbserver-nb-0"
Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.670029 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.759283 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5q79\" (UniqueName: \"kubernetes.io/projected/9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0-kube-api-access-c5q79\") pod \"ovsdbserver-nb-0\" (UID: \"9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0\") " pod="openstack/ovsdbserver-nb-0"
Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.759332 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0\") " pod="openstack/ovsdbserver-nb-0"
Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.759455 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0\") " pod="openstack/ovsdbserver-nb-0"
Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.759488 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0\") " pod="openstack/ovsdbserver-nb-0"
Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.759510 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0-config\") pod \"ovsdbserver-nb-0\" (UID: \"9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0\") " pod="openstack/ovsdbserver-nb-0"
Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.759534 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0\") " pod="openstack/ovsdbserver-nb-0"
Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.759558 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0\") " pod="openstack/ovsdbserver-nb-0"
Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.759596 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0\") " pod="openstack/ovsdbserver-nb-0"
Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.760511 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0-config\") pod \"ovsdbserver-nb-0\" (UID: \"9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0\") " pod="openstack/ovsdbserver-nb-0"
Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.760696 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0\") " pod="openstack/ovsdbserver-nb-0"
Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.760817 4904 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-nb-0"
Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.762329 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0\") " pod="openstack/ovsdbserver-nb-0"
Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.763112 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0\") " pod="openstack/ovsdbserver-nb-0"
Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.764487 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0\") " pod="openstack/ovsdbserver-nb-0"
Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.773305 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0\") " pod="openstack/ovsdbserver-nb-0"
Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.776549 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5q79\" (UniqueName: \"kubernetes.io/projected/9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0-kube-api-access-c5q79\") pod \"ovsdbserver-nb-0\" (UID: \"9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0\") " pod="openstack/ovsdbserver-nb-0"
Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.787714 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0\") " pod="openstack/ovsdbserver-nb-0"
Dec 05 20:30:23 crc kubenswrapper[4904]: I1205 20:30:23.867736 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Dec 05 20:30:41 crc kubenswrapper[4904]: E1205 20:30:41.809162 4904 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.9:5001/podified-master-centos10/openstack-rabbitmq:watcher_latest"
Dec 05 20:30:41 crc kubenswrapper[4904]: E1205 20:30:41.809751 4904 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.9:5001/podified-master-centos10/openstack-rabbitmq:watcher_latest"
Dec 05 20:30:41 crc kubenswrapper[4904]: E1205 20:30:41.809923 4904 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:38.102.83.9:5001/podified-master-centos10/openstack-rabbitmq:watcher_latest,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cbsl2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(921f0ddc-4d15-4bfa-9560-7a01eaa3461f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 05 20:30:41 crc kubenswrapper[4904]: E1205 20:30:41.811619 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="921f0ddc-4d15-4bfa-9560-7a01eaa3461f"
Dec 05 20:30:42 crc kubenswrapper[4904]: E1205 20:30:42.496853 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.9:5001/podified-master-centos10/openstack-rabbitmq:watcher_latest\\\"\"" pod="openstack/rabbitmq-server-0" podUID="921f0ddc-4d15-4bfa-9560-7a01eaa3461f"
Dec 05 20:30:45 crc kubenswrapper[4904]: E1205 20:30:45.079534 4904 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.9:5001/podified-master-centos10/openstack-mariadb:watcher_latest"
Dec 05 20:30:45 crc kubenswrapper[4904]: E1205 20:30:45.079882 4904 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.9:5001/podified-master-centos10/openstack-mariadb:watcher_latest"
Dec 05 20:30:45 crc kubenswrapper[4904]: E1205 20:30:45.080009 4904 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:38.102.83.9:5001/podified-master-centos10/openstack-mariadb:watcher_latest,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2jn6p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(e08506d4-1ca7-4932-b73f-21020cb20578): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 05 20:30:45 crc kubenswrapper[4904]: E1205 20:30:45.081442 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="e08506d4-1ca7-4932-b73f-21020cb20578"
Dec 05 20:30:45 crc kubenswrapper[4904]: E1205 20:30:45.110030 4904 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.9:5001/podified-master-centos10/openstack-rabbitmq:watcher_latest"
Dec 05 20:30:45 crc kubenswrapper[4904]: E1205 20:30:45.110104 4904 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.9:5001/podified-master-centos10/openstack-rabbitmq:watcher_latest"
Dec 05 20:30:45 crc kubenswrapper[4904]: E1205 20:30:45.110233 4904 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:38.102.83.9:5001/podified-master-centos10/openstack-rabbitmq:watcher_latest,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-269gx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(0ad24986-23a3-4010-8dcf-6778339691c8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 05 20:30:45 crc kubenswrapper[4904]: E1205 20:30:45.111585 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="0ad24986-23a3-4010-8dcf-6778339691c8"
Dec 05 20:30:45 crc kubenswrapper[4904]: E1205 20:30:45.135126 4904 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.9:5001/podified-master-centos10/openstack-neutron-server:watcher_latest"
Dec 05 20:30:45 crc kubenswrapper[4904]: E1205 20:30:45.135187 4904 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.9:5001/podified-master-centos10/openstack-neutron-server:watcher_latest"
Dec 05 20:30:45 crc kubenswrapper[4904]: E1205 20:30:45.135323 4904 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.9:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cbz5d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6bccbb886f-2pd5r_openstack(d51f02d6-e4c4-4357-8d5d-6478f452a4e1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 05 20:30:45 crc kubenswrapper[4904]: E1205 20:30:45.136763 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-6bccbb886f-2pd5r" podUID="d51f02d6-e4c4-4357-8d5d-6478f452a4e1"
Dec 05 20:30:45 crc kubenswrapper[4904]: E1205 20:30:45.159117 4904 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.9:5001/podified-master-centos10/openstack-neutron-server:watcher_latest"
Dec 05 20:30:45 crc kubenswrapper[4904]: E1205 20:30:45.159169 4904 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.9:5001/podified-master-centos10/openstack-neutron-server:watcher_latest"
Dec 05 20:30:45 crc kubenswrapper[4904]: E1205 20:30:45.159268 4904 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.9:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qngrb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-75f87779c-99qqw_openstack(8c4ba532-e24b-42d1-8084-ff87bf54e167): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 05 20:30:45 crc kubenswrapper[4904]: E1205 20:30:45.160520 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-75f87779c-99qqw" podUID="8c4ba532-e24b-42d1-8084-ff87bf54e167"
Dec 05 20:30:45 crc kubenswrapper[4904]: E1205 20:30:45.187159 4904 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.9:5001/podified-master-centos10/openstack-neutron-server:watcher_latest"
Dec 05 20:30:45 crc kubenswrapper[4904]: E1205 20:30:45.187210 4904 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.9:5001/podified-master-centos10/openstack-neutron-server:watcher_latest"
Dec 05 20:30:45 crc kubenswrapper[4904]: E1205 20:30:45.187306 4904 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.9:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m7cqb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-559648544f-dznst_openstack(9c395630-7b9e-4495-8357-f8cf91879288): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 05 20:30:45 crc kubenswrapper[4904]: E1205 20:30:45.190525 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-559648544f-dznst" podUID="9c395630-7b9e-4495-8357-f8cf91879288"
Dec 05 20:30:45 crc kubenswrapper[4904]: E1205 20:30:45.193302 4904 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.9:5001/podified-master-centos10/openstack-rabbitmq:watcher_latest"
Dec 05 20:30:45 crc kubenswrapper[4904]: E1205 20:30:45.193356 4904 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.9:5001/podified-master-centos10/openstack-rabbitmq:watcher_latest"
Dec 05 20:30:45 crc kubenswrapper[4904]: E1205 20:30:45.193499 4904 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:38.102.83.9:5001/podified-master-centos10/openstack-rabbitmq:watcher_latest,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dcklp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-notifications-server-0_openstack(0caaad94-d02e-43da-bf3b-087a5ec8d2f8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 05 20:30:45 crc kubenswrapper[4904]: E1205 20:30:45.193731 4904 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.9:5001/podified-master-centos10/openstack-neutron-server:watcher_latest"
Dec 05 20:30:45 crc kubenswrapper[4904]: E1205 20:30:45.193749 4904 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.9:5001/podified-master-centos10/openstack-neutron-server:watcher_latest"
Dec 05 20:30:45 crc kubenswrapper[4904]: E1205 20:30:45.193825 4904 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.9:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n7mm2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-586ffd88f7-d6cls_openstack(11cdf31c-28af-478e-8230-dc069aa40be5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 05 20:30:45 crc kubenswrapper[4904]: E1205 20:30:45.196085 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-586ffd88f7-d6cls" podUID="11cdf31c-28af-478e-8230-dc069aa40be5"
Dec 05 20:30:45 crc kubenswrapper[4904]: E1205 20:30:45.196165 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-notifications-server-0" podUID="0caaad94-d02e-43da-bf3b-087a5ec8d2f8"
Dec 05 20:30:45 crc kubenswrapper[4904]: I1205 20:30:45.521873 4904 generic.go:334] "Generic (PLEG): container finished" podID="a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca" containerID="5ac5f16656517316de02f88acea225273b85b2902b89b16f7c05a75a542c2c57" exitCode=0
Dec 05 20:30:45 crc kubenswrapper[4904]: I1205 20:30:45.523490 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d9656c78f-lhs4v" event={"ID":"a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca","Type":"ContainerDied","Data":"5ac5f16656517316de02f88acea225273b85b2902b89b16f7c05a75a542c2c57"}
Dec 05 20:30:45 crc kubenswrapper[4904]: E1205 20:30:45.524662 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.9:5001/podified-master-centos10/openstack-rabbitmq:watcher_latest\\\"\"" pod="openstack/rabbitmq-notifications-server-0" podUID="0caaad94-d02e-43da-bf3b-087a5ec8d2f8"
Dec 05 20:30:45 crc kubenswrapper[4904]: E1205 20:30:45.524899 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.9:5001/podified-master-centos10/openstack-mariadb:watcher_latest\\\"\"" pod="openstack/openstack-galera-0" podUID="e08506d4-1ca7-4932-b73f-21020cb20578"
Dec 05 20:30:45 crc kubenswrapper[4904]: E1205 20:30:45.529420 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.9:5001/podified-master-centos10/openstack-rabbitmq:watcher_latest\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="0ad24986-23a3-4010-8dcf-6778339691c8"
Dec 05 20:30:45 crc kubenswrapper[4904]: I1205 20:30:45.650244 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hlxgq"]
Dec 05 20:30:45 crc kubenswrapper[4904]: I1205 20:30:45.657905 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Dec 05 20:30:45 crc kubenswrapper[4904]: W1205 20:30:45.686144 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3437a493_1ffe_49dc_a789_3451b2f87204.slice/crio-0f8c9ac7ddfba29a712a74fd21d486f95193f37493583677d4f2c9eec8ed9243 WatchSource:0}: Error finding container 0f8c9ac7ddfba29a712a74fd21d486f95193f37493583677d4f2c9eec8ed9243: Status 404 returned error can't find the container with id 0f8c9ac7ddfba29a712a74fd21d486f95193f37493583677d4f2c9eec8ed9243
Dec 05 20:30:45 crc kubenswrapper[4904]: E1205 20:30:45.800688 4904 log.go:32] "CreateContainer in sandbox from runtime service failed" err=<
Dec 05 20:30:45 crc kubenswrapper[4904]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/d51f02d6-e4c4-4357-8d5d-6478f452a4e1/volume-subpaths/dns-svc/init/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory
Dec 05 20:30:45 crc kubenswrapper[4904]: > podSandboxID="2e85a43bf6af16e73c78b877d07c2abc7d99cf1c71cb815bf6d1b21c2e57321a"
Dec 05 20:30:45 crc kubenswrapper[4904]: E1205 20:30:45.800848 4904 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Dec 05 20:30:45 crc kubenswrapper[4904]: init container &Container{Name:init,Image:38.102.83.9:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cbz5d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6bccbb886f-2pd5r_openstack(d51f02d6-e4c4-4357-8d5d-6478f452a4e1): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/d51f02d6-e4c4-4357-8d5d-6478f452a4e1/volume-subpaths/dns-svc/init/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory
Dec 05 20:30:45 crc kubenswrapper[4904]: > logger="UnhandledError"
Dec 05 20:30:45 crc kubenswrapper[4904]: E1205 20:30:45.802086 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/d51f02d6-e4c4-4357-8d5d-6478f452a4e1/volume-subpaths/dns-svc/init/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-6bccbb886f-2pd5r" podUID="d51f02d6-e4c4-4357-8d5d-6478f452a4e1"
Dec 05 20:30:45 crc kubenswrapper[4904]: I1205 20:30:45.943423 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586ffd88f7-d6cls"
Dec 05 20:30:45 crc kubenswrapper[4904]: I1205 20:30:45.954283 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-559648544f-dznst"
Dec 05 20:30:45 crc kubenswrapper[4904]: I1205 20:30:45.987678 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75f87779c-99qqw"
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.032118 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 05 20:30:46 crc kubenswrapper[4904]: W1205 20:30:46.040559 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1893f1e0_90ea_4bd5_9275_c0266042485d.slice/crio-816934fa82933f2e9a09a6f7f889cb675265239e0364d97cb696e05bf7de0c8d WatchSource:0}: Error finding container 816934fa82933f2e9a09a6f7f889cb675265239e0364d97cb696e05bf7de0c8d: Status 404 returned error can't find the container with id 816934fa82933f2e9a09a6f7f889cb675265239e0364d97cb696e05bf7de0c8d
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.055248 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.069630 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.069955 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qngrb\" (UniqueName: \"kubernetes.io/projected/8c4ba532-e24b-42d1-8084-ff87bf54e167-kube-api-access-qngrb\") pod \"8c4ba532-e24b-42d1-8084-ff87bf54e167\" (UID: \"8c4ba532-e24b-42d1-8084-ff87bf54e167\") "
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.070031 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7cqb\" (UniqueName: \"kubernetes.io/projected/9c395630-7b9e-4495-8357-f8cf91879288-kube-api-access-m7cqb\") pod \"9c395630-7b9e-4495-8357-f8cf91879288\" (UID: \"9c395630-7b9e-4495-8357-f8cf91879288\") "
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.070124 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7mm2\" (UniqueName: \"kubernetes.io/projected/11cdf31c-28af-478e-8230-dc069aa40be5-kube-api-access-n7mm2\") pod \"11cdf31c-28af-478e-8230-dc069aa40be5\" (UID: \"11cdf31c-28af-478e-8230-dc069aa40be5\") "
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.070152 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c395630-7b9e-4495-8357-f8cf91879288-dns-svc\") pod \"9c395630-7b9e-4495-8357-f8cf91879288\" (UID: \"9c395630-7b9e-4495-8357-f8cf91879288\") "
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.070190 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c395630-7b9e-4495-8357-f8cf91879288-config\") pod \"9c395630-7b9e-4495-8357-f8cf91879288\" (UID: \"9c395630-7b9e-4495-8357-f8cf91879288\") "
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.070231 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11cdf31c-28af-478e-8230-dc069aa40be5-config\") pod \"11cdf31c-28af-478e-8230-dc069aa40be5\" (UID: \"11cdf31c-28af-478e-8230-dc069aa40be5\") "
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.070269 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c4ba532-e24b-42d1-8084-ff87bf54e167-config\") pod \"8c4ba532-e24b-42d1-8084-ff87bf54e167\" (UID: \"8c4ba532-e24b-42d1-8084-ff87bf54e167\") "
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.070286 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11cdf31c-28af-478e-8230-dc069aa40be5-dns-svc\") pod \"11cdf31c-28af-478e-8230-dc069aa40be5\" (UID: \"11cdf31c-28af-478e-8230-dc069aa40be5\") "
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.070947 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11cdf31c-28af-478e-8230-dc069aa40be5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "11cdf31c-28af-478e-8230-dc069aa40be5" (UID: "11cdf31c-28af-478e-8230-dc069aa40be5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.071323 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c395630-7b9e-4495-8357-f8cf91879288-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9c395630-7b9e-4495-8357-f8cf91879288" (UID: "9c395630-7b9e-4495-8357-f8cf91879288"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.071335 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c395630-7b9e-4495-8357-f8cf91879288-config" (OuterVolumeSpecName: "config") pod "9c395630-7b9e-4495-8357-f8cf91879288" (UID: "9c395630-7b9e-4495-8357-f8cf91879288"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.071689 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c4ba532-e24b-42d1-8084-ff87bf54e167-config" (OuterVolumeSpecName: "config") pod "8c4ba532-e24b-42d1-8084-ff87bf54e167" (UID: "8c4ba532-e24b-42d1-8084-ff87bf54e167"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.071782 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11cdf31c-28af-478e-8230-dc069aa40be5-config" (OuterVolumeSpecName: "config") pod "11cdf31c-28af-478e-8230-dc069aa40be5" (UID: "11cdf31c-28af-478e-8230-dc069aa40be5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.075273 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11cdf31c-28af-478e-8230-dc069aa40be5-kube-api-access-n7mm2" (OuterVolumeSpecName: "kube-api-access-n7mm2") pod "11cdf31c-28af-478e-8230-dc069aa40be5" (UID: "11cdf31c-28af-478e-8230-dc069aa40be5"). InnerVolumeSpecName "kube-api-access-n7mm2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.075616 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c395630-7b9e-4495-8357-f8cf91879288-kube-api-access-m7cqb" (OuterVolumeSpecName: "kube-api-access-m7cqb") pod "9c395630-7b9e-4495-8357-f8cf91879288" (UID: "9c395630-7b9e-4495-8357-f8cf91879288"). InnerVolumeSpecName "kube-api-access-m7cqb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.075783 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c4ba532-e24b-42d1-8084-ff87bf54e167-kube-api-access-qngrb" (OuterVolumeSpecName: "kube-api-access-qngrb") pod "8c4ba532-e24b-42d1-8084-ff87bf54e167" (UID: "8c4ba532-e24b-42d1-8084-ff87bf54e167"). InnerVolumeSpecName "kube-api-access-qngrb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.172687 4904 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c395630-7b9e-4495-8357-f8cf91879288-config\") on node \"crc\" DevicePath \"\""
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.172994 4904 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11cdf31c-28af-478e-8230-dc069aa40be5-config\") on node \"crc\" DevicePath \"\""
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.173009 4904 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c4ba532-e24b-42d1-8084-ff87bf54e167-config\") on node \"crc\" DevicePath \"\""
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.173020 4904 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11cdf31c-28af-478e-8230-dc069aa40be5-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.173034 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qngrb\" (UniqueName: \"kubernetes.io/projected/8c4ba532-e24b-42d1-8084-ff87bf54e167-kube-api-access-qngrb\") on node \"crc\" DevicePath \"\""
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.173048 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7cqb\" (UniqueName: \"kubernetes.io/projected/9c395630-7b9e-4495-8357-f8cf91879288-kube-api-access-m7cqb\") on node \"crc\" DevicePath \"\""
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.173079 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7mm2\" (UniqueName: \"kubernetes.io/projected/11cdf31c-28af-478e-8230-dc069aa40be5-kube-api-access-n7mm2\") on node \"crc\" DevicePath \"\""
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.173093 4904 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c395630-7b9e-4495-8357-f8cf91879288-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.181294 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Dec 05 20:30:46 crc kubenswrapper[4904]: W1205 20:30:46.187435 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9f105e3_0b5a_435f_bc00_fcfd7eceaafd.slice/crio-40b3e24c2cadc72955d118d14b33114fdf1225bc93d87811910d3b0ebc3b9076 WatchSource:0}: Error finding container 40b3e24c2cadc72955d118d14b33114fdf1225bc93d87811910d3b0ebc3b9076: Status 404 returned error can't find the container with id 40b3e24c2cadc72955d118d14b33114fdf1225bc93d87811910d3b0ebc3b9076
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.262316 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-h7djt"]
Dec 05 20:30:46 crc kubenswrapper[4904]: W1205 20:30:46.270338 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3db65f91_e650_49d5_b372_cabc44efff3f.slice/crio-28186d93dd64a035e4469f12a60d53e3ee2a5927d1826c9e0702f763a870246d WatchSource:0}: Error finding container 28186d93dd64a035e4469f12a60d53e3ee2a5927d1826c9e0702f763a870246d: Status 404 returned error can't find the container with id 28186d93dd64a035e4469f12a60d53e3ee2a5927d1826c9e0702f763a870246d
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.531640 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d9656c78f-lhs4v" event={"ID":"a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca","Type":"ContainerStarted","Data":"d44842850ffe4f9dbd70ecda97155459c9466488268eceaf097897b8f9910961"}
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.531776 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d9656c78f-lhs4v"
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.533624 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"980ec67d-9dc9-4cae-8169-9890a40d65c3","Type":"ContainerStarted","Data":"5bab607596c2b40e58376d6bc4f8f9952c0ce3072f53d9960dfd46a044d273d2"}
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.535314 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586ffd88f7-d6cls" event={"ID":"11cdf31c-28af-478e-8230-dc069aa40be5","Type":"ContainerDied","Data":"69fb41eaaf1f3fa5d524d9b937a4e01cc65186f447d1f9602050b1ec8eddf111"}
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.535373 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586ffd88f7-d6cls"
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.536707 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-h7djt" event={"ID":"3db65f91-e650-49d5-b372-cabc44efff3f","Type":"ContainerStarted","Data":"28186d93dd64a035e4469f12a60d53e3ee2a5927d1826c9e0702f763a870246d"}
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.537783 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75f87779c-99qqw" event={"ID":"8c4ba532-e24b-42d1-8084-ff87bf54e167","Type":"ContainerDied","Data":"94609c9d61e0309bd13df4fbc1d436827b4529403857337be5bce29a9f711555"}
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.537799 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75f87779c-99qqw"
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.545925 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-559648544f-dznst"
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.545928 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-559648544f-dznst" event={"ID":"9c395630-7b9e-4495-8357-f8cf91879288","Type":"ContainerDied","Data":"7bcff980c0ad20a1eb7fbaa36a9672cb2fcd8c7af6b549d1175ebaeef39eed39"}
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.547605 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c241183e-3f0b-4dc2-88d2-ea990c6dc71d","Type":"ContainerStarted","Data":"425abf0d293523494d0e0a7f6d1c19228a2d8e562a71abe31a9351619b7a75f0"}
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.550188 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d9656c78f-lhs4v" podStartSLOduration=3.2593622780000002 podStartE2EDuration="37.55017928s" podCreationTimestamp="2025-12-05 20:30:09 +0000 UTC" firstStartedPulling="2025-12-05 20:30:10.915991847 +0000 UTC m=+1109.727207966" lastFinishedPulling="2025-12-05 20:30:45.206808819 +0000 UTC m=+1144.018024968" observedRunningTime="2025-12-05 20:30:46.548552855 +0000 UTC m=+1145.359768974" watchObservedRunningTime="2025-12-05 20:30:46.55017928 +0000 UTC m=+1145.361395389"
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.558564 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1893f1e0-90ea-4bd5-9275-c0266042485d","Type":"ContainerStarted","Data":"816934fa82933f2e9a09a6f7f889cb675265239e0364d97cb696e05bf7de0c8d"}
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.560289 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"3437a493-1ffe-49dc-a789-3451b2f87204","Type":"ContainerStarted","Data":"99b5ccbad11264cd6ef4ef3d6ec197311f6ab9848ef3b9d8c608c074e0ce09fe"}
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.560326 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"3437a493-1ffe-49dc-a789-3451b2f87204","Type":"ContainerStarted","Data":"0f8c9ac7ddfba29a712a74fd21d486f95193f37493583677d4f2c9eec8ed9243"}
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.561386 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"c9f105e3-0b5a-435f-bc00-fcfd7eceaafd","Type":"ContainerStarted","Data":"40b3e24c2cadc72955d118d14b33114fdf1225bc93d87811910d3b0ebc3b9076"}
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.564296 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hlxgq" event={"ID":"41ef8df6-e0e1-45a5-954d-10ce99fa26de","Type":"ContainerStarted","Data":"04a66fb0006fdcfe27ea0c8edd5d55b50c69018362fd6bf07af5408b0534378b"}
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.688050 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-586ffd88f7-d6cls"]
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.710789 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-586ffd88f7-d6cls"]
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.744115 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-559648544f-dznst"]
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.749136 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-559648544f-dznst"]
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.762575 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75f87779c-99qqw"]
Dec 05 20:30:46 crc kubenswrapper[4904]: I1205 20:30:46.771174 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75f87779c-99qqw"]
Dec 05 20:30:47 crc kubenswrapper[4904]: I1205 20:30:47.433950 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Dec 05 20:30:47 crc kubenswrapper[4904]: I1205 20:30:47.690822 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11cdf31c-28af-478e-8230-dc069aa40be5" path="/var/lib/kubelet/pods/11cdf31c-28af-478e-8230-dc069aa40be5/volumes"
Dec 05 20:30:47 crc kubenswrapper[4904]: I1205 20:30:47.691563 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c4ba532-e24b-42d1-8084-ff87bf54e167" path="/var/lib/kubelet/pods/8c4ba532-e24b-42d1-8084-ff87bf54e167/volumes"
Dec 05 20:30:47 crc kubenswrapper[4904]: I1205 20:30:47.692114 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c395630-7b9e-4495-8357-f8cf91879288" path="/var/lib/kubelet/pods/9c395630-7b9e-4495-8357-f8cf91879288/volumes"
Dec 05 20:30:48 crc kubenswrapper[4904]: W1205 20:30:48.433030 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c47c71a_aff6_4444_ba2a_f3ae27d6cbe0.slice/crio-65790759a624762985abb1252830e3437cc079ce7efb0ddb66fec68ef045883f WatchSource:0}: Error finding container 65790759a624762985abb1252830e3437cc079ce7efb0ddb66fec68ef045883f: Status 404 returned error can't find the container with id 65790759a624762985abb1252830e3437cc079ce7efb0ddb66fec68ef045883f
Dec 05 20:30:48 crc kubenswrapper[4904]: I1205 20:30:48.586714 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0","Type":"ContainerStarted","Data":"65790759a624762985abb1252830e3437cc079ce7efb0ddb66fec68ef045883f"}
Dec 05 20:30:50 crc kubenswrapper[4904]: I1205 20:30:50.261331 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d9656c78f-lhs4v"
Dec 05 20:30:50 crc kubenswrapper[4904]: I1205 20:30:50.371687 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bccbb886f-2pd5r"]
Dec 05 20:30:50 crc kubenswrapper[4904]: I1205 20:30:50.807640 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bccbb886f-2pd5r"
Dec 05 20:30:50 crc kubenswrapper[4904]: I1205 20:30:50.851466 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbz5d\" (UniqueName: \"kubernetes.io/projected/d51f02d6-e4c4-4357-8d5d-6478f452a4e1-kube-api-access-cbz5d\") pod \"d51f02d6-e4c4-4357-8d5d-6478f452a4e1\" (UID: \"d51f02d6-e4c4-4357-8d5d-6478f452a4e1\") "
Dec 05 20:30:50 crc kubenswrapper[4904]: I1205 20:30:50.851635 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d51f02d6-e4c4-4357-8d5d-6478f452a4e1-config\") pod \"d51f02d6-e4c4-4357-8d5d-6478f452a4e1\" (UID: \"d51f02d6-e4c4-4357-8d5d-6478f452a4e1\") "
Dec 05 20:30:50 crc kubenswrapper[4904]: I1205 20:30:50.851672 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d51f02d6-e4c4-4357-8d5d-6478f452a4e1-dns-svc\") pod \"d51f02d6-e4c4-4357-8d5d-6478f452a4e1\" (UID: \"d51f02d6-e4c4-4357-8d5d-6478f452a4e1\") "
Dec 05 20:30:50 crc kubenswrapper[4904]: I1205 20:30:50.857013 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d51f02d6-e4c4-4357-8d5d-6478f452a4e1-kube-api-access-cbz5d" (OuterVolumeSpecName: "kube-api-access-cbz5d") pod "d51f02d6-e4c4-4357-8d5d-6478f452a4e1" (UID: "d51f02d6-e4c4-4357-8d5d-6478f452a4e1"). InnerVolumeSpecName "kube-api-access-cbz5d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:30:50 crc kubenswrapper[4904]: I1205 20:30:50.871145 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d51f02d6-e4c4-4357-8d5d-6478f452a4e1-config" (OuterVolumeSpecName: "config") pod "d51f02d6-e4c4-4357-8d5d-6478f452a4e1" (UID: "d51f02d6-e4c4-4357-8d5d-6478f452a4e1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 20:30:50 crc kubenswrapper[4904]: I1205 20:30:50.871543 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d51f02d6-e4c4-4357-8d5d-6478f452a4e1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d51f02d6-e4c4-4357-8d5d-6478f452a4e1" (UID: "d51f02d6-e4c4-4357-8d5d-6478f452a4e1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 20:30:50 crc kubenswrapper[4904]: I1205 20:30:50.953282 4904 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d51f02d6-e4c4-4357-8d5d-6478f452a4e1-config\") on node \"crc\" DevicePath \"\""
Dec 05 20:30:50 crc kubenswrapper[4904]: I1205 20:30:50.953314 4904 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d51f02d6-e4c4-4357-8d5d-6478f452a4e1-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 05 20:30:50 crc kubenswrapper[4904]: I1205 20:30:50.953323 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbz5d\" (UniqueName: \"kubernetes.io/projected/d51f02d6-e4c4-4357-8d5d-6478f452a4e1-kube-api-access-cbz5d\") on node \"crc\" DevicePath \"\""
Dec 05 20:30:51 crc kubenswrapper[4904]: I1205 20:30:51.610787 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bccbb886f-2pd5r" event={"ID":"d51f02d6-e4c4-4357-8d5d-6478f452a4e1","Type":"ContainerDied","Data":"2e85a43bf6af16e73c78b877d07c2abc7d99cf1c71cb815bf6d1b21c2e57321a"}
Dec 05 20:30:51 crc kubenswrapper[4904]: I1205 20:30:51.610890 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bccbb886f-2pd5r"
Dec 05 20:30:51 crc kubenswrapper[4904]: I1205 20:30:51.667740 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bccbb886f-2pd5r"]
Dec 05 20:30:51 crc kubenswrapper[4904]: I1205 20:30:51.676211 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bccbb886f-2pd5r"]
Dec 05 20:30:51 crc kubenswrapper[4904]: I1205 20:30:51.697913 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d51f02d6-e4c4-4357-8d5d-6478f452a4e1" path="/var/lib/kubelet/pods/d51f02d6-e4c4-4357-8d5d-6478f452a4e1/volumes"
Dec 05 20:30:52 crc kubenswrapper[4904]: I1205 20:30:52.619845 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"c9f105e3-0b5a-435f-bc00-fcfd7eceaafd","Type":"ContainerStarted","Data":"f3fe9df8853672a4767570838cfec6ec269f5b1d7198cc27ca4479acec7505ef"}
Dec 05 20:30:52 crc kubenswrapper[4904]: I1205 20:30:52.621485 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hlxgq" event={"ID":"41ef8df6-e0e1-45a5-954d-10ce99fa26de","Type":"ContainerStarted","Data":"8f60ed407db75d089544fe473345a90228b129bfb000be6adc2f854aefc23b76"}
Dec 05 20:30:52 crc kubenswrapper[4904]: I1205 20:30:52.621639 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-hlxgq"
Dec 05 20:30:52 crc kubenswrapper[4904]: I1205 20:30:52.623465 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-h7djt" event={"ID":"3db65f91-e650-49d5-b372-cabc44efff3f","Type":"ContainerStarted","Data":"d7472deffe17c69738c27accdf6bafcb26b4ff58cbc1fe9f71dadc5b290b5529"}
Dec 05 20:30:52 crc kubenswrapper[4904]: I1205 20:30:52.624955 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0","Type":"ContainerStarted","Data":"fdbdbd0329d2c79dbeebc450d261cb40a7cf63f6415237a17de45d77fa415e83"}
Dec 05 20:30:52 crc kubenswrapper[4904]: I1205 20:30:52.626906 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0"
event={"ID":"1893f1e0-90ea-4bd5-9275-c0266042485d","Type":"ContainerStarted","Data":"25fcb1cef4e3e78341234772b01e9d5502266566c89583dc061afe3493a4eabf"} Dec 05 20:30:52 crc kubenswrapper[4904]: I1205 20:30:52.626963 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 05 20:30:52 crc kubenswrapper[4904]: I1205 20:30:52.628695 4904 generic.go:334] "Generic (PLEG): container finished" podID="3437a493-1ffe-49dc-a789-3451b2f87204" containerID="99b5ccbad11264cd6ef4ef3d6ec197311f6ab9848ef3b9d8c608c074e0ce09fe" exitCode=0 Dec 05 20:30:52 crc kubenswrapper[4904]: I1205 20:30:52.628786 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"3437a493-1ffe-49dc-a789-3451b2f87204","Type":"ContainerDied","Data":"99b5ccbad11264cd6ef4ef3d6ec197311f6ab9848ef3b9d8c608c074e0ce09fe"} Dec 05 20:30:52 crc kubenswrapper[4904]: I1205 20:30:52.630599 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"980ec67d-9dc9-4cae-8169-9890a40d65c3","Type":"ContainerStarted","Data":"c4a28b192f4591e0740cd674407445e8ca75dbd4a3f3536f49f4938b13bbb2a9"} Dec 05 20:30:52 crc kubenswrapper[4904]: I1205 20:30:52.630846 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 05 20:30:52 crc kubenswrapper[4904]: I1205 20:30:52.671295 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=30.714855975 podStartE2EDuration="36.671276389s" podCreationTimestamp="2025-12-05 20:30:16 +0000 UTC" firstStartedPulling="2025-12-05 20:30:46.042852205 +0000 UTC m=+1144.854068314" lastFinishedPulling="2025-12-05 20:30:51.999272599 +0000 UTC m=+1150.810488728" observedRunningTime="2025-12-05 20:30:52.670170459 +0000 UTC m=+1151.481386608" watchObservedRunningTime="2025-12-05 20:30:52.671276389 +0000 UTC m=+1151.482492498" Dec 05 20:30:52 crc kubenswrapper[4904]: I1205 20:30:52.671391 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-hlxgq" podStartSLOduration=27.066759597 podStartE2EDuration="32.671386882s" podCreationTimestamp="2025-12-05 20:30:20 +0000 UTC" firstStartedPulling="2025-12-05 20:30:45.664707774 +0000 UTC m=+1144.475923883" lastFinishedPulling="2025-12-05 20:30:51.269335059 +0000 UTC m=+1150.080551168" observedRunningTime="2025-12-05 20:30:52.644505237 +0000 UTC m=+1151.455721356" watchObservedRunningTime="2025-12-05 20:30:52.671386882 +0000 UTC m=+1151.482602991" Dec 05 20:30:52 crc kubenswrapper[4904]: I1205 20:30:52.722433 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=35.798490807 podStartE2EDuration="39.72241318s" podCreationTimestamp="2025-12-05 20:30:13 +0000 UTC" firstStartedPulling="2025-12-05 20:30:46.066228314 +0000 UTC m=+1144.877444423" lastFinishedPulling="2025-12-05 20:30:49.990150687 +0000 UTC m=+1148.801366796" observedRunningTime="2025-12-05 20:30:52.719298054 +0000 UTC m=+1151.530514173" watchObservedRunningTime="2025-12-05 20:30:52.72241318 +0000 UTC m=+1151.533629289" Dec 05 20:30:53 crc kubenswrapper[4904]: I1205 20:30:53.640387 4904 generic.go:334] "Generic (PLEG): container finished" podID="3db65f91-e650-49d5-b372-cabc44efff3f" containerID="d7472deffe17c69738c27accdf6bafcb26b4ff58cbc1fe9f71dadc5b290b5529" exitCode=0 Dec 05 20:30:53 crc kubenswrapper[4904]: I1205 20:30:53.640896 4904 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ovn-controller-ovs-h7djt" event={"ID":"3db65f91-e650-49d5-b372-cabc44efff3f","Type":"ContainerDied","Data":"d7472deffe17c69738c27accdf6bafcb26b4ff58cbc1fe9f71dadc5b290b5529"} Dec 05 20:30:53 crc kubenswrapper[4904]: I1205 20:30:53.644342 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"3437a493-1ffe-49dc-a789-3451b2f87204","Type":"ContainerStarted","Data":"9634f80c0fa37a33916dd6cbc015ba00f6d05b61f73f1dfcacb0348f7c69ca33"} Dec 05 20:30:53 crc kubenswrapper[4904]: I1205 20:30:53.685432 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=41.577793602 podStartE2EDuration="41.685414471s" podCreationTimestamp="2025-12-05 20:30:12 +0000 UTC" firstStartedPulling="2025-12-05 20:30:45.693087072 +0000 UTC m=+1144.504303181" lastFinishedPulling="2025-12-05 20:30:45.800707941 +0000 UTC m=+1144.611924050" observedRunningTime="2025-12-05 20:30:53.678435148 +0000 UTC m=+1152.489651267" watchObservedRunningTime="2025-12-05 20:30:53.685414471 +0000 UTC m=+1152.496630580" Dec 05 20:30:54 crc kubenswrapper[4904]: I1205 20:30:54.177935 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 05 20:30:54 crc kubenswrapper[4904]: I1205 20:30:54.178100 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 05 20:30:54 crc kubenswrapper[4904]: I1205 20:30:54.652082 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c241183e-3f0b-4dc2-88d2-ea990c6dc71d","Type":"ContainerStarted","Data":"45a22d9de21cad981079d2e7659a670ac38e664200c113b9e41923eb1c83a7b9"} Dec 05 20:30:55 crc kubenswrapper[4904]: I1205 20:30:55.660721 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0","Type":"ContainerStarted","Data":"5db13ec9a7cfd377838c2dab6c9a2e7f115e3f13c2faf3cfd83da89964a73104"} Dec 05 20:30:55 crc kubenswrapper[4904]: I1205 20:30:55.663479 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"c9f105e3-0b5a-435f-bc00-fcfd7eceaafd","Type":"ContainerStarted","Data":"064056a950a45a007e97800448dfdde8c7ab3e509afc3b79c28ae34d3024d600"} Dec 05 20:30:55 crc kubenswrapper[4904]: I1205 20:30:55.666493 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-h7djt" event={"ID":"3db65f91-e650-49d5-b372-cabc44efff3f","Type":"ContainerStarted","Data":"ba5df1bca22ad47d24baeb94ac71589dd452132b26875c61a2420ea43525350f"} Dec 05 20:30:55 crc kubenswrapper[4904]: I1205 20:30:55.666671 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-h7djt" event={"ID":"3db65f91-e650-49d5-b372-cabc44efff3f","Type":"ContainerStarted","Data":"48f11e99d4922e5dba477d4bc0d48cc2ee304174c748deb5dddb44ec19c8d629"} Dec 05 20:30:55 crc kubenswrapper[4904]: I1205 20:30:55.715656 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=27.319573946 podStartE2EDuration="33.715629618s" podCreationTimestamp="2025-12-05 20:30:22 +0000 UTC" firstStartedPulling="2025-12-05 20:30:48.434830738 +0000 UTC m=+1147.246046847" lastFinishedPulling="2025-12-05 20:30:54.83088641 +0000 UTC m=+1153.642102519" observedRunningTime="2025-12-05 20:30:55.690168921 +0000 UTC 
m=+1154.501385040" watchObservedRunningTime="2025-12-05 20:30:55.715629618 +0000 UTC m=+1154.526845767" Dec 05 20:30:55 crc kubenswrapper[4904]: I1205 20:30:55.732928 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-h7djt" podStartSLOduration=31.107727002 podStartE2EDuration="35.732908278s" podCreationTimestamp="2025-12-05 20:30:20 +0000 UTC" firstStartedPulling="2025-12-05 20:30:46.272711568 +0000 UTC m=+1145.083927677" lastFinishedPulling="2025-12-05 20:30:50.897892844 +0000 UTC m=+1149.709108953" observedRunningTime="2025-12-05 20:30:55.731501209 +0000 UTC m=+1154.542717318" watchObservedRunningTime="2025-12-05 20:30:55.732908278 +0000 UTC m=+1154.544124417" Dec 05 20:30:56 crc kubenswrapper[4904]: I1205 20:30:56.670261 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 05 20:30:56 crc kubenswrapper[4904]: I1205 20:30:56.674832 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-h7djt" Dec 05 20:30:56 crc kubenswrapper[4904]: I1205 20:30:56.674876 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-h7djt" Dec 05 20:30:56 crc kubenswrapper[4904]: I1205 20:30:56.721739 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 05 20:30:56 crc kubenswrapper[4904]: I1205 20:30:56.757281 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=26.09892699 podStartE2EDuration="34.757252953s" podCreationTimestamp="2025-12-05 20:30:22 +0000 UTC" firstStartedPulling="2025-12-05 20:30:46.18923151 +0000 UTC m=+1145.000447609" lastFinishedPulling="2025-12-05 20:30:54.847557463 +0000 UTC m=+1153.658773572" observedRunningTime="2025-12-05 20:30:55.786089625 +0000 UTC m=+1154.597305734" watchObservedRunningTime="2025-12-05 20:30:56.757252953 +0000 UTC m=+1155.568469102" Dec 05 20:30:56 crc kubenswrapper[4904]: I1205 20:30:56.869393 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 05 20:30:56 crc kubenswrapper[4904]: I1205 20:30:56.942365 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 05 20:30:57 crc kubenswrapper[4904]: E1205 20:30:57.029227 4904 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.166:40582->38.102.83.166:45757: write tcp 38.102.83.166:40582->38.102.83.166:45757: write: broken pipe Dec 05 20:30:57 crc kubenswrapper[4904]: I1205 20:30:57.198927 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 05 20:30:57 crc kubenswrapper[4904]: I1205 20:30:57.694114 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"921f0ddc-4d15-4bfa-9560-7a01eaa3461f","Type":"ContainerStarted","Data":"2b9723d67e3da9257f671f85fca08c79229416b8645d9074df450b11cc9cb52a"} Dec 05 20:30:57 crc kubenswrapper[4904]: I1205 20:30:57.694186 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 05 20:30:57 crc kubenswrapper[4904]: I1205 20:30:57.694202 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 05 20:30:57 crc kubenswrapper[4904]: I1205 20:30:57.740115 4904 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 05 20:30:57 crc kubenswrapper[4904]: I1205 20:30:57.744733 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 05 20:30:57 crc kubenswrapper[4904]: I1205 20:30:57.933154 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b9d55c97-ddvbw"] Dec 05 20:30:57 crc kubenswrapper[4904]: I1205 20:30:57.934393 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b9d55c97-ddvbw" Dec 05 20:30:57 crc kubenswrapper[4904]: I1205 20:30:57.936273 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 05 20:30:57 crc kubenswrapper[4904]: I1205 20:30:57.957507 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b9d55c97-ddvbw"] Dec 05 20:30:57 crc kubenswrapper[4904]: I1205 20:30:57.975546 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-t974s"] Dec 05 20:30:57 crc kubenswrapper[4904]: I1205 20:30:57.976680 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-t974s" Dec 05 20:30:57 crc kubenswrapper[4904]: I1205 20:30:57.980101 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 05 20:30:57 crc kubenswrapper[4904]: I1205 20:30:57.989122 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-t974s"] Dec 05 20:30:57 crc kubenswrapper[4904]: I1205 20:30:57.989858 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpqhh\" (UniqueName: \"kubernetes.io/projected/d85f3a5d-f238-475c-9994-ac5c7fe16490-kube-api-access-cpqhh\") pod \"dnsmasq-dns-b9d55c97-ddvbw\" (UID: \"d85f3a5d-f238-475c-9994-ac5c7fe16490\") " pod="openstack/dnsmasq-dns-b9d55c97-ddvbw" Dec 05 20:30:57 crc kubenswrapper[4904]: I1205 20:30:57.989918 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d85f3a5d-f238-475c-9994-ac5c7fe16490-ovsdbserver-sb\") pod \"dnsmasq-dns-b9d55c97-ddvbw\" (UID: \"d85f3a5d-f238-475c-9994-ac5c7fe16490\") " pod="openstack/dnsmasq-dns-b9d55c97-ddvbw" Dec 05 20:30:57 crc kubenswrapper[4904]: I1205 20:30:57.989958 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d85f3a5d-f238-475c-9994-ac5c7fe16490-dns-svc\") pod \"dnsmasq-dns-b9d55c97-ddvbw\" (UID: \"d85f3a5d-f238-475c-9994-ac5c7fe16490\") " pod="openstack/dnsmasq-dns-b9d55c97-ddvbw" Dec 05 20:30:57 crc kubenswrapper[4904]: I1205 20:30:57.989986 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d85f3a5d-f238-475c-9994-ac5c7fe16490-config\") pod \"dnsmasq-dns-b9d55c97-ddvbw\" (UID: \"d85f3a5d-f238-475c-9994-ac5c7fe16490\") " pod="openstack/dnsmasq-dns-b9d55c97-ddvbw" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.073479 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b9d55c97-ddvbw"] Dec 05 20:30:58 crc kubenswrapper[4904]: E1205 20:30:58.074038 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-cpqhh ovsdbserver-sb], 
unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-b9d55c97-ddvbw" podUID="d85f3a5d-f238-475c-9994-ac5c7fe16490" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.091974 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d85f3a5d-f238-475c-9994-ac5c7fe16490-ovsdbserver-sb\") pod \"dnsmasq-dns-b9d55c97-ddvbw\" (UID: \"d85f3a5d-f238-475c-9994-ac5c7fe16490\") " pod="openstack/dnsmasq-dns-b9d55c97-ddvbw" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.092033 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e7529566-97c3-42bf-a66c-1186aec23176-ovs-rundir\") pod \"ovn-controller-metrics-t974s\" (UID: \"e7529566-97c3-42bf-a66c-1186aec23176\") " pod="openstack/ovn-controller-metrics-t974s" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.092069 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7529566-97c3-42bf-a66c-1186aec23176-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-t974s\" (UID: \"e7529566-97c3-42bf-a66c-1186aec23176\") " pod="openstack/ovn-controller-metrics-t974s" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.092092 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jnxp\" (UniqueName: \"kubernetes.io/projected/e7529566-97c3-42bf-a66c-1186aec23176-kube-api-access-5jnxp\") pod \"ovn-controller-metrics-t974s\" (UID: \"e7529566-97c3-42bf-a66c-1186aec23176\") " pod="openstack/ovn-controller-metrics-t974s" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.092115 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7529566-97c3-42bf-a66c-1186aec23176-config\") pod \"ovn-controller-metrics-t974s\" (UID: \"e7529566-97c3-42bf-a66c-1186aec23176\") " pod="openstack/ovn-controller-metrics-t974s" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.092137 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d85f3a5d-f238-475c-9994-ac5c7fe16490-dns-svc\") pod \"dnsmasq-dns-b9d55c97-ddvbw\" (UID: \"d85f3a5d-f238-475c-9994-ac5c7fe16490\") " pod="openstack/dnsmasq-dns-b9d55c97-ddvbw" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.092164 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d85f3a5d-f238-475c-9994-ac5c7fe16490-config\") pod \"dnsmasq-dns-b9d55c97-ddvbw\" (UID: \"d85f3a5d-f238-475c-9994-ac5c7fe16490\") " pod="openstack/dnsmasq-dns-b9d55c97-ddvbw" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.092186 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7529566-97c3-42bf-a66c-1186aec23176-combined-ca-bundle\") pod \"ovn-controller-metrics-t974s\" (UID: \"e7529566-97c3-42bf-a66c-1186aec23176\") " pod="openstack/ovn-controller-metrics-t974s" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.092252 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/host-path/e7529566-97c3-42bf-a66c-1186aec23176-ovn-rundir\") pod \"ovn-controller-metrics-t974s\" (UID: \"e7529566-97c3-42bf-a66c-1186aec23176\") " pod="openstack/ovn-controller-metrics-t974s" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.092285 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpqhh\" (UniqueName: \"kubernetes.io/projected/d85f3a5d-f238-475c-9994-ac5c7fe16490-kube-api-access-cpqhh\") pod \"dnsmasq-dns-b9d55c97-ddvbw\" (UID: \"d85f3a5d-f238-475c-9994-ac5c7fe16490\") " pod="openstack/dnsmasq-dns-b9d55c97-ddvbw" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.093335 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d85f3a5d-f238-475c-9994-ac5c7fe16490-config\") pod \"dnsmasq-dns-b9d55c97-ddvbw\" (UID: \"d85f3a5d-f238-475c-9994-ac5c7fe16490\") " pod="openstack/dnsmasq-dns-b9d55c97-ddvbw" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.093426 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d85f3a5d-f238-475c-9994-ac5c7fe16490-dns-svc\") pod \"dnsmasq-dns-b9d55c97-ddvbw\" (UID: \"d85f3a5d-f238-475c-9994-ac5c7fe16490\") " pod="openstack/dnsmasq-dns-b9d55c97-ddvbw" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.093424 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d85f3a5d-f238-475c-9994-ac5c7fe16490-ovsdbserver-sb\") pod \"dnsmasq-dns-b9d55c97-ddvbw\" (UID: \"d85f3a5d-f238-475c-9994-ac5c7fe16490\") " pod="openstack/dnsmasq-dns-b9d55c97-ddvbw" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.121095 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86578b98fc-xpmpj"] Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.123658 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86578b98fc-xpmpj" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.124894 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpqhh\" (UniqueName: \"kubernetes.io/projected/d85f3a5d-f238-475c-9994-ac5c7fe16490-kube-api-access-cpqhh\") pod \"dnsmasq-dns-b9d55c97-ddvbw\" (UID: \"d85f3a5d-f238-475c-9994-ac5c7fe16490\") " pod="openstack/dnsmasq-dns-b9d55c97-ddvbw" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.125410 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.137940 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86578b98fc-xpmpj"] Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.187420 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.188714 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.193512 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.194331 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-nrw25" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.194785 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.194901 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7529566-97c3-42bf-a66c-1186aec23176-combined-ca-bundle\") pod \"ovn-controller-metrics-t974s\" (UID: \"e7529566-97c3-42bf-a66c-1186aec23176\") " pod="openstack/ovn-controller-metrics-t974s" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.195132 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6b4c1024-2b13-4877-8f60-4833f9d3e8df-ovsdbserver-sb\") pod \"dnsmasq-dns-86578b98fc-xpmpj\" (UID: \"6b4c1024-2b13-4877-8f60-4833f9d3e8df\") " pod="openstack/dnsmasq-dns-86578b98fc-xpmpj" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.195194 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dqkk\" (UniqueName: \"kubernetes.io/projected/6b4c1024-2b13-4877-8f60-4833f9d3e8df-kube-api-access-9dqkk\") pod \"dnsmasq-dns-86578b98fc-xpmpj\" (UID: \"6b4c1024-2b13-4877-8f60-4833f9d3e8df\") " pod="openstack/dnsmasq-dns-86578b98fc-xpmpj" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.195226 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b4c1024-2b13-4877-8f60-4833f9d3e8df-config\") pod \"dnsmasq-dns-86578b98fc-xpmpj\" (UID: \"6b4c1024-2b13-4877-8f60-4833f9d3e8df\") " pod="openstack/dnsmasq-dns-86578b98fc-xpmpj" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.195281 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6b4c1024-2b13-4877-8f60-4833f9d3e8df-ovsdbserver-nb\") pod \"dnsmasq-dns-86578b98fc-xpmpj\" (UID: \"6b4c1024-2b13-4877-8f60-4833f9d3e8df\") " pod="openstack/dnsmasq-dns-86578b98fc-xpmpj" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.195313 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e7529566-97c3-42bf-a66c-1186aec23176-ovn-rundir\") pod \"ovn-controller-metrics-t974s\" (UID: \"e7529566-97c3-42bf-a66c-1186aec23176\") " pod="openstack/ovn-controller-metrics-t974s" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.195337 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b4c1024-2b13-4877-8f60-4833f9d3e8df-dns-svc\") pod \"dnsmasq-dns-86578b98fc-xpmpj\" (UID: \"6b4c1024-2b13-4877-8f60-4833f9d3e8df\") " pod="openstack/dnsmasq-dns-86578b98fc-xpmpj" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.195426 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: 
\"kubernetes.io/host-path/e7529566-97c3-42bf-a66c-1186aec23176-ovs-rundir\") pod \"ovn-controller-metrics-t974s\" (UID: \"e7529566-97c3-42bf-a66c-1186aec23176\") " pod="openstack/ovn-controller-metrics-t974s" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.195462 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7529566-97c3-42bf-a66c-1186aec23176-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-t974s\" (UID: \"e7529566-97c3-42bf-a66c-1186aec23176\") " pod="openstack/ovn-controller-metrics-t974s" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.195690 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e7529566-97c3-42bf-a66c-1186aec23176-ovs-rundir\") pod \"ovn-controller-metrics-t974s\" (UID: \"e7529566-97c3-42bf-a66c-1186aec23176\") " pod="openstack/ovn-controller-metrics-t974s" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.195740 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e7529566-97c3-42bf-a66c-1186aec23176-ovn-rundir\") pod \"ovn-controller-metrics-t974s\" (UID: \"e7529566-97c3-42bf-a66c-1186aec23176\") " pod="openstack/ovn-controller-metrics-t974s" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.195756 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jnxp\" (UniqueName: \"kubernetes.io/projected/e7529566-97c3-42bf-a66c-1186aec23176-kube-api-access-5jnxp\") pod \"ovn-controller-metrics-t974s\" (UID: \"e7529566-97c3-42bf-a66c-1186aec23176\") " pod="openstack/ovn-controller-metrics-t974s" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.195851 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7529566-97c3-42bf-a66c-1186aec23176-config\") pod \"ovn-controller-metrics-t974s\" (UID: \"e7529566-97c3-42bf-a66c-1186aec23176\") " pod="openstack/ovn-controller-metrics-t974s" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.196789 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7529566-97c3-42bf-a66c-1186aec23176-config\") pod \"ovn-controller-metrics-t974s\" (UID: \"e7529566-97c3-42bf-a66c-1186aec23176\") " pod="openstack/ovn-controller-metrics-t974s" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.199569 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.204738 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7529566-97c3-42bf-a66c-1186aec23176-combined-ca-bundle\") pod \"ovn-controller-metrics-t974s\" (UID: \"e7529566-97c3-42bf-a66c-1186aec23176\") " pod="openstack/ovn-controller-metrics-t974s" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.209772 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.214551 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7529566-97c3-42bf-a66c-1186aec23176-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-t974s\" (UID: \"e7529566-97c3-42bf-a66c-1186aec23176\") " 
pod="openstack/ovn-controller-metrics-t974s" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.223776 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jnxp\" (UniqueName: \"kubernetes.io/projected/e7529566-97c3-42bf-a66c-1186aec23176-kube-api-access-5jnxp\") pod \"ovn-controller-metrics-t974s\" (UID: \"e7529566-97c3-42bf-a66c-1186aec23176\") " pod="openstack/ovn-controller-metrics-t974s" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.286260 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.297038 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c15a10eb-9132-4ec6-8861-4c2320962cc3-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c15a10eb-9132-4ec6-8861-4c2320962cc3\") " pod="openstack/ovn-northd-0" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.297115 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c15a10eb-9132-4ec6-8861-4c2320962cc3-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c15a10eb-9132-4ec6-8861-4c2320962cc3\") " pod="openstack/ovn-northd-0" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.297170 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c15a10eb-9132-4ec6-8861-4c2320962cc3-scripts\") pod \"ovn-northd-0\" (UID: \"c15a10eb-9132-4ec6-8861-4c2320962cc3\") " pod="openstack/ovn-northd-0" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.297215 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c15a10eb-9132-4ec6-8861-4c2320962cc3-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c15a10eb-9132-4ec6-8861-4c2320962cc3\") " pod="openstack/ovn-northd-0" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.297251 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c15a10eb-9132-4ec6-8861-4c2320962cc3-config\") pod \"ovn-northd-0\" (UID: \"c15a10eb-9132-4ec6-8861-4c2320962cc3\") " pod="openstack/ovn-northd-0" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.297272 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c15a10eb-9132-4ec6-8861-4c2320962cc3-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c15a10eb-9132-4ec6-8861-4c2320962cc3\") " pod="openstack/ovn-northd-0" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.297305 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6b4c1024-2b13-4877-8f60-4833f9d3e8df-ovsdbserver-sb\") pod \"dnsmasq-dns-86578b98fc-xpmpj\" (UID: \"6b4c1024-2b13-4877-8f60-4833f9d3e8df\") " pod="openstack/dnsmasq-dns-86578b98fc-xpmpj" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.297324 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b4c1024-2b13-4877-8f60-4833f9d3e8df-config\") pod \"dnsmasq-dns-86578b98fc-xpmpj\" (UID: 
\"6b4c1024-2b13-4877-8f60-4833f9d3e8df\") " pod="openstack/dnsmasq-dns-86578b98fc-xpmpj" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.297341 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dqkk\" (UniqueName: \"kubernetes.io/projected/6b4c1024-2b13-4877-8f60-4833f9d3e8df-kube-api-access-9dqkk\") pod \"dnsmasq-dns-86578b98fc-xpmpj\" (UID: \"6b4c1024-2b13-4877-8f60-4833f9d3e8df\") " pod="openstack/dnsmasq-dns-86578b98fc-xpmpj" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.297375 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6b4c1024-2b13-4877-8f60-4833f9d3e8df-ovsdbserver-nb\") pod \"dnsmasq-dns-86578b98fc-xpmpj\" (UID: \"6b4c1024-2b13-4877-8f60-4833f9d3e8df\") " pod="openstack/dnsmasq-dns-86578b98fc-xpmpj" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.297391 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b4c1024-2b13-4877-8f60-4833f9d3e8df-dns-svc\") pod \"dnsmasq-dns-86578b98fc-xpmpj\" (UID: \"6b4c1024-2b13-4877-8f60-4833f9d3e8df\") " pod="openstack/dnsmasq-dns-86578b98fc-xpmpj" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.297409 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fchjv\" (UniqueName: \"kubernetes.io/projected/c15a10eb-9132-4ec6-8861-4c2320962cc3-kube-api-access-fchjv\") pod \"ovn-northd-0\" (UID: \"c15a10eb-9132-4ec6-8861-4c2320962cc3\") " pod="openstack/ovn-northd-0" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.298346 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6b4c1024-2b13-4877-8f60-4833f9d3e8df-ovsdbserver-sb\") pod \"dnsmasq-dns-86578b98fc-xpmpj\" (UID: \"6b4c1024-2b13-4877-8f60-4833f9d3e8df\") " pod="openstack/dnsmasq-dns-86578b98fc-xpmpj" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.303768 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b4c1024-2b13-4877-8f60-4833f9d3e8df-config\") pod \"dnsmasq-dns-86578b98fc-xpmpj\" (UID: \"6b4c1024-2b13-4877-8f60-4833f9d3e8df\") " pod="openstack/dnsmasq-dns-86578b98fc-xpmpj" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.304556 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b4c1024-2b13-4877-8f60-4833f9d3e8df-dns-svc\") pod \"dnsmasq-dns-86578b98fc-xpmpj\" (UID: \"6b4c1024-2b13-4877-8f60-4833f9d3e8df\") " pod="openstack/dnsmasq-dns-86578b98fc-xpmpj" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.304758 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6b4c1024-2b13-4877-8f60-4833f9d3e8df-ovsdbserver-nb\") pod \"dnsmasq-dns-86578b98fc-xpmpj\" (UID: \"6b4c1024-2b13-4877-8f60-4833f9d3e8df\") " pod="openstack/dnsmasq-dns-86578b98fc-xpmpj" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.304930 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-t974s" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.320558 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dqkk\" (UniqueName: \"kubernetes.io/projected/6b4c1024-2b13-4877-8f60-4833f9d3e8df-kube-api-access-9dqkk\") pod \"dnsmasq-dns-86578b98fc-xpmpj\" (UID: \"6b4c1024-2b13-4877-8f60-4833f9d3e8df\") " pod="openstack/dnsmasq-dns-86578b98fc-xpmpj" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.398976 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c15a10eb-9132-4ec6-8861-4c2320962cc3-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c15a10eb-9132-4ec6-8861-4c2320962cc3\") " pod="openstack/ovn-northd-0" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.399025 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c15a10eb-9132-4ec6-8861-4c2320962cc3-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c15a10eb-9132-4ec6-8861-4c2320962cc3\") " pod="openstack/ovn-northd-0" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.399088 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c15a10eb-9132-4ec6-8861-4c2320962cc3-scripts\") pod \"ovn-northd-0\" (UID: \"c15a10eb-9132-4ec6-8861-4c2320962cc3\") " pod="openstack/ovn-northd-0" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.399130 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c15a10eb-9132-4ec6-8861-4c2320962cc3-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c15a10eb-9132-4ec6-8861-4c2320962cc3\") " pod="openstack/ovn-northd-0" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.399178 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c15a10eb-9132-4ec6-8861-4c2320962cc3-config\") pod \"ovn-northd-0\" (UID: \"c15a10eb-9132-4ec6-8861-4c2320962cc3\") " pod="openstack/ovn-northd-0" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.399199 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c15a10eb-9132-4ec6-8861-4c2320962cc3-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c15a10eb-9132-4ec6-8861-4c2320962cc3\") " pod="openstack/ovn-northd-0" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.399251 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fchjv\" (UniqueName: \"kubernetes.io/projected/c15a10eb-9132-4ec6-8861-4c2320962cc3-kube-api-access-fchjv\") pod \"ovn-northd-0\" (UID: \"c15a10eb-9132-4ec6-8861-4c2320962cc3\") " pod="openstack/ovn-northd-0" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.399929 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c15a10eb-9132-4ec6-8861-4c2320962cc3-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c15a10eb-9132-4ec6-8861-4c2320962cc3\") " pod="openstack/ovn-northd-0" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.400143 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c15a10eb-9132-4ec6-8861-4c2320962cc3-scripts\") pod 
\"ovn-northd-0\" (UID: \"c15a10eb-9132-4ec6-8861-4c2320962cc3\") " pod="openstack/ovn-northd-0" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.400338 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c15a10eb-9132-4ec6-8861-4c2320962cc3-config\") pod \"ovn-northd-0\" (UID: \"c15a10eb-9132-4ec6-8861-4c2320962cc3\") " pod="openstack/ovn-northd-0" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.403374 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c15a10eb-9132-4ec6-8861-4c2320962cc3-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c15a10eb-9132-4ec6-8861-4c2320962cc3\") " pod="openstack/ovn-northd-0" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.404449 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c15a10eb-9132-4ec6-8861-4c2320962cc3-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c15a10eb-9132-4ec6-8861-4c2320962cc3\") " pod="openstack/ovn-northd-0" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.408536 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c15a10eb-9132-4ec6-8861-4c2320962cc3-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c15a10eb-9132-4ec6-8861-4c2320962cc3\") " pod="openstack/ovn-northd-0" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.408792 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.416575 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fchjv\" (UniqueName: \"kubernetes.io/projected/c15a10eb-9132-4ec6-8861-4c2320962cc3-kube-api-access-fchjv\") pod \"ovn-northd-0\" (UID: \"c15a10eb-9132-4ec6-8861-4c2320962cc3\") " pod="openstack/ovn-northd-0" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.478970 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86578b98fc-xpmpj" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.542440 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.709024 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b9d55c97-ddvbw" Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.767645 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-t974s"] Dec 05 20:30:58 crc kubenswrapper[4904]: W1205 20:30:58.772327 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7529566_97c3_42bf_a66c_1186aec23176.slice/crio-ed09a83a0c9b04dda3e3829e2b3b79ddc43bbc1651df663ceeee291d6f94d0eb WatchSource:0}: Error finding container ed09a83a0c9b04dda3e3829e2b3b79ddc43bbc1651df663ceeee291d6f94d0eb: Status 404 returned error can't find the container with id ed09a83a0c9b04dda3e3829e2b3b79ddc43bbc1651df663ceeee291d6f94d0eb Dec 05 20:30:58 crc kubenswrapper[4904]: I1205 20:30:58.789900 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86578b98fc-xpmpj"] Dec 05 20:30:58 crc kubenswrapper[4904]: W1205 20:30:58.791568 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b4c1024_2b13_4877_8f60_4833f9d3e8df.slice/crio-7949dcfb06d96b9c07ddfd05bce4d24ea100a4bc538bc9d577311d32e4cad04a WatchSource:0}: Error finding container 7949dcfb06d96b9c07ddfd05bce4d24ea100a4bc538bc9d577311d32e4cad04a: Status 404 returned error can't find the container with id 7949dcfb06d96b9c07ddfd05bce4d24ea100a4bc538bc9d577311d32e4cad04a Dec 05 20:30:59 crc kubenswrapper[4904]: I1205 20:30:59.040714 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b9d55c97-ddvbw" Dec 05 20:30:59 crc kubenswrapper[4904]: W1205 20:30:59.072420 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc15a10eb_9132_4ec6_8861_4c2320962cc3.slice/crio-c477daa3e561f8efab8f932f4f16d6c1eb0c4a37460e4262c0777303be66cddd WatchSource:0}: Error finding container c477daa3e561f8efab8f932f4f16d6c1eb0c4a37460e4262c0777303be66cddd: Status 404 returned error can't find the container with id c477daa3e561f8efab8f932f4f16d6c1eb0c4a37460e4262c0777303be66cddd Dec 05 20:30:59 crc kubenswrapper[4904]: I1205 20:30:59.081183 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 05 20:30:59 crc kubenswrapper[4904]: I1205 20:30:59.215102 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d85f3a5d-f238-475c-9994-ac5c7fe16490-config\") pod \"d85f3a5d-f238-475c-9994-ac5c7fe16490\" (UID: \"d85f3a5d-f238-475c-9994-ac5c7fe16490\") " Dec 05 20:30:59 crc kubenswrapper[4904]: I1205 20:30:59.215374 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d85f3a5d-f238-475c-9994-ac5c7fe16490-ovsdbserver-sb\") pod \"d85f3a5d-f238-475c-9994-ac5c7fe16490\" (UID: \"d85f3a5d-f238-475c-9994-ac5c7fe16490\") " Dec 05 20:30:59 crc kubenswrapper[4904]: I1205 20:30:59.215434 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d85f3a5d-f238-475c-9994-ac5c7fe16490-dns-svc\") pod \"d85f3a5d-f238-475c-9994-ac5c7fe16490\" (UID: \"d85f3a5d-f238-475c-9994-ac5c7fe16490\") " Dec 05 20:30:59 crc kubenswrapper[4904]: I1205 20:30:59.215491 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpqhh\" 
(UniqueName: \"kubernetes.io/projected/d85f3a5d-f238-475c-9994-ac5c7fe16490-kube-api-access-cpqhh\") pod \"d85f3a5d-f238-475c-9994-ac5c7fe16490\" (UID: \"d85f3a5d-f238-475c-9994-ac5c7fe16490\") " Dec 05 20:30:59 crc kubenswrapper[4904]: I1205 20:30:59.215789 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d85f3a5d-f238-475c-9994-ac5c7fe16490-config" (OuterVolumeSpecName: "config") pod "d85f3a5d-f238-475c-9994-ac5c7fe16490" (UID: "d85f3a5d-f238-475c-9994-ac5c7fe16490"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:30:59 crc kubenswrapper[4904]: I1205 20:30:59.216341 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d85f3a5d-f238-475c-9994-ac5c7fe16490-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d85f3a5d-f238-475c-9994-ac5c7fe16490" (UID: "d85f3a5d-f238-475c-9994-ac5c7fe16490"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:30:59 crc kubenswrapper[4904]: I1205 20:30:59.216444 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d85f3a5d-f238-475c-9994-ac5c7fe16490-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d85f3a5d-f238-475c-9994-ac5c7fe16490" (UID: "d85f3a5d-f238-475c-9994-ac5c7fe16490"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:30:59 crc kubenswrapper[4904]: I1205 20:30:59.222037 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d85f3a5d-f238-475c-9994-ac5c7fe16490-kube-api-access-cpqhh" (OuterVolumeSpecName: "kube-api-access-cpqhh") pod "d85f3a5d-f238-475c-9994-ac5c7fe16490" (UID: "d85f3a5d-f238-475c-9994-ac5c7fe16490"). InnerVolumeSpecName "kube-api-access-cpqhh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:30:59 crc kubenswrapper[4904]: I1205 20:30:59.317184 4904 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d85f3a5d-f238-475c-9994-ac5c7fe16490-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:30:59 crc kubenswrapper[4904]: I1205 20:30:59.317224 4904 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d85f3a5d-f238-475c-9994-ac5c7fe16490-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 20:30:59 crc kubenswrapper[4904]: I1205 20:30:59.317235 4904 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d85f3a5d-f238-475c-9994-ac5c7fe16490-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 20:30:59 crc kubenswrapper[4904]: I1205 20:30:59.317245 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpqhh\" (UniqueName: \"kubernetes.io/projected/d85f3a5d-f238-475c-9994-ac5c7fe16490-kube-api-access-cpqhh\") on node \"crc\" DevicePath \"\"" Dec 05 20:30:59 crc kubenswrapper[4904]: I1205 20:30:59.367550 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 05 20:30:59 crc kubenswrapper[4904]: I1205 20:30:59.721728 4904 generic.go:334] "Generic (PLEG): container finished" podID="6b4c1024-2b13-4877-8f60-4833f9d3e8df" containerID="cca5c730f9a7ad471de36beb5592f757a38de909d36427d20afb414169e77dc2" exitCode=0 Dec 05 20:30:59 crc kubenswrapper[4904]: I1205 20:30:59.722217 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86578b98fc-xpmpj" event={"ID":"6b4c1024-2b13-4877-8f60-4833f9d3e8df","Type":"ContainerDied","Data":"cca5c730f9a7ad471de36beb5592f757a38de909d36427d20afb414169e77dc2"} Dec 05 20:30:59 crc kubenswrapper[4904]: I1205 20:30:59.722287 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86578b98fc-xpmpj" event={"ID":"6b4c1024-2b13-4877-8f60-4833f9d3e8df","Type":"ContainerStarted","Data":"7949dcfb06d96b9c07ddfd05bce4d24ea100a4bc538bc9d577311d32e4cad04a"} Dec 05 20:30:59 crc kubenswrapper[4904]: I1205 20:30:59.725203 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-t974s" event={"ID":"e7529566-97c3-42bf-a66c-1186aec23176","Type":"ContainerStarted","Data":"9e8592f0f2213473a82616335d72eddad1f7eb3d38778203aeab3e67176515e5"} Dec 05 20:30:59 crc kubenswrapper[4904]: I1205 20:30:59.725245 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-t974s" event={"ID":"e7529566-97c3-42bf-a66c-1186aec23176","Type":"ContainerStarted","Data":"ed09a83a0c9b04dda3e3829e2b3b79ddc43bbc1651df663ceeee291d6f94d0eb"} Dec 05 20:30:59 crc kubenswrapper[4904]: I1205 20:30:59.734197 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"0caaad94-d02e-43da-bf3b-087a5ec8d2f8","Type":"ContainerStarted","Data":"05867c3143e7e550817d33a46cd977302277bd1126239c8ac6e35fcfc5645f6d"} Dec 05 20:30:59 crc kubenswrapper[4904]: I1205 20:30:59.744878 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c15a10eb-9132-4ec6-8861-4c2320962cc3","Type":"ContainerStarted","Data":"c477daa3e561f8efab8f932f4f16d6c1eb0c4a37460e4262c0777303be66cddd"} Dec 05 20:30:59 crc kubenswrapper[4904]: I1205 20:30:59.745005 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b9d55c97-ddvbw" Dec 05 20:30:59 crc kubenswrapper[4904]: I1205 20:30:59.788030 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-t974s" podStartSLOduration=2.788004865 podStartE2EDuration="2.788004865s" podCreationTimestamp="2025-12-05 20:30:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:30:59.783639853 +0000 UTC m=+1158.594855982" watchObservedRunningTime="2025-12-05 20:30:59.788004865 +0000 UTC m=+1158.599220994" Dec 05 20:30:59 crc kubenswrapper[4904]: I1205 20:30:59.921248 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b9d55c97-ddvbw"] Dec 05 20:30:59 crc kubenswrapper[4904]: I1205 20:30:59.934995 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b9d55c97-ddvbw"] Dec 05 20:31:00 crc kubenswrapper[4904]: I1205 20:31:00.763238 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c15a10eb-9132-4ec6-8861-4c2320962cc3","Type":"ContainerStarted","Data":"1f00724d3d443ffc7a365ac4e7d1eae6a35a6a456a046b8544184778606f0f52"} Dec 05 20:31:00 crc kubenswrapper[4904]: I1205 20:31:00.763616 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c15a10eb-9132-4ec6-8861-4c2320962cc3","Type":"ContainerStarted","Data":"ae35d27ee452c64e409f83770554e69aa05bf42701063464cc0aeb1909c10a8e"} Dec 05 20:31:00 crc kubenswrapper[4904]: I1205 20:31:00.763658 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 05 20:31:00 crc kubenswrapper[4904]: I1205 20:31:00.772210 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86578b98fc-xpmpj" event={"ID":"6b4c1024-2b13-4877-8f60-4833f9d3e8df","Type":"ContainerStarted","Data":"e8eff6fece62ab87a9769f34fda77ba4aa4b1c10a43a4b0cf8bf238675132333"} Dec 05 20:31:00 crc kubenswrapper[4904]: I1205 20:31:00.773754 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86578b98fc-xpmpj" Dec 05 20:31:00 crc kubenswrapper[4904]: I1205 20:31:00.815482 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.343081518 podStartE2EDuration="2.815459466s" podCreationTimestamp="2025-12-05 20:30:58 +0000 UTC" firstStartedPulling="2025-12-05 20:30:59.07520142 +0000 UTC m=+1157.886417529" lastFinishedPulling="2025-12-05 20:30:59.547579368 +0000 UTC m=+1158.358795477" observedRunningTime="2025-12-05 20:31:00.808538263 +0000 UTC m=+1159.619754392" watchObservedRunningTime="2025-12-05 20:31:00.815459466 +0000 UTC m=+1159.626675575" Dec 05 20:31:00 crc kubenswrapper[4904]: I1205 20:31:00.832944 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86578b98fc-xpmpj" podStartSLOduration=2.832928631 podStartE2EDuration="2.832928631s" podCreationTimestamp="2025-12-05 20:30:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:31:00.827753587 +0000 UTC m=+1159.638969706" watchObservedRunningTime="2025-12-05 20:31:00.832928631 +0000 UTC m=+1159.644144740" Dec 05 20:31:01 crc kubenswrapper[4904]: I1205 20:31:01.691370 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="d85f3a5d-f238-475c-9994-ac5c7fe16490" path="/var/lib/kubelet/pods/d85f3a5d-f238-475c-9994-ac5c7fe16490/volumes" Dec 05 20:31:01 crc kubenswrapper[4904]: I1205 20:31:01.780420 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e08506d4-1ca7-4932-b73f-21020cb20578","Type":"ContainerStarted","Data":"8ff6e0b9e14522c2e04127ea1fd3e9b0e9392ef571e716fdba832d536aa3a20f"} Dec 05 20:31:02 crc kubenswrapper[4904]: I1205 20:31:02.789497 4904 generic.go:334] "Generic (PLEG): container finished" podID="c241183e-3f0b-4dc2-88d2-ea990c6dc71d" containerID="45a22d9de21cad981079d2e7659a670ac38e664200c113b9e41923eb1c83a7b9" exitCode=0 Dec 05 20:31:02 crc kubenswrapper[4904]: I1205 20:31:02.789600 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c241183e-3f0b-4dc2-88d2-ea990c6dc71d","Type":"ContainerDied","Data":"45a22d9de21cad981079d2e7659a670ac38e664200c113b9e41923eb1c83a7b9"} Dec 05 20:31:02 crc kubenswrapper[4904]: I1205 20:31:02.792806 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0ad24986-23a3-4010-8dcf-6778339691c8","Type":"ContainerStarted","Data":"430dbd00dcef1faa04366e06dfa54515d84f826b7421c7f90e8fbe45ef9d7e65"} Dec 05 20:31:05 crc kubenswrapper[4904]: I1205 20:31:05.820669 4904 generic.go:334] "Generic (PLEG): container finished" podID="e08506d4-1ca7-4932-b73f-21020cb20578" containerID="8ff6e0b9e14522c2e04127ea1fd3e9b0e9392ef571e716fdba832d536aa3a20f" exitCode=0 Dec 05 20:31:05 crc kubenswrapper[4904]: I1205 20:31:05.820893 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e08506d4-1ca7-4932-b73f-21020cb20578","Type":"ContainerDied","Data":"8ff6e0b9e14522c2e04127ea1fd3e9b0e9392ef571e716fdba832d536aa3a20f"} Dec 05 20:31:07 crc kubenswrapper[4904]: I1205 20:31:07.094299 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86578b98fc-xpmpj"] Dec 05 20:31:07 crc kubenswrapper[4904]: I1205 20:31:07.095528 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86578b98fc-xpmpj" podUID="6b4c1024-2b13-4877-8f60-4833f9d3e8df" containerName="dnsmasq-dns" containerID="cri-o://e8eff6fece62ab87a9769f34fda77ba4aa4b1c10a43a4b0cf8bf238675132333" gracePeriod=10 Dec 05 20:31:07 crc kubenswrapper[4904]: I1205 20:31:07.100258 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86578b98fc-xpmpj" Dec 05 20:31:07 crc kubenswrapper[4904]: I1205 20:31:07.148776 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d4d549fbf-m45vw"] Dec 05 20:31:07 crc kubenswrapper[4904]: I1205 20:31:07.150304 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d4d549fbf-m45vw" Dec 05 20:31:07 crc kubenswrapper[4904]: I1205 20:31:07.181578 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d4d549fbf-m45vw"] Dec 05 20:31:07 crc kubenswrapper[4904]: I1205 20:31:07.276429 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2a3f1f6-8ec4-4b73-94a1-78f12808fe55-ovsdbserver-nb\") pod \"dnsmasq-dns-6d4d549fbf-m45vw\" (UID: \"e2a3f1f6-8ec4-4b73-94a1-78f12808fe55\") " pod="openstack/dnsmasq-dns-6d4d549fbf-m45vw" Dec 05 20:31:07 crc kubenswrapper[4904]: I1205 20:31:07.276668 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm79l\" (UniqueName: \"kubernetes.io/projected/e2a3f1f6-8ec4-4b73-94a1-78f12808fe55-kube-api-access-vm79l\") pod \"dnsmasq-dns-6d4d549fbf-m45vw\" (UID: \"e2a3f1f6-8ec4-4b73-94a1-78f12808fe55\") " pod="openstack/dnsmasq-dns-6d4d549fbf-m45vw" Dec 05 20:31:07 crc kubenswrapper[4904]: I1205 20:31:07.276690 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2a3f1f6-8ec4-4b73-94a1-78f12808fe55-config\") pod \"dnsmasq-dns-6d4d549fbf-m45vw\" (UID: \"e2a3f1f6-8ec4-4b73-94a1-78f12808fe55\") " pod="openstack/dnsmasq-dns-6d4d549fbf-m45vw" Dec 05 20:31:07 crc kubenswrapper[4904]: I1205 20:31:07.276757 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e2a3f1f6-8ec4-4b73-94a1-78f12808fe55-ovsdbserver-sb\") pod \"dnsmasq-dns-6d4d549fbf-m45vw\" (UID: \"e2a3f1f6-8ec4-4b73-94a1-78f12808fe55\") " pod="openstack/dnsmasq-dns-6d4d549fbf-m45vw" Dec 05 20:31:07 crc kubenswrapper[4904]: I1205 20:31:07.276873 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2a3f1f6-8ec4-4b73-94a1-78f12808fe55-dns-svc\") pod \"dnsmasq-dns-6d4d549fbf-m45vw\" (UID: \"e2a3f1f6-8ec4-4b73-94a1-78f12808fe55\") " pod="openstack/dnsmasq-dns-6d4d549fbf-m45vw" Dec 05 20:31:07 crc kubenswrapper[4904]: I1205 20:31:07.378180 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e2a3f1f6-8ec4-4b73-94a1-78f12808fe55-ovsdbserver-sb\") pod \"dnsmasq-dns-6d4d549fbf-m45vw\" (UID: \"e2a3f1f6-8ec4-4b73-94a1-78f12808fe55\") " pod="openstack/dnsmasq-dns-6d4d549fbf-m45vw" Dec 05 20:31:07 crc kubenswrapper[4904]: I1205 20:31:07.378237 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2a3f1f6-8ec4-4b73-94a1-78f12808fe55-dns-svc\") pod \"dnsmasq-dns-6d4d549fbf-m45vw\" (UID: \"e2a3f1f6-8ec4-4b73-94a1-78f12808fe55\") " pod="openstack/dnsmasq-dns-6d4d549fbf-m45vw" Dec 05 20:31:07 crc kubenswrapper[4904]: I1205 20:31:07.378296 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2a3f1f6-8ec4-4b73-94a1-78f12808fe55-ovsdbserver-nb\") pod \"dnsmasq-dns-6d4d549fbf-m45vw\" (UID: \"e2a3f1f6-8ec4-4b73-94a1-78f12808fe55\") " pod="openstack/dnsmasq-dns-6d4d549fbf-m45vw" Dec 05 20:31:07 crc kubenswrapper[4904]: I1205 20:31:07.378320 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-vm79l\" (UniqueName: \"kubernetes.io/projected/e2a3f1f6-8ec4-4b73-94a1-78f12808fe55-kube-api-access-vm79l\") pod \"dnsmasq-dns-6d4d549fbf-m45vw\" (UID: \"e2a3f1f6-8ec4-4b73-94a1-78f12808fe55\") " pod="openstack/dnsmasq-dns-6d4d549fbf-m45vw" Dec 05 20:31:07 crc kubenswrapper[4904]: I1205 20:31:07.378338 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2a3f1f6-8ec4-4b73-94a1-78f12808fe55-config\") pod \"dnsmasq-dns-6d4d549fbf-m45vw\" (UID: \"e2a3f1f6-8ec4-4b73-94a1-78f12808fe55\") " pod="openstack/dnsmasq-dns-6d4d549fbf-m45vw" Dec 05 20:31:07 crc kubenswrapper[4904]: I1205 20:31:07.379148 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2a3f1f6-8ec4-4b73-94a1-78f12808fe55-dns-svc\") pod \"dnsmasq-dns-6d4d549fbf-m45vw\" (UID: \"e2a3f1f6-8ec4-4b73-94a1-78f12808fe55\") " pod="openstack/dnsmasq-dns-6d4d549fbf-m45vw" Dec 05 20:31:07 crc kubenswrapper[4904]: I1205 20:31:07.379429 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2a3f1f6-8ec4-4b73-94a1-78f12808fe55-config\") pod \"dnsmasq-dns-6d4d549fbf-m45vw\" (UID: \"e2a3f1f6-8ec4-4b73-94a1-78f12808fe55\") " pod="openstack/dnsmasq-dns-6d4d549fbf-m45vw" Dec 05 20:31:07 crc kubenswrapper[4904]: I1205 20:31:07.380291 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2a3f1f6-8ec4-4b73-94a1-78f12808fe55-ovsdbserver-nb\") pod \"dnsmasq-dns-6d4d549fbf-m45vw\" (UID: \"e2a3f1f6-8ec4-4b73-94a1-78f12808fe55\") " pod="openstack/dnsmasq-dns-6d4d549fbf-m45vw" Dec 05 20:31:07 crc kubenswrapper[4904]: I1205 20:31:07.380829 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e2a3f1f6-8ec4-4b73-94a1-78f12808fe55-ovsdbserver-sb\") pod \"dnsmasq-dns-6d4d549fbf-m45vw\" (UID: \"e2a3f1f6-8ec4-4b73-94a1-78f12808fe55\") " pod="openstack/dnsmasq-dns-6d4d549fbf-m45vw" Dec 05 20:31:07 crc kubenswrapper[4904]: I1205 20:31:07.416254 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm79l\" (UniqueName: \"kubernetes.io/projected/e2a3f1f6-8ec4-4b73-94a1-78f12808fe55-kube-api-access-vm79l\") pod \"dnsmasq-dns-6d4d549fbf-m45vw\" (UID: \"e2a3f1f6-8ec4-4b73-94a1-78f12808fe55\") " pod="openstack/dnsmasq-dns-6d4d549fbf-m45vw" Dec 05 20:31:07 crc kubenswrapper[4904]: I1205 20:31:07.538885 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d4d549fbf-m45vw" Dec 05 20:31:07 crc kubenswrapper[4904]: I1205 20:31:07.851069 4904 generic.go:334] "Generic (PLEG): container finished" podID="6b4c1024-2b13-4877-8f60-4833f9d3e8df" containerID="e8eff6fece62ab87a9769f34fda77ba4aa4b1c10a43a4b0cf8bf238675132333" exitCode=0 Dec 05 20:31:07 crc kubenswrapper[4904]: I1205 20:31:07.851114 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86578b98fc-xpmpj" event={"ID":"6b4c1024-2b13-4877-8f60-4833f9d3e8df","Type":"ContainerDied","Data":"e8eff6fece62ab87a9769f34fda77ba4aa4b1c10a43a4b0cf8bf238675132333"} Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.148132 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86578b98fc-xpmpj" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.196929 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6b4c1024-2b13-4877-8f60-4833f9d3e8df-ovsdbserver-sb\") pod \"6b4c1024-2b13-4877-8f60-4833f9d3e8df\" (UID: \"6b4c1024-2b13-4877-8f60-4833f9d3e8df\") " Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.197001 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dqkk\" (UniqueName: \"kubernetes.io/projected/6b4c1024-2b13-4877-8f60-4833f9d3e8df-kube-api-access-9dqkk\") pod \"6b4c1024-2b13-4877-8f60-4833f9d3e8df\" (UID: \"6b4c1024-2b13-4877-8f60-4833f9d3e8df\") " Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.197133 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6b4c1024-2b13-4877-8f60-4833f9d3e8df-ovsdbserver-nb\") pod \"6b4c1024-2b13-4877-8f60-4833f9d3e8df\" (UID: \"6b4c1024-2b13-4877-8f60-4833f9d3e8df\") " Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.197732 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b4c1024-2b13-4877-8f60-4833f9d3e8df-config\") pod \"6b4c1024-2b13-4877-8f60-4833f9d3e8df\" (UID: \"6b4c1024-2b13-4877-8f60-4833f9d3e8df\") " Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.197869 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b4c1024-2b13-4877-8f60-4833f9d3e8df-dns-svc\") pod \"6b4c1024-2b13-4877-8f60-4833f9d3e8df\" (UID: \"6b4c1024-2b13-4877-8f60-4833f9d3e8df\") " Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.202505 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b4c1024-2b13-4877-8f60-4833f9d3e8df-kube-api-access-9dqkk" (OuterVolumeSpecName: "kube-api-access-9dqkk") pod "6b4c1024-2b13-4877-8f60-4833f9d3e8df" (UID: "6b4c1024-2b13-4877-8f60-4833f9d3e8df"). InnerVolumeSpecName "kube-api-access-9dqkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.256474 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 05 20:31:08 crc kubenswrapper[4904]: E1205 20:31:08.256843 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b4c1024-2b13-4877-8f60-4833f9d3e8df" containerName="init" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.256888 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b4c1024-2b13-4877-8f60-4833f9d3e8df" containerName="init" Dec 05 20:31:08 crc kubenswrapper[4904]: E1205 20:31:08.256898 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b4c1024-2b13-4877-8f60-4833f9d3e8df" containerName="dnsmasq-dns" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.256904 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b4c1024-2b13-4877-8f60-4833f9d3e8df" containerName="dnsmasq-dns" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.257091 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b4c1024-2b13-4877-8f60-4833f9d3e8df" containerName="dnsmasq-dns" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.262281 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.263693 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b4c1024-2b13-4877-8f60-4833f9d3e8df-config" (OuterVolumeSpecName: "config") pod "6b4c1024-2b13-4877-8f60-4833f9d3e8df" (UID: "6b4c1024-2b13-4877-8f60-4833f9d3e8df"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.265504 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.265570 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.265802 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.266013 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-hr2v9" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.277914 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.281450 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b4c1024-2b13-4877-8f60-4833f9d3e8df-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6b4c1024-2b13-4877-8f60-4833f9d3e8df" (UID: "6b4c1024-2b13-4877-8f60-4833f9d3e8df"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.282698 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b4c1024-2b13-4877-8f60-4833f9d3e8df-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6b4c1024-2b13-4877-8f60-4833f9d3e8df" (UID: "6b4c1024-2b13-4877-8f60-4833f9d3e8df"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.282863 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b4c1024-2b13-4877-8f60-4833f9d3e8df-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6b4c1024-2b13-4877-8f60-4833f9d3e8df" (UID: "6b4c1024-2b13-4877-8f60-4833f9d3e8df"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.305361 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/be5ed3b2-bc48-4865-ade5-f7c2e379a1ea-etc-swift\") pod \"swift-storage-0\" (UID: \"be5ed3b2-bc48-4865-ade5-f7c2e379a1ea\") " pod="openstack/swift-storage-0" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.305421 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/be5ed3b2-bc48-4865-ade5-f7c2e379a1ea-lock\") pod \"swift-storage-0\" (UID: \"be5ed3b2-bc48-4865-ade5-f7c2e379a1ea\") " pod="openstack/swift-storage-0" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.305509 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd9j6\" (UniqueName: \"kubernetes.io/projected/be5ed3b2-bc48-4865-ade5-f7c2e379a1ea-kube-api-access-cd9j6\") pod \"swift-storage-0\" (UID: \"be5ed3b2-bc48-4865-ade5-f7c2e379a1ea\") " pod="openstack/swift-storage-0" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.305535 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"be5ed3b2-bc48-4865-ade5-f7c2e379a1ea\") " pod="openstack/swift-storage-0" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.305567 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/be5ed3b2-bc48-4865-ade5-f7c2e379a1ea-cache\") pod \"swift-storage-0\" (UID: \"be5ed3b2-bc48-4865-ade5-f7c2e379a1ea\") " pod="openstack/swift-storage-0" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.305617 4904 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b4c1024-2b13-4877-8f60-4833f9d3e8df-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.305629 4904 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6b4c1024-2b13-4877-8f60-4833f9d3e8df-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.305638 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dqkk\" (UniqueName: \"kubernetes.io/projected/6b4c1024-2b13-4877-8f60-4833f9d3e8df-kube-api-access-9dqkk\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.305646 4904 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6b4c1024-2b13-4877-8f60-4833f9d3e8df-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.305656 4904 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b4c1024-2b13-4877-8f60-4833f9d3e8df-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.389374 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d4d549fbf-m45vw"] Dec 05 20:31:08 crc kubenswrapper[4904]: W1205 20:31:08.391138 4904 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2a3f1f6_8ec4_4b73_94a1_78f12808fe55.slice/crio-6a318139a8255f79de5c859bc16282894c91042acdd7072586ffb99c61e75470 WatchSource:0}: Error finding container 6a318139a8255f79de5c859bc16282894c91042acdd7072586ffb99c61e75470: Status 404 returned error can't find the container with id 6a318139a8255f79de5c859bc16282894c91042acdd7072586ffb99c61e75470 Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.406312 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/be5ed3b2-bc48-4865-ade5-f7c2e379a1ea-etc-swift\") pod \"swift-storage-0\" (UID: \"be5ed3b2-bc48-4865-ade5-f7c2e379a1ea\") " pod="openstack/swift-storage-0" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.406610 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/be5ed3b2-bc48-4865-ade5-f7c2e379a1ea-lock\") pod \"swift-storage-0\" (UID: \"be5ed3b2-bc48-4865-ade5-f7c2e379a1ea\") " pod="openstack/swift-storage-0" Dec 05 20:31:08 crc kubenswrapper[4904]: E1205 20:31:08.406476 4904 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 20:31:08 crc kubenswrapper[4904]: E1205 20:31:08.406667 4904 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.406668 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cd9j6\" (UniqueName: \"kubernetes.io/projected/be5ed3b2-bc48-4865-ade5-f7c2e379a1ea-kube-api-access-cd9j6\") pod \"swift-storage-0\" (UID: \"be5ed3b2-bc48-4865-ade5-f7c2e379a1ea\") " pod="openstack/swift-storage-0" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.406688 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"be5ed3b2-bc48-4865-ade5-f7c2e379a1ea\") " pod="openstack/swift-storage-0" Dec 05 20:31:08 crc kubenswrapper[4904]: E1205 20:31:08.406712 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/be5ed3b2-bc48-4865-ade5-f7c2e379a1ea-etc-swift podName:be5ed3b2-bc48-4865-ade5-f7c2e379a1ea nodeName:}" failed. No retries permitted until 2025-12-05 20:31:08.906695097 +0000 UTC m=+1167.717911206 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/be5ed3b2-bc48-4865-ade5-f7c2e379a1ea-etc-swift") pod "swift-storage-0" (UID: "be5ed3b2-bc48-4865-ade5-f7c2e379a1ea") : configmap "swift-ring-files" not found Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.406727 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/be5ed3b2-bc48-4865-ade5-f7c2e379a1ea-cache\") pod \"swift-storage-0\" (UID: \"be5ed3b2-bc48-4865-ade5-f7c2e379a1ea\") " pod="openstack/swift-storage-0" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.406926 4904 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"be5ed3b2-bc48-4865-ade5-f7c2e379a1ea\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/swift-storage-0" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.407095 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/be5ed3b2-bc48-4865-ade5-f7c2e379a1ea-lock\") pod \"swift-storage-0\" (UID: \"be5ed3b2-bc48-4865-ade5-f7c2e379a1ea\") " pod="openstack/swift-storage-0" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.407126 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/be5ed3b2-bc48-4865-ade5-f7c2e379a1ea-cache\") pod \"swift-storage-0\" (UID: \"be5ed3b2-bc48-4865-ade5-f7c2e379a1ea\") " pod="openstack/swift-storage-0" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.428248 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd9j6\" (UniqueName: \"kubernetes.io/projected/be5ed3b2-bc48-4865-ade5-f7c2e379a1ea-kube-api-access-cd9j6\") pod \"swift-storage-0\" (UID: \"be5ed3b2-bc48-4865-ade5-f7c2e379a1ea\") " pod="openstack/swift-storage-0" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.433206 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"be5ed3b2-bc48-4865-ade5-f7c2e379a1ea\") " pod="openstack/swift-storage-0" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.732031 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-bd4bb"] Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.733766 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-bd4bb" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.768097 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-bd4bb"] Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.768695 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.768978 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.769164 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.786220 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-bd4bb"] Dec 05 20:31:08 crc kubenswrapper[4904]: E1205 20:31:08.807153 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-d7w7v ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-d7w7v ring-data-devices scripts swiftconf]: context canceled" pod="openstack/swift-ring-rebalance-bd4bb" podUID="a09441d1-761f-4cae-8c0a-eb9df4797f48" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.812463 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a09441d1-761f-4cae-8c0a-eb9df4797f48-dispersionconf\") pod \"swift-ring-rebalance-bd4bb\" (UID: \"a09441d1-761f-4cae-8c0a-eb9df4797f48\") " pod="openstack/swift-ring-rebalance-bd4bb" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.812517 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a09441d1-761f-4cae-8c0a-eb9df4797f48-etc-swift\") pod \"swift-ring-rebalance-bd4bb\" (UID: \"a09441d1-761f-4cae-8c0a-eb9df4797f48\") " pod="openstack/swift-ring-rebalance-bd4bb" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.812537 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a09441d1-761f-4cae-8c0a-eb9df4797f48-ring-data-devices\") pod \"swift-ring-rebalance-bd4bb\" (UID: \"a09441d1-761f-4cae-8c0a-eb9df4797f48\") " pod="openstack/swift-ring-rebalance-bd4bb" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.812566 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a09441d1-761f-4cae-8c0a-eb9df4797f48-scripts\") pod \"swift-ring-rebalance-bd4bb\" (UID: \"a09441d1-761f-4cae-8c0a-eb9df4797f48\") " pod="openstack/swift-ring-rebalance-bd4bb" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.812587 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a09441d1-761f-4cae-8c0a-eb9df4797f48-swiftconf\") pod \"swift-ring-rebalance-bd4bb\" (UID: \"a09441d1-761f-4cae-8c0a-eb9df4797f48\") " pod="openstack/swift-ring-rebalance-bd4bb" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.812605 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-d7w7v\" (UniqueName: \"kubernetes.io/projected/a09441d1-761f-4cae-8c0a-eb9df4797f48-kube-api-access-d7w7v\") pod \"swift-ring-rebalance-bd4bb\" (UID: \"a09441d1-761f-4cae-8c0a-eb9df4797f48\") " pod="openstack/swift-ring-rebalance-bd4bb" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.812654 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a09441d1-761f-4cae-8c0a-eb9df4797f48-combined-ca-bundle\") pod \"swift-ring-rebalance-bd4bb\" (UID: \"a09441d1-761f-4cae-8c0a-eb9df4797f48\") " pod="openstack/swift-ring-rebalance-bd4bb" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.817192 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-9xq46"] Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.818455 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-9xq46" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.835623 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-9xq46"] Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.860271 4904 generic.go:334] "Generic (PLEG): container finished" podID="e2a3f1f6-8ec4-4b73-94a1-78f12808fe55" containerID="f172fe57cf479e24045d628f4b8a29122b29ed742dc96f3af2f99b250799ad36" exitCode=0 Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.860332 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d4d549fbf-m45vw" event={"ID":"e2a3f1f6-8ec4-4b73-94a1-78f12808fe55","Type":"ContainerDied","Data":"f172fe57cf479e24045d628f4b8a29122b29ed742dc96f3af2f99b250799ad36"} Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.860357 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d4d549fbf-m45vw" event={"ID":"e2a3f1f6-8ec4-4b73-94a1-78f12808fe55","Type":"ContainerStarted","Data":"6a318139a8255f79de5c859bc16282894c91042acdd7072586ffb99c61e75470"} Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.862201 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e08506d4-1ca7-4932-b73f-21020cb20578","Type":"ContainerStarted","Data":"4e740a73c8f5c9f0dd574ec681c78e5a6f7acdba1829bad677f35495d122050c"} Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.864874 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86578b98fc-xpmpj" event={"ID":"6b4c1024-2b13-4877-8f60-4833f9d3e8df","Type":"ContainerDied","Data":"7949dcfb06d96b9c07ddfd05bce4d24ea100a4bc538bc9d577311d32e4cad04a"} Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.864931 4904 scope.go:117] "RemoveContainer" containerID="e8eff6fece62ab87a9769f34fda77ba4aa4b1c10a43a4b0cf8bf238675132333" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.864896 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86578b98fc-xpmpj" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.879887 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-bd4bb" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.879911 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c241183e-3f0b-4dc2-88d2-ea990c6dc71d","Type":"ContainerStarted","Data":"bc6bd55a707cd7c462889940bc35e995374b4c3995de80745ca6bb1ed256e0d0"} Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.913527 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b80a797-4212-4242-81fd-928045b629cd-scripts\") pod \"swift-ring-rebalance-9xq46\" (UID: \"4b80a797-4212-4242-81fd-928045b629cd\") " pod="openstack/swift-ring-rebalance-9xq46" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.913895 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a09441d1-761f-4cae-8c0a-eb9df4797f48-dispersionconf\") pod \"swift-ring-rebalance-bd4bb\" (UID: \"a09441d1-761f-4cae-8c0a-eb9df4797f48\") " pod="openstack/swift-ring-rebalance-bd4bb" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.913960 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a09441d1-761f-4cae-8c0a-eb9df4797f48-etc-swift\") pod \"swift-ring-rebalance-bd4bb\" (UID: \"a09441d1-761f-4cae-8c0a-eb9df4797f48\") " pod="openstack/swift-ring-rebalance-bd4bb" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.913980 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a09441d1-761f-4cae-8c0a-eb9df4797f48-ring-data-devices\") pod \"swift-ring-rebalance-bd4bb\" (UID: \"a09441d1-761f-4cae-8c0a-eb9df4797f48\") " pod="openstack/swift-ring-rebalance-bd4bb" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.914006 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b80a797-4212-4242-81fd-928045b629cd-combined-ca-bundle\") pod \"swift-ring-rebalance-9xq46\" (UID: \"4b80a797-4212-4242-81fd-928045b629cd\") " pod="openstack/swift-ring-rebalance-9xq46" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.914038 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4b80a797-4212-4242-81fd-928045b629cd-etc-swift\") pod \"swift-ring-rebalance-9xq46\" (UID: \"4b80a797-4212-4242-81fd-928045b629cd\") " pod="openstack/swift-ring-rebalance-9xq46" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.914084 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4b80a797-4212-4242-81fd-928045b629cd-ring-data-devices\") pod \"swift-ring-rebalance-9xq46\" (UID: \"4b80a797-4212-4242-81fd-928045b629cd\") " pod="openstack/swift-ring-rebalance-9xq46" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.914112 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a09441d1-761f-4cae-8c0a-eb9df4797f48-scripts\") pod \"swift-ring-rebalance-bd4bb\" (UID: \"a09441d1-761f-4cae-8c0a-eb9df4797f48\") " pod="openstack/swift-ring-rebalance-bd4bb" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 
20:31:08.914134 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwp9b\" (UniqueName: \"kubernetes.io/projected/4b80a797-4212-4242-81fd-928045b629cd-kube-api-access-fwp9b\") pod \"swift-ring-rebalance-9xq46\" (UID: \"4b80a797-4212-4242-81fd-928045b629cd\") " pod="openstack/swift-ring-rebalance-9xq46" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.914164 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a09441d1-761f-4cae-8c0a-eb9df4797f48-swiftconf\") pod \"swift-ring-rebalance-bd4bb\" (UID: \"a09441d1-761f-4cae-8c0a-eb9df4797f48\") " pod="openstack/swift-ring-rebalance-bd4bb" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.914191 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7w7v\" (UniqueName: \"kubernetes.io/projected/a09441d1-761f-4cae-8c0a-eb9df4797f48-kube-api-access-d7w7v\") pod \"swift-ring-rebalance-bd4bb\" (UID: \"a09441d1-761f-4cae-8c0a-eb9df4797f48\") " pod="openstack/swift-ring-rebalance-bd4bb" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.914232 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/be5ed3b2-bc48-4865-ade5-f7c2e379a1ea-etc-swift\") pod \"swift-storage-0\" (UID: \"be5ed3b2-bc48-4865-ade5-f7c2e379a1ea\") " pod="openstack/swift-storage-0" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.914263 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4b80a797-4212-4242-81fd-928045b629cd-dispersionconf\") pod \"swift-ring-rebalance-9xq46\" (UID: \"4b80a797-4212-4242-81fd-928045b629cd\") " pod="openstack/swift-ring-rebalance-9xq46" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.914306 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a09441d1-761f-4cae-8c0a-eb9df4797f48-combined-ca-bundle\") pod \"swift-ring-rebalance-bd4bb\" (UID: \"a09441d1-761f-4cae-8c0a-eb9df4797f48\") " pod="openstack/swift-ring-rebalance-bd4bb" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.914374 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4b80a797-4212-4242-81fd-928045b629cd-swiftconf\") pod \"swift-ring-rebalance-9xq46\" (UID: \"4b80a797-4212-4242-81fd-928045b629cd\") " pod="openstack/swift-ring-rebalance-9xq46" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.915952 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a09441d1-761f-4cae-8c0a-eb9df4797f48-scripts\") pod \"swift-ring-rebalance-bd4bb\" (UID: \"a09441d1-761f-4cae-8c0a-eb9df4797f48\") " pod="openstack/swift-ring-rebalance-bd4bb" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.917763 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a09441d1-761f-4cae-8c0a-eb9df4797f48-ring-data-devices\") pod \"swift-ring-rebalance-bd4bb\" (UID: \"a09441d1-761f-4cae-8c0a-eb9df4797f48\") " pod="openstack/swift-ring-rebalance-bd4bb" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.917862 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a09441d1-761f-4cae-8c0a-eb9df4797f48-etc-swift\") pod \"swift-ring-rebalance-bd4bb\" (UID: \"a09441d1-761f-4cae-8c0a-eb9df4797f48\") " pod="openstack/swift-ring-rebalance-bd4bb" Dec 05 20:31:08 crc kubenswrapper[4904]: E1205 20:31:08.918117 4904 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 20:31:08 crc kubenswrapper[4904]: E1205 20:31:08.918136 4904 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 20:31:08 crc kubenswrapper[4904]: E1205 20:31:08.918173 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/be5ed3b2-bc48-4865-ade5-f7c2e379a1ea-etc-swift podName:be5ed3b2-bc48-4865-ade5-f7c2e379a1ea nodeName:}" failed. No retries permitted until 2025-12-05 20:31:09.91815865 +0000 UTC m=+1168.729374759 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/be5ed3b2-bc48-4865-ade5-f7c2e379a1ea-etc-swift") pod "swift-storage-0" (UID: "be5ed3b2-bc48-4865-ade5-f7c2e379a1ea") : configmap "swift-ring-files" not found Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.918734 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=-9223371978.93606 podStartE2EDuration="57.918716055s" podCreationTimestamp="2025-12-05 20:30:11 +0000 UTC" firstStartedPulling="2025-12-05 20:30:13.320638082 +0000 UTC m=+1112.131854191" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:31:08.913800249 +0000 UTC m=+1167.725016368" watchObservedRunningTime="2025-12-05 20:31:08.918716055 +0000 UTC m=+1167.729932164" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.919586 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a09441d1-761f-4cae-8c0a-eb9df4797f48-swiftconf\") pod \"swift-ring-rebalance-bd4bb\" (UID: \"a09441d1-761f-4cae-8c0a-eb9df4797f48\") " pod="openstack/swift-ring-rebalance-bd4bb" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.922287 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a09441d1-761f-4cae-8c0a-eb9df4797f48-dispersionconf\") pod \"swift-ring-rebalance-bd4bb\" (UID: \"a09441d1-761f-4cae-8c0a-eb9df4797f48\") " pod="openstack/swift-ring-rebalance-bd4bb" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.931752 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a09441d1-761f-4cae-8c0a-eb9df4797f48-combined-ca-bundle\") pod \"swift-ring-rebalance-bd4bb\" (UID: \"a09441d1-761f-4cae-8c0a-eb9df4797f48\") " pod="openstack/swift-ring-rebalance-bd4bb" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.940926 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7w7v\" (UniqueName: \"kubernetes.io/projected/a09441d1-761f-4cae-8c0a-eb9df4797f48-kube-api-access-d7w7v\") pod \"swift-ring-rebalance-bd4bb\" (UID: \"a09441d1-761f-4cae-8c0a-eb9df4797f48\") " pod="openstack/swift-ring-rebalance-bd4bb" Dec 05 20:31:08 crc kubenswrapper[4904]: I1205 20:31:08.992941 4904 scope.go:117] "RemoveContainer" containerID="cca5c730f9a7ad471de36beb5592f757a38de909d36427d20afb414169e77dc2" Dec 05 20:31:09 crc 
kubenswrapper[4904]: I1205 20:31:09.015747 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b80a797-4212-4242-81fd-928045b629cd-scripts\") pod \"swift-ring-rebalance-9xq46\" (UID: \"4b80a797-4212-4242-81fd-928045b629cd\") " pod="openstack/swift-ring-rebalance-9xq46" Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.015823 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b80a797-4212-4242-81fd-928045b629cd-combined-ca-bundle\") pod \"swift-ring-rebalance-9xq46\" (UID: \"4b80a797-4212-4242-81fd-928045b629cd\") " pod="openstack/swift-ring-rebalance-9xq46" Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.015855 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4b80a797-4212-4242-81fd-928045b629cd-etc-swift\") pod \"swift-ring-rebalance-9xq46\" (UID: \"4b80a797-4212-4242-81fd-928045b629cd\") " pod="openstack/swift-ring-rebalance-9xq46" Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.015872 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4b80a797-4212-4242-81fd-928045b629cd-ring-data-devices\") pod \"swift-ring-rebalance-9xq46\" (UID: \"4b80a797-4212-4242-81fd-928045b629cd\") " pod="openstack/swift-ring-rebalance-9xq46" Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.015889 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwp9b\" (UniqueName: \"kubernetes.io/projected/4b80a797-4212-4242-81fd-928045b629cd-kube-api-access-fwp9b\") pod \"swift-ring-rebalance-9xq46\" (UID: \"4b80a797-4212-4242-81fd-928045b629cd\") " pod="openstack/swift-ring-rebalance-9xq46" Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.015946 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4b80a797-4212-4242-81fd-928045b629cd-dispersionconf\") pod \"swift-ring-rebalance-9xq46\" (UID: \"4b80a797-4212-4242-81fd-928045b629cd\") " pod="openstack/swift-ring-rebalance-9xq46" Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.015996 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4b80a797-4212-4242-81fd-928045b629cd-swiftconf\") pod \"swift-ring-rebalance-9xq46\" (UID: \"4b80a797-4212-4242-81fd-928045b629cd\") " pod="openstack/swift-ring-rebalance-9xq46" Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.017245 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b80a797-4212-4242-81fd-928045b629cd-scripts\") pod \"swift-ring-rebalance-9xq46\" (UID: \"4b80a797-4212-4242-81fd-928045b629cd\") " pod="openstack/swift-ring-rebalance-9xq46" Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.018622 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-bd4bb" Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.018669 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4b80a797-4212-4242-81fd-928045b629cd-etc-swift\") pod \"swift-ring-rebalance-9xq46\" (UID: \"4b80a797-4212-4242-81fd-928045b629cd\") " pod="openstack/swift-ring-rebalance-9xq46" Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.019561 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4b80a797-4212-4242-81fd-928045b629cd-ring-data-devices\") pod \"swift-ring-rebalance-9xq46\" (UID: \"4b80a797-4212-4242-81fd-928045b629cd\") " pod="openstack/swift-ring-rebalance-9xq46" Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.021405 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4b80a797-4212-4242-81fd-928045b629cd-swiftconf\") pod \"swift-ring-rebalance-9xq46\" (UID: \"4b80a797-4212-4242-81fd-928045b629cd\") " pod="openstack/swift-ring-rebalance-9xq46" Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.021850 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4b80a797-4212-4242-81fd-928045b629cd-dispersionconf\") pod \"swift-ring-rebalance-9xq46\" (UID: \"4b80a797-4212-4242-81fd-928045b629cd\") " pod="openstack/swift-ring-rebalance-9xq46" Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.039563 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwp9b\" (UniqueName: \"kubernetes.io/projected/4b80a797-4212-4242-81fd-928045b629cd-kube-api-access-fwp9b\") pod \"swift-ring-rebalance-9xq46\" (UID: \"4b80a797-4212-4242-81fd-928045b629cd\") " pod="openstack/swift-ring-rebalance-9xq46" Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.045028 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86578b98fc-xpmpj"] Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.051656 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b80a797-4212-4242-81fd-928045b629cd-combined-ca-bundle\") pod \"swift-ring-rebalance-9xq46\" (UID: \"4b80a797-4212-4242-81fd-928045b629cd\") " pod="openstack/swift-ring-rebalance-9xq46" Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.053793 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86578b98fc-xpmpj"] Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.117454 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a09441d1-761f-4cae-8c0a-eb9df4797f48-scripts\") pod \"a09441d1-761f-4cae-8c0a-eb9df4797f48\" (UID: \"a09441d1-761f-4cae-8c0a-eb9df4797f48\") " Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.117546 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a09441d1-761f-4cae-8c0a-eb9df4797f48-dispersionconf\") pod \"a09441d1-761f-4cae-8c0a-eb9df4797f48\" (UID: \"a09441d1-761f-4cae-8c0a-eb9df4797f48\") " Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.117625 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7w7v\" (UniqueName: 
\"kubernetes.io/projected/a09441d1-761f-4cae-8c0a-eb9df4797f48-kube-api-access-d7w7v\") pod \"a09441d1-761f-4cae-8c0a-eb9df4797f48\" (UID: \"a09441d1-761f-4cae-8c0a-eb9df4797f48\") " Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.117669 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a09441d1-761f-4cae-8c0a-eb9df4797f48-etc-swift\") pod \"a09441d1-761f-4cae-8c0a-eb9df4797f48\" (UID: \"a09441d1-761f-4cae-8c0a-eb9df4797f48\") " Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.117695 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a09441d1-761f-4cae-8c0a-eb9df4797f48-swiftconf\") pod \"a09441d1-761f-4cae-8c0a-eb9df4797f48\" (UID: \"a09441d1-761f-4cae-8c0a-eb9df4797f48\") " Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.117758 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a09441d1-761f-4cae-8c0a-eb9df4797f48-combined-ca-bundle\") pod \"a09441d1-761f-4cae-8c0a-eb9df4797f48\" (UID: \"a09441d1-761f-4cae-8c0a-eb9df4797f48\") " Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.117967 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a09441d1-761f-4cae-8c0a-eb9df4797f48-scripts" (OuterVolumeSpecName: "scripts") pod "a09441d1-761f-4cae-8c0a-eb9df4797f48" (UID: "a09441d1-761f-4cae-8c0a-eb9df4797f48"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.118076 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a09441d1-761f-4cae-8c0a-eb9df4797f48-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "a09441d1-761f-4cae-8c0a-eb9df4797f48" (UID: "a09441d1-761f-4cae-8c0a-eb9df4797f48"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.118408 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a09441d1-761f-4cae-8c0a-eb9df4797f48-ring-data-devices\") pod \"a09441d1-761f-4cae-8c0a-eb9df4797f48\" (UID: \"a09441d1-761f-4cae-8c0a-eb9df4797f48\") " Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.118755 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a09441d1-761f-4cae-8c0a-eb9df4797f48-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "a09441d1-761f-4cae-8c0a-eb9df4797f48" (UID: "a09441d1-761f-4cae-8c0a-eb9df4797f48"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.119253 4904 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a09441d1-761f-4cae-8c0a-eb9df4797f48-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.119271 4904 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a09441d1-761f-4cae-8c0a-eb9df4797f48-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.119281 4904 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a09441d1-761f-4cae-8c0a-eb9df4797f48-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.120852 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a09441d1-761f-4cae-8c0a-eb9df4797f48-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "a09441d1-761f-4cae-8c0a-eb9df4797f48" (UID: "a09441d1-761f-4cae-8c0a-eb9df4797f48"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.123173 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a09441d1-761f-4cae-8c0a-eb9df4797f48-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a09441d1-761f-4cae-8c0a-eb9df4797f48" (UID: "a09441d1-761f-4cae-8c0a-eb9df4797f48"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.123206 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a09441d1-761f-4cae-8c0a-eb9df4797f48-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "a09441d1-761f-4cae-8c0a-eb9df4797f48" (UID: "a09441d1-761f-4cae-8c0a-eb9df4797f48"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.123215 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a09441d1-761f-4cae-8c0a-eb9df4797f48-kube-api-access-d7w7v" (OuterVolumeSpecName: "kube-api-access-d7w7v") pod "a09441d1-761f-4cae-8c0a-eb9df4797f48" (UID: "a09441d1-761f-4cae-8c0a-eb9df4797f48"). InnerVolumeSpecName "kube-api-access-d7w7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.139734 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-9xq46" Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.247975 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7w7v\" (UniqueName: \"kubernetes.io/projected/a09441d1-761f-4cae-8c0a-eb9df4797f48-kube-api-access-d7w7v\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.248024 4904 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a09441d1-761f-4cae-8c0a-eb9df4797f48-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.248037 4904 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a09441d1-761f-4cae-8c0a-eb9df4797f48-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.248048 4904 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a09441d1-761f-4cae-8c0a-eb9df4797f48-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.606339 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-9xq46"] Dec 05 20:31:09 crc kubenswrapper[4904]: W1205 20:31:09.611683 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b80a797_4212_4242_81fd_928045b629cd.slice/crio-ac9b9bb7daf1831e3e48a83dcb12c614c10a0369140f036d6214d72bbf565933 WatchSource:0}: Error finding container ac9b9bb7daf1831e3e48a83dcb12c614c10a0369140f036d6214d72bbf565933: Status 404 returned error can't find the container with id ac9b9bb7daf1831e3e48a83dcb12c614c10a0369140f036d6214d72bbf565933 Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.696677 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b4c1024-2b13-4877-8f60-4833f9d3e8df" path="/var/lib/kubelet/pods/6b4c1024-2b13-4877-8f60-4833f9d3e8df/volumes" Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.888233 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-9xq46" event={"ID":"4b80a797-4212-4242-81fd-928045b629cd","Type":"ContainerStarted","Data":"ac9b9bb7daf1831e3e48a83dcb12c614c10a0369140f036d6214d72bbf565933"} Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.891642 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d4d549fbf-m45vw" event={"ID":"e2a3f1f6-8ec4-4b73-94a1-78f12808fe55","Type":"ContainerStarted","Data":"35f4a9689c718072f5071f4ffac0118023242067b5cf212b6fb449cf01fb3a74"} Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.891705 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d4d549fbf-m45vw" Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.893482 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-bd4bb" Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.918697 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d4d549fbf-m45vw" podStartSLOduration=2.918676673 podStartE2EDuration="2.918676673s" podCreationTimestamp="2025-12-05 20:31:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:31:09.910543228 +0000 UTC m=+1168.721759337" watchObservedRunningTime="2025-12-05 20:31:09.918676673 +0000 UTC m=+1168.729892782" Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.947411 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-bd4bb"] Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.954569 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-bd4bb"] Dec 05 20:31:09 crc kubenswrapper[4904]: I1205 20:31:09.958163 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/be5ed3b2-bc48-4865-ade5-f7c2e379a1ea-etc-swift\") pod \"swift-storage-0\" (UID: \"be5ed3b2-bc48-4865-ade5-f7c2e379a1ea\") " pod="openstack/swift-storage-0" Dec 05 20:31:09 crc kubenswrapper[4904]: E1205 20:31:09.958423 4904 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 20:31:09 crc kubenswrapper[4904]: E1205 20:31:09.958456 4904 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 20:31:09 crc kubenswrapper[4904]: E1205 20:31:09.958508 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/be5ed3b2-bc48-4865-ade5-f7c2e379a1ea-etc-swift podName:be5ed3b2-bc48-4865-ade5-f7c2e379a1ea nodeName:}" failed. No retries permitted until 2025-12-05 20:31:11.958488299 +0000 UTC m=+1170.769704408 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/be5ed3b2-bc48-4865-ade5-f7c2e379a1ea-etc-swift") pod "swift-storage-0" (UID: "be5ed3b2-bc48-4865-ade5-f7c2e379a1ea") : configmap "swift-ring-files" not found Dec 05 20:31:10 crc kubenswrapper[4904]: I1205 20:31:10.927265 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c241183e-3f0b-4dc2-88d2-ea990c6dc71d","Type":"ContainerStarted","Data":"9bd6982949de74604d2b98a2e96167ee2081d2881e85a4cf5ec787dfedaa9751"} Dec 05 20:31:11 crc kubenswrapper[4904]: I1205 20:31:11.694490 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a09441d1-761f-4cae-8c0a-eb9df4797f48" path="/var/lib/kubelet/pods/a09441d1-761f-4cae-8c0a-eb9df4797f48/volumes" Dec 05 20:31:11 crc kubenswrapper[4904]: I1205 20:31:11.989780 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/be5ed3b2-bc48-4865-ade5-f7c2e379a1ea-etc-swift\") pod \"swift-storage-0\" (UID: \"be5ed3b2-bc48-4865-ade5-f7c2e379a1ea\") " pod="openstack/swift-storage-0" Dec 05 20:31:11 crc kubenswrapper[4904]: E1205 20:31:11.990010 4904 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 20:31:11 crc kubenswrapper[4904]: E1205 20:31:11.990031 4904 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 20:31:11 crc kubenswrapper[4904]: E1205 20:31:11.990099 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/be5ed3b2-bc48-4865-ade5-f7c2e379a1ea-etc-swift podName:be5ed3b2-bc48-4865-ade5-f7c2e379a1ea nodeName:}" failed. No retries permitted until 2025-12-05 20:31:15.990081745 +0000 UTC m=+1174.801297854 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/be5ed3b2-bc48-4865-ade5-f7c2e379a1ea-etc-swift") pod "swift-storage-0" (UID: "be5ed3b2-bc48-4865-ade5-f7c2e379a1ea") : configmap "swift-ring-files" not found Dec 05 20:31:12 crc kubenswrapper[4904]: I1205 20:31:12.782906 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 05 20:31:12 crc kubenswrapper[4904]: I1205 20:31:12.783252 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 05 20:31:12 crc kubenswrapper[4904]: I1205 20:31:12.904792 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 05 20:31:12 crc kubenswrapper[4904]: I1205 20:31:12.942937 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-9xq46" event={"ID":"4b80a797-4212-4242-81fd-928045b629cd","Type":"ContainerStarted","Data":"4ed1029d0887f45b89cad83436ce354b990d2e271be387d893f8305d23e56642"} Dec 05 20:31:12 crc kubenswrapper[4904]: I1205 20:31:12.965232 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-9xq46" podStartSLOduration=2.423571073 podStartE2EDuration="4.965209933s" podCreationTimestamp="2025-12-05 20:31:08 +0000 UTC" firstStartedPulling="2025-12-05 20:31:09.613644152 +0000 UTC m=+1168.424860261" lastFinishedPulling="2025-12-05 20:31:12.155283012 +0000 UTC m=+1170.966499121" observedRunningTime="2025-12-05 20:31:12.959868445 +0000 UTC m=+1171.771084564" watchObservedRunningTime="2025-12-05 20:31:12.965209933 +0000 UTC m=+1171.776426042" Dec 05 20:31:13 crc kubenswrapper[4904]: I1205 20:31:13.055574 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 05 20:31:13 crc kubenswrapper[4904]: I1205 20:31:13.621034 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 05 20:31:13 crc kubenswrapper[4904]: I1205 20:31:13.977583 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6a25-account-create-update-tn8bx"] Dec 05 20:31:13 crc kubenswrapper[4904]: I1205 20:31:13.978968 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6a25-account-create-update-tn8bx" Dec 05 20:31:13 crc kubenswrapper[4904]: I1205 20:31:13.981728 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 05 20:31:13 crc kubenswrapper[4904]: I1205 20:31:13.986781 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6a25-account-create-update-tn8bx"] Dec 05 20:31:14 crc kubenswrapper[4904]: I1205 20:31:14.050517 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-pmt5n"] Dec 05 20:31:14 crc kubenswrapper[4904]: I1205 20:31:14.051575 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-pmt5n" Dec 05 20:31:14 crc kubenswrapper[4904]: I1205 20:31:14.058940 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-pmt5n"] Dec 05 20:31:14 crc kubenswrapper[4904]: I1205 20:31:14.131824 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fc955d4-a981-42e4-b327-558d95c0a9c0-operator-scripts\") pod \"keystone-6a25-account-create-update-tn8bx\" (UID: \"8fc955d4-a981-42e4-b327-558d95c0a9c0\") " pod="openstack/keystone-6a25-account-create-update-tn8bx" Dec 05 20:31:14 crc kubenswrapper[4904]: I1205 20:31:14.131938 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z9b2\" (UniqueName: \"kubernetes.io/projected/8fc955d4-a981-42e4-b327-558d95c0a9c0-kube-api-access-4z9b2\") pod \"keystone-6a25-account-create-update-tn8bx\" (UID: \"8fc955d4-a981-42e4-b327-558d95c0a9c0\") " pod="openstack/keystone-6a25-account-create-update-tn8bx" Dec 05 20:31:14 crc kubenswrapper[4904]: I1205 20:31:14.235895 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0e2f8b9-44dd-4a78-9ce8-8e9f401be67e-operator-scripts\") pod \"keystone-db-create-pmt5n\" (UID: \"d0e2f8b9-44dd-4a78-9ce8-8e9f401be67e\") " pod="openstack/keystone-db-create-pmt5n" Dec 05 20:31:14 crc kubenswrapper[4904]: I1205 20:31:14.236260 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4z9b2\" (UniqueName: \"kubernetes.io/projected/8fc955d4-a981-42e4-b327-558d95c0a9c0-kube-api-access-4z9b2\") pod \"keystone-6a25-account-create-update-tn8bx\" (UID: \"8fc955d4-a981-42e4-b327-558d95c0a9c0\") " pod="openstack/keystone-6a25-account-create-update-tn8bx" Dec 05 20:31:14 crc kubenswrapper[4904]: I1205 20:31:14.236401 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdps4\" (UniqueName: \"kubernetes.io/projected/d0e2f8b9-44dd-4a78-9ce8-8e9f401be67e-kube-api-access-sdps4\") pod \"keystone-db-create-pmt5n\" (UID: \"d0e2f8b9-44dd-4a78-9ce8-8e9f401be67e\") " pod="openstack/keystone-db-create-pmt5n" Dec 05 20:31:14 crc kubenswrapper[4904]: I1205 20:31:14.236487 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fc955d4-a981-42e4-b327-558d95c0a9c0-operator-scripts\") pod \"keystone-6a25-account-create-update-tn8bx\" (UID: \"8fc955d4-a981-42e4-b327-558d95c0a9c0\") " pod="openstack/keystone-6a25-account-create-update-tn8bx" Dec 05 20:31:14 crc kubenswrapper[4904]: I1205 20:31:14.242712 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fc955d4-a981-42e4-b327-558d95c0a9c0-operator-scripts\") pod \"keystone-6a25-account-create-update-tn8bx\" (UID: \"8fc955d4-a981-42e4-b327-558d95c0a9c0\") " pod="openstack/keystone-6a25-account-create-update-tn8bx" Dec 05 20:31:14 crc kubenswrapper[4904]: I1205 20:31:14.251177 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-6kxzb"] Dec 05 20:31:14 crc kubenswrapper[4904]: I1205 20:31:14.252556 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-6kxzb" Dec 05 20:31:14 crc kubenswrapper[4904]: I1205 20:31:14.259719 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-6kxzb"] Dec 05 20:31:14 crc kubenswrapper[4904]: I1205 20:31:14.261746 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z9b2\" (UniqueName: \"kubernetes.io/projected/8fc955d4-a981-42e4-b327-558d95c0a9c0-kube-api-access-4z9b2\") pod \"keystone-6a25-account-create-update-tn8bx\" (UID: \"8fc955d4-a981-42e4-b327-558d95c0a9c0\") " pod="openstack/keystone-6a25-account-create-update-tn8bx" Dec 05 20:31:14 crc kubenswrapper[4904]: I1205 20:31:14.338134 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdps4\" (UniqueName: \"kubernetes.io/projected/d0e2f8b9-44dd-4a78-9ce8-8e9f401be67e-kube-api-access-sdps4\") pod \"keystone-db-create-pmt5n\" (UID: \"d0e2f8b9-44dd-4a78-9ce8-8e9f401be67e\") " pod="openstack/keystone-db-create-pmt5n" Dec 05 20:31:14 crc kubenswrapper[4904]: I1205 20:31:14.338268 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0e2f8b9-44dd-4a78-9ce8-8e9f401be67e-operator-scripts\") pod \"keystone-db-create-pmt5n\" (UID: \"d0e2f8b9-44dd-4a78-9ce8-8e9f401be67e\") " pod="openstack/keystone-db-create-pmt5n" Dec 05 20:31:14 crc kubenswrapper[4904]: I1205 20:31:14.339166 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0e2f8b9-44dd-4a78-9ce8-8e9f401be67e-operator-scripts\") pod \"keystone-db-create-pmt5n\" (UID: \"d0e2f8b9-44dd-4a78-9ce8-8e9f401be67e\") " pod="openstack/keystone-db-create-pmt5n" Dec 05 20:31:14 crc kubenswrapper[4904]: I1205 20:31:14.344336 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6a25-account-create-update-tn8bx" Dec 05 20:31:14 crc kubenswrapper[4904]: I1205 20:31:14.347613 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-e624-account-create-update-x5jhz"] Dec 05 20:31:14 crc kubenswrapper[4904]: I1205 20:31:14.349008 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-e624-account-create-update-x5jhz" Dec 05 20:31:14 crc kubenswrapper[4904]: I1205 20:31:14.351406 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 05 20:31:14 crc kubenswrapper[4904]: I1205 20:31:14.358477 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-e624-account-create-update-x5jhz"] Dec 05 20:31:14 crc kubenswrapper[4904]: I1205 20:31:14.370678 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdps4\" (UniqueName: \"kubernetes.io/projected/d0e2f8b9-44dd-4a78-9ce8-8e9f401be67e-kube-api-access-sdps4\") pod \"keystone-db-create-pmt5n\" (UID: \"d0e2f8b9-44dd-4a78-9ce8-8e9f401be67e\") " pod="openstack/keystone-db-create-pmt5n" Dec 05 20:31:14 crc kubenswrapper[4904]: I1205 20:31:14.372671 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-pmt5n" Dec 05 20:31:14 crc kubenswrapper[4904]: I1205 20:31:14.440042 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shdls\" (UniqueName: \"kubernetes.io/projected/7c5a51a5-3628-495b-a5c2-f29846ae9eb4-kube-api-access-shdls\") pod \"placement-db-create-6kxzb\" (UID: \"7c5a51a5-3628-495b-a5c2-f29846ae9eb4\") " pod="openstack/placement-db-create-6kxzb" Dec 05 20:31:14 crc kubenswrapper[4904]: I1205 20:31:14.440187 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm7jf\" (UniqueName: \"kubernetes.io/projected/a294d371-9e50-48f4-9774-bfb2f8014ea6-kube-api-access-sm7jf\") pod \"placement-e624-account-create-update-x5jhz\" (UID: \"a294d371-9e50-48f4-9774-bfb2f8014ea6\") " pod="openstack/placement-e624-account-create-update-x5jhz" Dec 05 20:31:14 crc kubenswrapper[4904]: I1205 20:31:14.440321 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a294d371-9e50-48f4-9774-bfb2f8014ea6-operator-scripts\") pod \"placement-e624-account-create-update-x5jhz\" (UID: \"a294d371-9e50-48f4-9774-bfb2f8014ea6\") " pod="openstack/placement-e624-account-create-update-x5jhz" Dec 05 20:31:14 crc kubenswrapper[4904]: I1205 20:31:14.440392 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c5a51a5-3628-495b-a5c2-f29846ae9eb4-operator-scripts\") pod \"placement-db-create-6kxzb\" (UID: \"7c5a51a5-3628-495b-a5c2-f29846ae9eb4\") " pod="openstack/placement-db-create-6kxzb" Dec 05 20:31:14 crc kubenswrapper[4904]: I1205 20:31:14.541914 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sm7jf\" (UniqueName: \"kubernetes.io/projected/a294d371-9e50-48f4-9774-bfb2f8014ea6-kube-api-access-sm7jf\") pod \"placement-e624-account-create-update-x5jhz\" (UID: \"a294d371-9e50-48f4-9774-bfb2f8014ea6\") " pod="openstack/placement-e624-account-create-update-x5jhz" Dec 05 20:31:14 crc kubenswrapper[4904]: I1205 20:31:14.542660 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a294d371-9e50-48f4-9774-bfb2f8014ea6-operator-scripts\") pod \"placement-e624-account-create-update-x5jhz\" (UID: \"a294d371-9e50-48f4-9774-bfb2f8014ea6\") " pod="openstack/placement-e624-account-create-update-x5jhz" Dec 05 20:31:14 crc kubenswrapper[4904]: I1205 20:31:14.542698 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c5a51a5-3628-495b-a5c2-f29846ae9eb4-operator-scripts\") pod \"placement-db-create-6kxzb\" (UID: \"7c5a51a5-3628-495b-a5c2-f29846ae9eb4\") " pod="openstack/placement-db-create-6kxzb" Dec 05 20:31:14 crc kubenswrapper[4904]: I1205 20:31:14.542795 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shdls\" (UniqueName: \"kubernetes.io/projected/7c5a51a5-3628-495b-a5c2-f29846ae9eb4-kube-api-access-shdls\") pod \"placement-db-create-6kxzb\" (UID: \"7c5a51a5-3628-495b-a5c2-f29846ae9eb4\") " pod="openstack/placement-db-create-6kxzb" Dec 05 20:31:14 crc kubenswrapper[4904]: I1205 20:31:14.544178 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c5a51a5-3628-495b-a5c2-f29846ae9eb4-operator-scripts\") pod \"placement-db-create-6kxzb\" (UID: \"7c5a51a5-3628-495b-a5c2-f29846ae9eb4\") " pod="openstack/placement-db-create-6kxzb" Dec 05 20:31:14 crc kubenswrapper[4904]: I1205 20:31:14.544596 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a294d371-9e50-48f4-9774-bfb2f8014ea6-operator-scripts\") pod \"placement-e624-account-create-update-x5jhz\" (UID: \"a294d371-9e50-48f4-9774-bfb2f8014ea6\") " pod="openstack/placement-e624-account-create-update-x5jhz" Dec 05 20:31:14 crc kubenswrapper[4904]: I1205 20:31:14.561816 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm7jf\" (UniqueName: \"kubernetes.io/projected/a294d371-9e50-48f4-9774-bfb2f8014ea6-kube-api-access-sm7jf\") pod \"placement-e624-account-create-update-x5jhz\" (UID: \"a294d371-9e50-48f4-9774-bfb2f8014ea6\") " pod="openstack/placement-e624-account-create-update-x5jhz" Dec 05 20:31:14 crc kubenswrapper[4904]: I1205 20:31:14.570527 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shdls\" (UniqueName: \"kubernetes.io/projected/7c5a51a5-3628-495b-a5c2-f29846ae9eb4-kube-api-access-shdls\") pod \"placement-db-create-6kxzb\" (UID: \"7c5a51a5-3628-495b-a5c2-f29846ae9eb4\") " pod="openstack/placement-db-create-6kxzb" Dec 05 20:31:14 crc kubenswrapper[4904]: I1205 20:31:14.658098 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6kxzb" Dec 05 20:31:14 crc kubenswrapper[4904]: I1205 20:31:14.771756 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-e624-account-create-update-x5jhz" Dec 05 20:31:14 crc kubenswrapper[4904]: I1205 20:31:14.829536 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6a25-account-create-update-tn8bx"] Dec 05 20:31:14 crc kubenswrapper[4904]: I1205 20:31:14.937151 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-pmt5n"] Dec 05 20:31:14 crc kubenswrapper[4904]: I1205 20:31:14.964164 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c241183e-3f0b-4dc2-88d2-ea990c6dc71d","Type":"ContainerStarted","Data":"724c237984bc633fedf69b88fe8220d15d3dbd1aa807fdf1295dd0b034d151eb"} Dec 05 20:31:14 crc kubenswrapper[4904]: I1205 20:31:14.965658 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-pmt5n" event={"ID":"d0e2f8b9-44dd-4a78-9ce8-8e9f401be67e","Type":"ContainerStarted","Data":"e683c88e00a7477b0f9087f0bfdef23e25a3caa658b49f75768c6d46e992d563"} Dec 05 20:31:14 crc kubenswrapper[4904]: I1205 20:31:14.966793 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6a25-account-create-update-tn8bx" event={"ID":"8fc955d4-a981-42e4-b327-558d95c0a9c0","Type":"ContainerStarted","Data":"1185b70098ce2f182083b6c65c7ca763a0f0286b7fa8f09c250df2f8ec5b25dc"} Dec 05 20:31:15 crc kubenswrapper[4904]: I1205 20:31:15.000879 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=30.884207728 podStartE2EDuration="59.00085541s" podCreationTimestamp="2025-12-05 20:30:16 +0000 UTC" firstStartedPulling="2025-12-05 20:30:46.079844602 +0000 UTC m=+1144.891060711" lastFinishedPulling="2025-12-05 
20:31:14.196492284 +0000 UTC m=+1173.007708393" observedRunningTime="2025-12-05 20:31:14.99218217 +0000 UTC m=+1173.803398289" watchObservedRunningTime="2025-12-05 20:31:15.00085541 +0000 UTC m=+1173.812071519" Dec 05 20:31:15 crc kubenswrapper[4904]: I1205 20:31:15.103046 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-6kxzb"] Dec 05 20:31:15 crc kubenswrapper[4904]: W1205 20:31:15.108563 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c5a51a5_3628_495b_a5c2_f29846ae9eb4.slice/crio-3b170722dd2131cbb6c97788d8d290c3d61b9ed81276b3da3641582b93607893 WatchSource:0}: Error finding container 3b170722dd2131cbb6c97788d8d290c3d61b9ed81276b3da3641582b93607893: Status 404 returned error can't find the container with id 3b170722dd2131cbb6c97788d8d290c3d61b9ed81276b3da3641582b93607893 Dec 05 20:31:15 crc kubenswrapper[4904]: I1205 20:31:15.248242 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-e624-account-create-update-x5jhz"] Dec 05 20:31:15 crc kubenswrapper[4904]: W1205 20:31:15.249659 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda294d371_9e50_48f4_9774_bfb2f8014ea6.slice/crio-e9d3c7491e0a5fdbbdef52cee7637823c2e5fa3b591bb7c22f7db009339dae55 WatchSource:0}: Error finding container e9d3c7491e0a5fdbbdef52cee7637823c2e5fa3b591bb7c22f7db009339dae55: Status 404 returned error can't find the container with id e9d3c7491e0a5fdbbdef52cee7637823c2e5fa3b591bb7c22f7db009339dae55 Dec 05 20:31:15 crc kubenswrapper[4904]: I1205 20:31:15.981694 4904 generic.go:334] "Generic (PLEG): container finished" podID="7c5a51a5-3628-495b-a5c2-f29846ae9eb4" containerID="064a0fffb699f793417101d84460fa4df252d56d9f87f67532267d3d0adbade9" exitCode=0 Dec 05 20:31:15 crc kubenswrapper[4904]: I1205 20:31:15.981813 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6kxzb" event={"ID":"7c5a51a5-3628-495b-a5c2-f29846ae9eb4","Type":"ContainerDied","Data":"064a0fffb699f793417101d84460fa4df252d56d9f87f67532267d3d0adbade9"} Dec 05 20:31:15 crc kubenswrapper[4904]: I1205 20:31:15.982192 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6kxzb" event={"ID":"7c5a51a5-3628-495b-a5c2-f29846ae9eb4","Type":"ContainerStarted","Data":"3b170722dd2131cbb6c97788d8d290c3d61b9ed81276b3da3641582b93607893"} Dec 05 20:31:15 crc kubenswrapper[4904]: I1205 20:31:15.987151 4904 generic.go:334] "Generic (PLEG): container finished" podID="a294d371-9e50-48f4-9774-bfb2f8014ea6" containerID="74c92f7975ed00a4276d5fb9fa6535e661fa9f9e875ee6fc1892a0ff240b406c" exitCode=0 Dec 05 20:31:15 crc kubenswrapper[4904]: I1205 20:31:15.987258 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-e624-account-create-update-x5jhz" event={"ID":"a294d371-9e50-48f4-9774-bfb2f8014ea6","Type":"ContainerDied","Data":"74c92f7975ed00a4276d5fb9fa6535e661fa9f9e875ee6fc1892a0ff240b406c"} Dec 05 20:31:15 crc kubenswrapper[4904]: I1205 20:31:15.987299 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-e624-account-create-update-x5jhz" event={"ID":"a294d371-9e50-48f4-9774-bfb2f8014ea6","Type":"ContainerStarted","Data":"e9d3c7491e0a5fdbbdef52cee7637823c2e5fa3b591bb7c22f7db009339dae55"} Dec 05 20:31:15 crc kubenswrapper[4904]: I1205 20:31:15.991677 4904 generic.go:334] "Generic (PLEG): container finished" 
podID="d0e2f8b9-44dd-4a78-9ce8-8e9f401be67e" containerID="4a36c3a7b160a8b5550aed14e2a223314e9fb58c9017b32feabd2fad3ebbc524" exitCode=0 Dec 05 20:31:15 crc kubenswrapper[4904]: I1205 20:31:15.991732 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-pmt5n" event={"ID":"d0e2f8b9-44dd-4a78-9ce8-8e9f401be67e","Type":"ContainerDied","Data":"4a36c3a7b160a8b5550aed14e2a223314e9fb58c9017b32feabd2fad3ebbc524"} Dec 05 20:31:15 crc kubenswrapper[4904]: I1205 20:31:15.997110 4904 generic.go:334] "Generic (PLEG): container finished" podID="8fc955d4-a981-42e4-b327-558d95c0a9c0" containerID="45576a418e493916e0c8affdd647f8bc9548341ad9c5f68a7ec2631889949e6d" exitCode=0 Dec 05 20:31:15 crc kubenswrapper[4904]: I1205 20:31:15.997184 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6a25-account-create-update-tn8bx" event={"ID":"8fc955d4-a981-42e4-b327-558d95c0a9c0","Type":"ContainerDied","Data":"45576a418e493916e0c8affdd647f8bc9548341ad9c5f68a7ec2631889949e6d"} Dec 05 20:31:16 crc kubenswrapper[4904]: I1205 20:31:16.081908 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/be5ed3b2-bc48-4865-ade5-f7c2e379a1ea-etc-swift\") pod \"swift-storage-0\" (UID: \"be5ed3b2-bc48-4865-ade5-f7c2e379a1ea\") " pod="openstack/swift-storage-0" Dec 05 20:31:16 crc kubenswrapper[4904]: E1205 20:31:16.082296 4904 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 20:31:16 crc kubenswrapper[4904]: E1205 20:31:16.082338 4904 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 20:31:16 crc kubenswrapper[4904]: E1205 20:31:16.082427 4904 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/be5ed3b2-bc48-4865-ade5-f7c2e379a1ea-etc-swift podName:be5ed3b2-bc48-4865-ade5-f7c2e379a1ea nodeName:}" failed. No retries permitted until 2025-12-05 20:31:24.082396704 +0000 UTC m=+1182.893612853 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/be5ed3b2-bc48-4865-ade5-f7c2e379a1ea-etc-swift") pod "swift-storage-0" (UID: "be5ed3b2-bc48-4865-ade5-f7c2e379a1ea") : configmap "swift-ring-files" not found Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.068372 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-create-bg2sh"] Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.070227 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-bg2sh" Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.084141 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-bg2sh"] Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.188158 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-7ffc-account-create-update-wbqpp"] Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.189686 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-7ffc-account-create-update-wbqpp" Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.191989 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-db-secret" Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.197896 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4852903-4c6d-47bf-a507-0c1f700d3470-operator-scripts\") pod \"watcher-db-create-bg2sh\" (UID: \"c4852903-4c6d-47bf-a507-0c1f700d3470\") " pod="openstack/watcher-db-create-bg2sh" Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.197998 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lws54\" (UniqueName: \"kubernetes.io/projected/c4852903-4c6d-47bf-a507-0c1f700d3470-kube-api-access-lws54\") pod \"watcher-db-create-bg2sh\" (UID: \"c4852903-4c6d-47bf-a507-0c1f700d3470\") " pod="openstack/watcher-db-create-bg2sh" Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.204036 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-7ffc-account-create-update-wbqpp"] Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.301992 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t227\" (UniqueName: \"kubernetes.io/projected/346e1889-b00c-462f-8b7f-dbcbc159f6ae-kube-api-access-8t227\") pod \"watcher-7ffc-account-create-update-wbqpp\" (UID: \"346e1889-b00c-462f-8b7f-dbcbc159f6ae\") " pod="openstack/watcher-7ffc-account-create-update-wbqpp" Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.302116 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4852903-4c6d-47bf-a507-0c1f700d3470-operator-scripts\") pod \"watcher-db-create-bg2sh\" (UID: \"c4852903-4c6d-47bf-a507-0c1f700d3470\") " pod="openstack/watcher-db-create-bg2sh" Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.302152 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/346e1889-b00c-462f-8b7f-dbcbc159f6ae-operator-scripts\") pod \"watcher-7ffc-account-create-update-wbqpp\" (UID: \"346e1889-b00c-462f-8b7f-dbcbc159f6ae\") " pod="openstack/watcher-7ffc-account-create-update-wbqpp" Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.302192 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lws54\" (UniqueName: \"kubernetes.io/projected/c4852903-4c6d-47bf-a507-0c1f700d3470-kube-api-access-lws54\") pod \"watcher-db-create-bg2sh\" (UID: \"c4852903-4c6d-47bf-a507-0c1f700d3470\") " pod="openstack/watcher-db-create-bg2sh" Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.303329 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4852903-4c6d-47bf-a507-0c1f700d3470-operator-scripts\") pod \"watcher-db-create-bg2sh\" (UID: \"c4852903-4c6d-47bf-a507-0c1f700d3470\") " pod="openstack/watcher-db-create-bg2sh" Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.325552 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lws54\" (UniqueName: \"kubernetes.io/projected/c4852903-4c6d-47bf-a507-0c1f700d3470-kube-api-access-lws54\") pod 
\"watcher-db-create-bg2sh\" (UID: \"c4852903-4c6d-47bf-a507-0c1f700d3470\") " pod="openstack/watcher-db-create-bg2sh" Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.397051 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-bg2sh" Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.404318 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8t227\" (UniqueName: \"kubernetes.io/projected/346e1889-b00c-462f-8b7f-dbcbc159f6ae-kube-api-access-8t227\") pod \"watcher-7ffc-account-create-update-wbqpp\" (UID: \"346e1889-b00c-462f-8b7f-dbcbc159f6ae\") " pod="openstack/watcher-7ffc-account-create-update-wbqpp" Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.404437 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/346e1889-b00c-462f-8b7f-dbcbc159f6ae-operator-scripts\") pod \"watcher-7ffc-account-create-update-wbqpp\" (UID: \"346e1889-b00c-462f-8b7f-dbcbc159f6ae\") " pod="openstack/watcher-7ffc-account-create-update-wbqpp" Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.405500 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/346e1889-b00c-462f-8b7f-dbcbc159f6ae-operator-scripts\") pod \"watcher-7ffc-account-create-update-wbqpp\" (UID: \"346e1889-b00c-462f-8b7f-dbcbc159f6ae\") " pod="openstack/watcher-7ffc-account-create-update-wbqpp" Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.422833 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t227\" (UniqueName: \"kubernetes.io/projected/346e1889-b00c-462f-8b7f-dbcbc159f6ae-kube-api-access-8t227\") pod \"watcher-7ffc-account-create-update-wbqpp\" (UID: \"346e1889-b00c-462f-8b7f-dbcbc159f6ae\") " pod="openstack/watcher-7ffc-account-create-update-wbqpp" Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.506429 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-7ffc-account-create-update-wbqpp" Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.528677 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-pmt5n" Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.543196 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d4d549fbf-m45vw" Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.640005 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d9656c78f-lhs4v"] Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.642659 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d9656c78f-lhs4v" podUID="a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca" containerName="dnsmasq-dns" containerID="cri-o://d44842850ffe4f9dbd70ecda97155459c9466488268eceaf097897b8f9910961" gracePeriod=10 Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.711342 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0e2f8b9-44dd-4a78-9ce8-8e9f401be67e-operator-scripts\") pod \"d0e2f8b9-44dd-4a78-9ce8-8e9f401be67e\" (UID: \"d0e2f8b9-44dd-4a78-9ce8-8e9f401be67e\") " Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.711394 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdps4\" (UniqueName: \"kubernetes.io/projected/d0e2f8b9-44dd-4a78-9ce8-8e9f401be67e-kube-api-access-sdps4\") pod \"d0e2f8b9-44dd-4a78-9ce8-8e9f401be67e\" (UID: \"d0e2f8b9-44dd-4a78-9ce8-8e9f401be67e\") " Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.712833 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0e2f8b9-44dd-4a78-9ce8-8e9f401be67e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d0e2f8b9-44dd-4a78-9ce8-8e9f401be67e" (UID: "d0e2f8b9-44dd-4a78-9ce8-8e9f401be67e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.713610 4904 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0e2f8b9-44dd-4a78-9ce8-8e9f401be67e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.722300 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0e2f8b9-44dd-4a78-9ce8-8e9f401be67e-kube-api-access-sdps4" (OuterVolumeSpecName: "kube-api-access-sdps4") pod "d0e2f8b9-44dd-4a78-9ce8-8e9f401be67e" (UID: "d0e2f8b9-44dd-4a78-9ce8-8e9f401be67e"). InnerVolumeSpecName "kube-api-access-sdps4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.746628 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6a25-account-create-update-tn8bx" Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.752243 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6kxzb" Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.760335 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-e624-account-create-update-x5jhz" Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.814390 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4z9b2\" (UniqueName: \"kubernetes.io/projected/8fc955d4-a981-42e4-b327-558d95c0a9c0-kube-api-access-4z9b2\") pod \"8fc955d4-a981-42e4-b327-558d95c0a9c0\" (UID: \"8fc955d4-a981-42e4-b327-558d95c0a9c0\") " Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.814547 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fc955d4-a981-42e4-b327-558d95c0a9c0-operator-scripts\") pod \"8fc955d4-a981-42e4-b327-558d95c0a9c0\" (UID: \"8fc955d4-a981-42e4-b327-558d95c0a9c0\") " Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.815099 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdps4\" (UniqueName: \"kubernetes.io/projected/d0e2f8b9-44dd-4a78-9ce8-8e9f401be67e-kube-api-access-sdps4\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.815138 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fc955d4-a981-42e4-b327-558d95c0a9c0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8fc955d4-a981-42e4-b327-558d95c0a9c0" (UID: "8fc955d4-a981-42e4-b327-558d95c0a9c0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.817320 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fc955d4-a981-42e4-b327-558d95c0a9c0-kube-api-access-4z9b2" (OuterVolumeSpecName: "kube-api-access-4z9b2") pod "8fc955d4-a981-42e4-b327-558d95c0a9c0" (UID: "8fc955d4-a981-42e4-b327-558d95c0a9c0"). InnerVolumeSpecName "kube-api-access-4z9b2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:31:17 crc kubenswrapper[4904]: E1205 20:31:17.820141 4904 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8ba79e6_1b6a_4dd5_b2c5_8b5d2a7264ca.slice/crio-conmon-d44842850ffe4f9dbd70ecda97155459c9466488268eceaf097897b8f9910961.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8ba79e6_1b6a_4dd5_b2c5_8b5d2a7264ca.slice/crio-d44842850ffe4f9dbd70ecda97155459c9466488268eceaf097897b8f9910961.scope\": RecentStats: unable to find data in memory cache]" Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.916487 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shdls\" (UniqueName: \"kubernetes.io/projected/7c5a51a5-3628-495b-a5c2-f29846ae9eb4-kube-api-access-shdls\") pod \"7c5a51a5-3628-495b-a5c2-f29846ae9eb4\" (UID: \"7c5a51a5-3628-495b-a5c2-f29846ae9eb4\") " Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.916825 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a294d371-9e50-48f4-9774-bfb2f8014ea6-operator-scripts\") pod \"a294d371-9e50-48f4-9774-bfb2f8014ea6\" (UID: \"a294d371-9e50-48f4-9774-bfb2f8014ea6\") " Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.916866 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c5a51a5-3628-495b-a5c2-f29846ae9eb4-operator-scripts\") pod \"7c5a51a5-3628-495b-a5c2-f29846ae9eb4\" (UID: \"7c5a51a5-3628-495b-a5c2-f29846ae9eb4\") " Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.916967 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sm7jf\" (UniqueName: \"kubernetes.io/projected/a294d371-9e50-48f4-9774-bfb2f8014ea6-kube-api-access-sm7jf\") pod \"a294d371-9e50-48f4-9774-bfb2f8014ea6\" (UID: \"a294d371-9e50-48f4-9774-bfb2f8014ea6\") " Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.917321 4904 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fc955d4-a981-42e4-b327-558d95c0a9c0-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.917340 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4z9b2\" (UniqueName: \"kubernetes.io/projected/8fc955d4-a981-42e4-b327-558d95c0a9c0-kube-api-access-4z9b2\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.917460 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c5a51a5-3628-495b-a5c2-f29846ae9eb4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7c5a51a5-3628-495b-a5c2-f29846ae9eb4" (UID: "7c5a51a5-3628-495b-a5c2-f29846ae9eb4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.917548 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a294d371-9e50-48f4-9774-bfb2f8014ea6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a294d371-9e50-48f4-9774-bfb2f8014ea6" (UID: "a294d371-9e50-48f4-9774-bfb2f8014ea6"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.920144 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c5a51a5-3628-495b-a5c2-f29846ae9eb4-kube-api-access-shdls" (OuterVolumeSpecName: "kube-api-access-shdls") pod "7c5a51a5-3628-495b-a5c2-f29846ae9eb4" (UID: "7c5a51a5-3628-495b-a5c2-f29846ae9eb4"). InnerVolumeSpecName "kube-api-access-shdls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:31:17 crc kubenswrapper[4904]: I1205 20:31:17.921438 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a294d371-9e50-48f4-9774-bfb2f8014ea6-kube-api-access-sm7jf" (OuterVolumeSpecName: "kube-api-access-sm7jf") pod "a294d371-9e50-48f4-9774-bfb2f8014ea6" (UID: "a294d371-9e50-48f4-9774-bfb2f8014ea6"). InnerVolumeSpecName "kube-api-access-sm7jf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:31:18 crc kubenswrapper[4904]: I1205 20:31:18.017196 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-e624-account-create-update-x5jhz" event={"ID":"a294d371-9e50-48f4-9774-bfb2f8014ea6","Type":"ContainerDied","Data":"e9d3c7491e0a5fdbbdef52cee7637823c2e5fa3b591bb7c22f7db009339dae55"} Dec 05 20:31:18 crc kubenswrapper[4904]: I1205 20:31:18.017237 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9d3c7491e0a5fdbbdef52cee7637823c2e5fa3b591bb7c22f7db009339dae55" Dec 05 20:31:18 crc kubenswrapper[4904]: I1205 20:31:18.017289 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-e624-account-create-update-x5jhz" Dec 05 20:31:18 crc kubenswrapper[4904]: I1205 20:31:18.023178 4904 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a294d371-9e50-48f4-9774-bfb2f8014ea6-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:18 crc kubenswrapper[4904]: I1205 20:31:18.023393 4904 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c5a51a5-3628-495b-a5c2-f29846ae9eb4-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:18 crc kubenswrapper[4904]: I1205 20:31:18.023407 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sm7jf\" (UniqueName: \"kubernetes.io/projected/a294d371-9e50-48f4-9774-bfb2f8014ea6-kube-api-access-sm7jf\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:18 crc kubenswrapper[4904]: I1205 20:31:18.023420 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shdls\" (UniqueName: \"kubernetes.io/projected/7c5a51a5-3628-495b-a5c2-f29846ae9eb4-kube-api-access-shdls\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:18 crc kubenswrapper[4904]: I1205 20:31:18.023861 4904 generic.go:334] "Generic (PLEG): container finished" podID="a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca" containerID="d44842850ffe4f9dbd70ecda97155459c9466488268eceaf097897b8f9910961" exitCode=0 Dec 05 20:31:18 crc kubenswrapper[4904]: I1205 20:31:18.023932 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d9656c78f-lhs4v" event={"ID":"a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca","Type":"ContainerDied","Data":"d44842850ffe4f9dbd70ecda97155459c9466488268eceaf097897b8f9910961"} Dec 05 20:31:18 crc kubenswrapper[4904]: I1205 20:31:18.040310 4904 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/keystone-db-create-pmt5n" event={"ID":"d0e2f8b9-44dd-4a78-9ce8-8e9f401be67e","Type":"ContainerDied","Data":"e683c88e00a7477b0f9087f0bfdef23e25a3caa658b49f75768c6d46e992d563"} Dec 05 20:31:18 crc kubenswrapper[4904]: I1205 20:31:18.040356 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e683c88e00a7477b0f9087f0bfdef23e25a3caa658b49f75768c6d46e992d563" Dec 05 20:31:18 crc kubenswrapper[4904]: I1205 20:31:18.040417 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-pmt5n" Dec 05 20:31:18 crc kubenswrapper[4904]: I1205 20:31:18.044669 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6a25-account-create-update-tn8bx" event={"ID":"8fc955d4-a981-42e4-b327-558d95c0a9c0","Type":"ContainerDied","Data":"1185b70098ce2f182083b6c65c7ca763a0f0286b7fa8f09c250df2f8ec5b25dc"} Dec 05 20:31:18 crc kubenswrapper[4904]: I1205 20:31:18.044711 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1185b70098ce2f182083b6c65c7ca763a0f0286b7fa8f09c250df2f8ec5b25dc" Dec 05 20:31:18 crc kubenswrapper[4904]: I1205 20:31:18.044772 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6a25-account-create-update-tn8bx" Dec 05 20:31:18 crc kubenswrapper[4904]: I1205 20:31:18.057264 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6kxzb" event={"ID":"7c5a51a5-3628-495b-a5c2-f29846ae9eb4","Type":"ContainerDied","Data":"3b170722dd2131cbb6c97788d8d290c3d61b9ed81276b3da3641582b93607893"} Dec 05 20:31:18 crc kubenswrapper[4904]: I1205 20:31:18.057297 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b170722dd2131cbb6c97788d8d290c3d61b9ed81276b3da3641582b93607893" Dec 05 20:31:18 crc kubenswrapper[4904]: I1205 20:31:18.057344 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6kxzb" Dec 05 20:31:18 crc kubenswrapper[4904]: I1205 20:31:18.060283 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-bg2sh"] Dec 05 20:31:18 crc kubenswrapper[4904]: I1205 20:31:18.155983 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-7ffc-account-create-update-wbqpp"] Dec 05 20:31:18 crc kubenswrapper[4904]: W1205 20:31:18.183734 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod346e1889_b00c_462f_8b7f_dbcbc159f6ae.slice/crio-2736a5010321db30573505b4a44149173cb759b9f307c7e6603f70ce0936fafd WatchSource:0}: Error finding container 2736a5010321db30573505b4a44149173cb759b9f307c7e6603f70ce0936fafd: Status 404 returned error can't find the container with id 2736a5010321db30573505b4a44149173cb759b9f307c7e6603f70ce0936fafd Dec 05 20:31:18 crc kubenswrapper[4904]: I1205 20:31:18.195779 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d9656c78f-lhs4v" Dec 05 20:31:18 crc kubenswrapper[4904]: I1205 20:31:18.335311 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca-dns-svc\") pod \"a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca\" (UID: \"a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca\") " Dec 05 20:31:18 crc kubenswrapper[4904]: I1205 20:31:18.335767 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9t85\" (UniqueName: \"kubernetes.io/projected/a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca-kube-api-access-n9t85\") pod \"a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca\" (UID: \"a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca\") " Dec 05 20:31:18 crc kubenswrapper[4904]: I1205 20:31:18.335794 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca-config\") pod \"a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca\" (UID: \"a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca\") " Dec 05 20:31:18 crc kubenswrapper[4904]: I1205 20:31:18.340944 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca-kube-api-access-n9t85" (OuterVolumeSpecName: "kube-api-access-n9t85") pod "a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca" (UID: "a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca"). InnerVolumeSpecName "kube-api-access-n9t85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:31:18 crc kubenswrapper[4904]: I1205 20:31:18.387615 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca" (UID: "a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:31:18 crc kubenswrapper[4904]: I1205 20:31:18.391885 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca-config" (OuterVolumeSpecName: "config") pod "a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca" (UID: "a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:31:18 crc kubenswrapper[4904]: I1205 20:31:18.438000 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9t85\" (UniqueName: \"kubernetes.io/projected/a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca-kube-api-access-n9t85\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:18 crc kubenswrapper[4904]: I1205 20:31:18.438035 4904 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:18 crc kubenswrapper[4904]: I1205 20:31:18.438043 4904 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:18 crc kubenswrapper[4904]: I1205 20:31:18.665395 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 05 20:31:18 crc kubenswrapper[4904]: I1205 20:31:18.665484 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 05 20:31:18 crc kubenswrapper[4904]: I1205 20:31:18.667825 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 05 20:31:19 crc kubenswrapper[4904]: I1205 20:31:19.069903 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d9656c78f-lhs4v" event={"ID":"a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca","Type":"ContainerDied","Data":"d32dd19148976f7b6767803164ee2ccd3ef21cb083f0ba5f62355241f224b868"} Dec 05 20:31:19 crc kubenswrapper[4904]: I1205 20:31:19.070444 4904 scope.go:117] "RemoveContainer" containerID="d44842850ffe4f9dbd70ecda97155459c9466488268eceaf097897b8f9910961" Dec 05 20:31:19 crc kubenswrapper[4904]: I1205 20:31:19.069933 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d9656c78f-lhs4v" Dec 05 20:31:19 crc kubenswrapper[4904]: I1205 20:31:19.072455 4904 generic.go:334] "Generic (PLEG): container finished" podID="346e1889-b00c-462f-8b7f-dbcbc159f6ae" containerID="7c29d4eeb5a10031257ed9b236349bc927411c71caa97fc8e0a4f7875f7ef270" exitCode=0 Dec 05 20:31:19 crc kubenswrapper[4904]: I1205 20:31:19.072511 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-7ffc-account-create-update-wbqpp" event={"ID":"346e1889-b00c-462f-8b7f-dbcbc159f6ae","Type":"ContainerDied","Data":"7c29d4eeb5a10031257ed9b236349bc927411c71caa97fc8e0a4f7875f7ef270"} Dec 05 20:31:19 crc kubenswrapper[4904]: I1205 20:31:19.072577 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-7ffc-account-create-update-wbqpp" event={"ID":"346e1889-b00c-462f-8b7f-dbcbc159f6ae","Type":"ContainerStarted","Data":"2736a5010321db30573505b4a44149173cb759b9f307c7e6603f70ce0936fafd"} Dec 05 20:31:19 crc kubenswrapper[4904]: I1205 20:31:19.074919 4904 generic.go:334] "Generic (PLEG): container finished" podID="c4852903-4c6d-47bf-a507-0c1f700d3470" containerID="9111b5913060486a686f0e686fb8e869fc80f8121c854a0c00b4d9dd70c0de4c" exitCode=0 Dec 05 20:31:19 crc kubenswrapper[4904]: I1205 20:31:19.074973 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-bg2sh" event={"ID":"c4852903-4c6d-47bf-a507-0c1f700d3470","Type":"ContainerDied","Data":"9111b5913060486a686f0e686fb8e869fc80f8121c854a0c00b4d9dd70c0de4c"} Dec 05 20:31:19 crc kubenswrapper[4904]: I1205 20:31:19.075096 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-bg2sh" event={"ID":"c4852903-4c6d-47bf-a507-0c1f700d3470","Type":"ContainerStarted","Data":"7f47f3861e70b559cb668c880ea4335b950928cece21875d222c51afb3e60746"} Dec 05 20:31:19 crc kubenswrapper[4904]: I1205 20:31:19.076535 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 05 20:31:19 crc kubenswrapper[4904]: I1205 20:31:19.105359 4904 scope.go:117] "RemoveContainer" containerID="5ac5f16656517316de02f88acea225273b85b2902b89b16f7c05a75a542c2c57" Dec 05 20:31:19 crc kubenswrapper[4904]: I1205 20:31:19.158595 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d9656c78f-lhs4v"] Dec 05 20:31:19 crc kubenswrapper[4904]: I1205 20:31:19.167987 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d9656c78f-lhs4v"] Dec 05 20:31:19 crc kubenswrapper[4904]: I1205 20:31:19.690710 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca" path="/var/lib/kubelet/pods/a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca/volumes" Dec 05 20:31:20 crc kubenswrapper[4904]: I1205 20:31:20.523462 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-7ffc-account-create-update-wbqpp" Dec 05 20:31:20 crc kubenswrapper[4904]: I1205 20:31:20.531468 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-create-bg2sh"
Dec 05 20:31:20 crc kubenswrapper[4904]: I1205 20:31:20.675539 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4852903-4c6d-47bf-a507-0c1f700d3470-operator-scripts\") pod \"c4852903-4c6d-47bf-a507-0c1f700d3470\" (UID: \"c4852903-4c6d-47bf-a507-0c1f700d3470\") "
Dec 05 20:31:20 crc kubenswrapper[4904]: I1205 20:31:20.675634 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8t227\" (UniqueName: \"kubernetes.io/projected/346e1889-b00c-462f-8b7f-dbcbc159f6ae-kube-api-access-8t227\") pod \"346e1889-b00c-462f-8b7f-dbcbc159f6ae\" (UID: \"346e1889-b00c-462f-8b7f-dbcbc159f6ae\") "
Dec 05 20:31:20 crc kubenswrapper[4904]: I1205 20:31:20.675657 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lws54\" (UniqueName: \"kubernetes.io/projected/c4852903-4c6d-47bf-a507-0c1f700d3470-kube-api-access-lws54\") pod \"c4852903-4c6d-47bf-a507-0c1f700d3470\" (UID: \"c4852903-4c6d-47bf-a507-0c1f700d3470\") "
Dec 05 20:31:20 crc kubenswrapper[4904]: I1205 20:31:20.675780 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/346e1889-b00c-462f-8b7f-dbcbc159f6ae-operator-scripts\") pod \"346e1889-b00c-462f-8b7f-dbcbc159f6ae\" (UID: \"346e1889-b00c-462f-8b7f-dbcbc159f6ae\") "
Dec 05 20:31:20 crc kubenswrapper[4904]: I1205 20:31:20.675987 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4852903-4c6d-47bf-a507-0c1f700d3470-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c4852903-4c6d-47bf-a507-0c1f700d3470" (UID: "c4852903-4c6d-47bf-a507-0c1f700d3470"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 20:31:20 crc kubenswrapper[4904]: I1205 20:31:20.676111 4904 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4852903-4c6d-47bf-a507-0c1f700d3470-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 20:31:20 crc kubenswrapper[4904]: I1205 20:31:20.676518 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/346e1889-b00c-462f-8b7f-dbcbc159f6ae-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "346e1889-b00c-462f-8b7f-dbcbc159f6ae" (UID: "346e1889-b00c-462f-8b7f-dbcbc159f6ae"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 20:31:20 crc kubenswrapper[4904]: I1205 20:31:20.681636 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4852903-4c6d-47bf-a507-0c1f700d3470-kube-api-access-lws54" (OuterVolumeSpecName: "kube-api-access-lws54") pod "c4852903-4c6d-47bf-a507-0c1f700d3470" (UID: "c4852903-4c6d-47bf-a507-0c1f700d3470"). InnerVolumeSpecName "kube-api-access-lws54". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:31:20 crc kubenswrapper[4904]: I1205 20:31:20.697335 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/346e1889-b00c-462f-8b7f-dbcbc159f6ae-kube-api-access-8t227" (OuterVolumeSpecName: "kube-api-access-8t227") pod "346e1889-b00c-462f-8b7f-dbcbc159f6ae" (UID: "346e1889-b00c-462f-8b7f-dbcbc159f6ae"). InnerVolumeSpecName "kube-api-access-8t227". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:31:20 crc kubenswrapper[4904]: I1205 20:31:20.777780 4904 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/346e1889-b00c-462f-8b7f-dbcbc159f6ae-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 20:31:20 crc kubenswrapper[4904]: I1205 20:31:20.777816 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lws54\" (UniqueName: \"kubernetes.io/projected/c4852903-4c6d-47bf-a507-0c1f700d3470-kube-api-access-lws54\") on node \"crc\" DevicePath \"\""
Dec 05 20:31:20 crc kubenswrapper[4904]: I1205 20:31:20.777827 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8t227\" (UniqueName: \"kubernetes.io/projected/346e1889-b00c-462f-8b7f-dbcbc159f6ae-kube-api-access-8t227\") on node \"crc\" DevicePath \"\""
Dec 05 20:31:21 crc kubenswrapper[4904]: I1205 20:31:21.063966 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Dec 05 20:31:21 crc kubenswrapper[4904]: I1205 20:31:21.094515 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-7ffc-account-create-update-wbqpp" event={"ID":"346e1889-b00c-462f-8b7f-dbcbc159f6ae","Type":"ContainerDied","Data":"2736a5010321db30573505b4a44149173cb759b9f307c7e6603f70ce0936fafd"}
Dec 05 20:31:21 crc kubenswrapper[4904]: I1205 20:31:21.094569 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2736a5010321db30573505b4a44149173cb759b9f307c7e6603f70ce0936fafd"
Dec 05 20:31:21 crc kubenswrapper[4904]: I1205 20:31:21.094568 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-7ffc-account-create-update-wbqpp"
Dec 05 20:31:21 crc kubenswrapper[4904]: I1205 20:31:21.096338 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-bg2sh"
Dec 05 20:31:21 crc kubenswrapper[4904]: I1205 20:31:21.096363 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-bg2sh" event={"ID":"c4852903-4c6d-47bf-a507-0c1f700d3470","Type":"ContainerDied","Data":"7f47f3861e70b559cb668c880ea4335b950928cece21875d222c51afb3e60746"}
Dec 05 20:31:21 crc kubenswrapper[4904]: I1205 20:31:21.096416 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f47f3861e70b559cb668c880ea4335b950928cece21875d222c51afb3e60746"
Dec 05 20:31:22 crc kubenswrapper[4904]: I1205 20:31:22.107042 4904 generic.go:334] "Generic (PLEG): container finished" podID="4b80a797-4212-4242-81fd-928045b629cd" containerID="4ed1029d0887f45b89cad83436ce354b990d2e271be387d893f8305d23e56642" exitCode=0
Dec 05 20:31:22 crc kubenswrapper[4904]: I1205 20:31:22.107098 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-9xq46" event={"ID":"4b80a797-4212-4242-81fd-928045b629cd","Type":"ContainerDied","Data":"4ed1029d0887f45b89cad83436ce354b990d2e271be387d893f8305d23e56642"}
Dec 05 20:31:22 crc kubenswrapper[4904]: I1205 20:31:22.107514 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="c241183e-3f0b-4dc2-88d2-ea990c6dc71d" containerName="prometheus" containerID="cri-o://bc6bd55a707cd7c462889940bc35e995374b4c3995de80745ca6bb1ed256e0d0" gracePeriod=600
Dec 05 20:31:22 crc kubenswrapper[4904]: I1205 20:31:22.107578 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="c241183e-3f0b-4dc2-88d2-ea990c6dc71d" containerName="thanos-sidecar" containerID="cri-o://724c237984bc633fedf69b88fe8220d15d3dbd1aa807fdf1295dd0b034d151eb" gracePeriod=600
Dec 05 20:31:22 crc kubenswrapper[4904]: I1205 20:31:22.107612 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="c241183e-3f0b-4dc2-88d2-ea990c6dc71d" containerName="config-reloader" containerID="cri-o://9bd6982949de74604d2b98a2e96167ee2081d2881e85a4cf5ec787dfedaa9751" gracePeriod=600
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.133624 4904 generic.go:334] "Generic (PLEG): container finished" podID="c241183e-3f0b-4dc2-88d2-ea990c6dc71d" containerID="724c237984bc633fedf69b88fe8220d15d3dbd1aa807fdf1295dd0b034d151eb" exitCode=0
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.133894 4904 generic.go:334] "Generic (PLEG): container finished" podID="c241183e-3f0b-4dc2-88d2-ea990c6dc71d" containerID="9bd6982949de74604d2b98a2e96167ee2081d2881e85a4cf5ec787dfedaa9751" exitCode=0
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.133902 4904 generic.go:334] "Generic (PLEG): container finished" podID="c241183e-3f0b-4dc2-88d2-ea990c6dc71d" containerID="bc6bd55a707cd7c462889940bc35e995374b4c3995de80745ca6bb1ed256e0d0" exitCode=0
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.133724 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c241183e-3f0b-4dc2-88d2-ea990c6dc71d","Type":"ContainerDied","Data":"724c237984bc633fedf69b88fe8220d15d3dbd1aa807fdf1295dd0b034d151eb"}
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.133952 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c241183e-3f0b-4dc2-88d2-ea990c6dc71d","Type":"ContainerDied","Data":"9bd6982949de74604d2b98a2e96167ee2081d2881e85a4cf5ec787dfedaa9751"}
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.133977 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c241183e-3f0b-4dc2-88d2-ea990c6dc71d","Type":"ContainerDied","Data":"bc6bd55a707cd7c462889940bc35e995374b4c3995de80745ca6bb1ed256e0d0"}
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.133995 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c241183e-3f0b-4dc2-88d2-ea990c6dc71d","Type":"ContainerDied","Data":"425abf0d293523494d0e0a7f6d1c19228a2d8e562a71abe31a9351619b7a75f0"}
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.134015 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="425abf0d293523494d0e0a7f6d1c19228a2d8e562a71abe31a9351619b7a75f0"
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.149090 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.229884 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c241183e-3f0b-4dc2-88d2-ea990c6dc71d-tls-assets\") pod \"c241183e-3f0b-4dc2-88d2-ea990c6dc71d\" (UID: \"c241183e-3f0b-4dc2-88d2-ea990c6dc71d\") "
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.230111 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c241183e-3f0b-4dc2-88d2-ea990c6dc71d-web-config\") pod \"c241183e-3f0b-4dc2-88d2-ea990c6dc71d\" (UID: \"c241183e-3f0b-4dc2-88d2-ea990c6dc71d\") "
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.230167 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c241183e-3f0b-4dc2-88d2-ea990c6dc71d-config-out\") pod \"c241183e-3f0b-4dc2-88d2-ea990c6dc71d\" (UID: \"c241183e-3f0b-4dc2-88d2-ea990c6dc71d\") "
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.230233 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c241183e-3f0b-4dc2-88d2-ea990c6dc71d-config\") pod \"c241183e-3f0b-4dc2-88d2-ea990c6dc71d\" (UID: \"c241183e-3f0b-4dc2-88d2-ea990c6dc71d\") "
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.230298 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c241183e-3f0b-4dc2-88d2-ea990c6dc71d-thanos-prometheus-http-client-file\") pod \"c241183e-3f0b-4dc2-88d2-ea990c6dc71d\" (UID: \"c241183e-3f0b-4dc2-88d2-ea990c6dc71d\") "
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.230453 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b90305fc-6b85-4b0e-958e-59ae1d530558\") pod \"c241183e-3f0b-4dc2-88d2-ea990c6dc71d\" (UID: \"c241183e-3f0b-4dc2-88d2-ea990c6dc71d\") "
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.230494 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c241183e-3f0b-4dc2-88d2-ea990c6dc71d-prometheus-metric-storage-rulefiles-0\") pod \"c241183e-3f0b-4dc2-88d2-ea990c6dc71d\" (UID: \"c241183e-3f0b-4dc2-88d2-ea990c6dc71d\") "
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.230538 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6qcr\" (UniqueName: \"kubernetes.io/projected/c241183e-3f0b-4dc2-88d2-ea990c6dc71d-kube-api-access-m6qcr\") pod \"c241183e-3f0b-4dc2-88d2-ea990c6dc71d\" (UID: \"c241183e-3f0b-4dc2-88d2-ea990c6dc71d\") "
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.232184 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c241183e-3f0b-4dc2-88d2-ea990c6dc71d-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "c241183e-3f0b-4dc2-88d2-ea990c6dc71d" (UID: "c241183e-3f0b-4dc2-88d2-ea990c6dc71d"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.236723 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c241183e-3f0b-4dc2-88d2-ea990c6dc71d-config" (OuterVolumeSpecName: "config") pod "c241183e-3f0b-4dc2-88d2-ea990c6dc71d" (UID: "c241183e-3f0b-4dc2-88d2-ea990c6dc71d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.238704 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c241183e-3f0b-4dc2-88d2-ea990c6dc71d-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "c241183e-3f0b-4dc2-88d2-ea990c6dc71d" (UID: "c241183e-3f0b-4dc2-88d2-ea990c6dc71d"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.246520 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c241183e-3f0b-4dc2-88d2-ea990c6dc71d-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "c241183e-3f0b-4dc2-88d2-ea990c6dc71d" (UID: "c241183e-3f0b-4dc2-88d2-ea990c6dc71d"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.249122 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c241183e-3f0b-4dc2-88d2-ea990c6dc71d-kube-api-access-m6qcr" (OuterVolumeSpecName: "kube-api-access-m6qcr") pod "c241183e-3f0b-4dc2-88d2-ea990c6dc71d" (UID: "c241183e-3f0b-4dc2-88d2-ea990c6dc71d"). InnerVolumeSpecName "kube-api-access-m6qcr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.254255 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b90305fc-6b85-4b0e-958e-59ae1d530558" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "c241183e-3f0b-4dc2-88d2-ea990c6dc71d" (UID: "c241183e-3f0b-4dc2-88d2-ea990c6dc71d"). InnerVolumeSpecName "pvc-b90305fc-6b85-4b0e-958e-59ae1d530558". PluginName "kubernetes.io/csi", VolumeGidValue ""
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.263197 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c241183e-3f0b-4dc2-88d2-ea990c6dc71d-config-out" (OuterVolumeSpecName: "config-out") pod "c241183e-3f0b-4dc2-88d2-ea990c6dc71d" (UID: "c241183e-3f0b-4dc2-88d2-ea990c6dc71d"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.264695 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c241183e-3f0b-4dc2-88d2-ea990c6dc71d-web-config" (OuterVolumeSpecName: "web-config") pod "c241183e-3f0b-4dc2-88d2-ea990c6dc71d" (UID: "c241183e-3f0b-4dc2-88d2-ea990c6dc71d"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.332793 4904 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c241183e-3f0b-4dc2-88d2-ea990c6dc71d-config\") on node \"crc\" DevicePath \"\""
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.332823 4904 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c241183e-3f0b-4dc2-88d2-ea990c6dc71d-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\""
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.332877 4904 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-b90305fc-6b85-4b0e-958e-59ae1d530558\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b90305fc-6b85-4b0e-958e-59ae1d530558\") on node \"crc\" "
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.332890 4904 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c241183e-3f0b-4dc2-88d2-ea990c6dc71d-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\""
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.332899 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6qcr\" (UniqueName: \"kubernetes.io/projected/c241183e-3f0b-4dc2-88d2-ea990c6dc71d-kube-api-access-m6qcr\") on node \"crc\" DevicePath \"\""
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.332908 4904 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c241183e-3f0b-4dc2-88d2-ea990c6dc71d-tls-assets\") on node \"crc\" DevicePath \"\""
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.332916 4904 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c241183e-3f0b-4dc2-88d2-ea990c6dc71d-web-config\") on node \"crc\" DevicePath \"\""
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.332923 4904 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c241183e-3f0b-4dc2-88d2-ea990c6dc71d-config-out\") on node \"crc\" DevicePath \"\""
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.352655 4904 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.352789 4904 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-b90305fc-6b85-4b0e-958e-59ae1d530558" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b90305fc-6b85-4b0e-958e-59ae1d530558") on node "crc"
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.435077 4904 reconciler_common.go:293] "Volume detached for volume \"pvc-b90305fc-6b85-4b0e-958e-59ae1d530558\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b90305fc-6b85-4b0e-958e-59ae1d530558\") on node \"crc\" DevicePath \"\""
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.517031 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-9xq46"
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.638532 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b80a797-4212-4242-81fd-928045b629cd-scripts\") pod \"4b80a797-4212-4242-81fd-928045b629cd\" (UID: \"4b80a797-4212-4242-81fd-928045b629cd\") "
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.638606 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4b80a797-4212-4242-81fd-928045b629cd-ring-data-devices\") pod \"4b80a797-4212-4242-81fd-928045b629cd\" (UID: \"4b80a797-4212-4242-81fd-928045b629cd\") "
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.638690 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwp9b\" (UniqueName: \"kubernetes.io/projected/4b80a797-4212-4242-81fd-928045b629cd-kube-api-access-fwp9b\") pod \"4b80a797-4212-4242-81fd-928045b629cd\" (UID: \"4b80a797-4212-4242-81fd-928045b629cd\") "
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.638746 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4b80a797-4212-4242-81fd-928045b629cd-swiftconf\") pod \"4b80a797-4212-4242-81fd-928045b629cd\" (UID: \"4b80a797-4212-4242-81fd-928045b629cd\") "
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.638773 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b80a797-4212-4242-81fd-928045b629cd-combined-ca-bundle\") pod \"4b80a797-4212-4242-81fd-928045b629cd\" (UID: \"4b80a797-4212-4242-81fd-928045b629cd\") "
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.638797 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4b80a797-4212-4242-81fd-928045b629cd-etc-swift\") pod \"4b80a797-4212-4242-81fd-928045b629cd\" (UID: \"4b80a797-4212-4242-81fd-928045b629cd\") "
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.638834 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4b80a797-4212-4242-81fd-928045b629cd-dispersionconf\") pod \"4b80a797-4212-4242-81fd-928045b629cd\" (UID: \"4b80a797-4212-4242-81fd-928045b629cd\") "
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.639996 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b80a797-4212-4242-81fd-928045b629cd-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "4b80a797-4212-4242-81fd-928045b629cd" (UID: "4b80a797-4212-4242-81fd-928045b629cd"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.640731 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b80a797-4212-4242-81fd-928045b629cd-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "4b80a797-4212-4242-81fd-928045b629cd" (UID: "4b80a797-4212-4242-81fd-928045b629cd"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.644368 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b80a797-4212-4242-81fd-928045b629cd-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "4b80a797-4212-4242-81fd-928045b629cd" (UID: "4b80a797-4212-4242-81fd-928045b629cd"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.656734 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b80a797-4212-4242-81fd-928045b629cd-kube-api-access-fwp9b" (OuterVolumeSpecName: "kube-api-access-fwp9b") pod "4b80a797-4212-4242-81fd-928045b629cd" (UID: "4b80a797-4212-4242-81fd-928045b629cd"). InnerVolumeSpecName "kube-api-access-fwp9b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.658512 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b80a797-4212-4242-81fd-928045b629cd-scripts" (OuterVolumeSpecName: "scripts") pod "4b80a797-4212-4242-81fd-928045b629cd" (UID: "4b80a797-4212-4242-81fd-928045b629cd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.659686 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b80a797-4212-4242-81fd-928045b629cd-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "4b80a797-4212-4242-81fd-928045b629cd" (UID: "4b80a797-4212-4242-81fd-928045b629cd"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.660032 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b80a797-4212-4242-81fd-928045b629cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b80a797-4212-4242-81fd-928045b629cd" (UID: "4b80a797-4212-4242-81fd-928045b629cd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.740972 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwp9b\" (UniqueName: \"kubernetes.io/projected/4b80a797-4212-4242-81fd-928045b629cd-kube-api-access-fwp9b\") on node \"crc\" DevicePath \"\""
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.741006 4904 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4b80a797-4212-4242-81fd-928045b629cd-swiftconf\") on node \"crc\" DevicePath \"\""
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.741015 4904 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b80a797-4212-4242-81fd-928045b629cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.741023 4904 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4b80a797-4212-4242-81fd-928045b629cd-etc-swift\") on node \"crc\" DevicePath \"\""
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.741032 4904 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4b80a797-4212-4242-81fd-928045b629cd-dispersionconf\") on node \"crc\" DevicePath \"\""
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.741039 4904 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b80a797-4212-4242-81fd-928045b629cd-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 20:31:23 crc kubenswrapper[4904]: I1205 20:31:23.741047 4904 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4b80a797-4212-4242-81fd-928045b629cd-ring-data-devices\") on node \"crc\" DevicePath \"\""
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.144157 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.144191 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-9xq46"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.144185 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-9xq46" event={"ID":"4b80a797-4212-4242-81fd-928045b629cd","Type":"ContainerDied","Data":"ac9b9bb7daf1831e3e48a83dcb12c614c10a0369140f036d6214d72bbf565933"}
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.144756 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac9b9bb7daf1831e3e48a83dcb12c614c10a0369140f036d6214d72bbf565933"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.147361 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/be5ed3b2-bc48-4865-ade5-f7c2e379a1ea-etc-swift\") pod \"swift-storage-0\" (UID: \"be5ed3b2-bc48-4865-ade5-f7c2e379a1ea\") " pod="openstack/swift-storage-0"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.162165 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/be5ed3b2-bc48-4865-ade5-f7c2e379a1ea-etc-swift\") pod \"swift-storage-0\" (UID: \"be5ed3b2-bc48-4865-ade5-f7c2e379a1ea\") " pod="openstack/swift-storage-0"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.181132 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.192111 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.205436 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.214731 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"]
Dec 05 20:31:24 crc kubenswrapper[4904]: E1205 20:31:24.215164 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c5a51a5-3628-495b-a5c2-f29846ae9eb4" containerName="mariadb-database-create"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.215196 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c5a51a5-3628-495b-a5c2-f29846ae9eb4" containerName="mariadb-database-create"
Dec 05 20:31:24 crc kubenswrapper[4904]: E1205 20:31:24.215213 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c241183e-3f0b-4dc2-88d2-ea990c6dc71d" containerName="config-reloader"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.215221 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="c241183e-3f0b-4dc2-88d2-ea990c6dc71d" containerName="config-reloader"
Dec 05 20:31:24 crc kubenswrapper[4904]: E1205 20:31:24.215235 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="346e1889-b00c-462f-8b7f-dbcbc159f6ae" containerName="mariadb-account-create-update"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.215243 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="346e1889-b00c-462f-8b7f-dbcbc159f6ae" containerName="mariadb-account-create-update"
Dec 05 20:31:24 crc kubenswrapper[4904]: E1205 20:31:24.215255 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0e2f8b9-44dd-4a78-9ce8-8e9f401be67e" containerName="mariadb-database-create"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.215263 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0e2f8b9-44dd-4a78-9ce8-8e9f401be67e" containerName="mariadb-database-create"
Dec 05 20:31:24 crc kubenswrapper[4904]: E1205 20:31:24.215280 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b80a797-4212-4242-81fd-928045b629cd" containerName="swift-ring-rebalance"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.215287 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b80a797-4212-4242-81fd-928045b629cd" containerName="swift-ring-rebalance"
Dec 05 20:31:24 crc kubenswrapper[4904]: E1205 20:31:24.215300 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c241183e-3f0b-4dc2-88d2-ea990c6dc71d" containerName="init-config-reloader"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.215308 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="c241183e-3f0b-4dc2-88d2-ea990c6dc71d" containerName="init-config-reloader"
Dec 05 20:31:24 crc kubenswrapper[4904]: E1205 20:31:24.215322 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca" containerName="init"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.215329 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca" containerName="init"
Dec 05 20:31:24 crc kubenswrapper[4904]: E1205 20:31:24.215342 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c241183e-3f0b-4dc2-88d2-ea990c6dc71d" containerName="prometheus"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.215350 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="c241183e-3f0b-4dc2-88d2-ea990c6dc71d" containerName="prometheus"
Dec 05 20:31:24 crc kubenswrapper[4904]: E1205 20:31:24.215362 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca" containerName="dnsmasq-dns"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.215369 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca" containerName="dnsmasq-dns"
Dec 05 20:31:24 crc kubenswrapper[4904]: E1205 20:31:24.215390 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4852903-4c6d-47bf-a507-0c1f700d3470" containerName="mariadb-database-create"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.215398 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4852903-4c6d-47bf-a507-0c1f700d3470" containerName="mariadb-database-create"
Dec 05 20:31:24 crc kubenswrapper[4904]: E1205 20:31:24.215412 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c241183e-3f0b-4dc2-88d2-ea990c6dc71d" containerName="thanos-sidecar"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.215419 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="c241183e-3f0b-4dc2-88d2-ea990c6dc71d" containerName="thanos-sidecar"
Dec 05 20:31:24 crc kubenswrapper[4904]: E1205 20:31:24.215429 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fc955d4-a981-42e4-b327-558d95c0a9c0" containerName="mariadb-account-create-update"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.215437 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fc955d4-a981-42e4-b327-558d95c0a9c0" containerName="mariadb-account-create-update"
Dec 05 20:31:24 crc kubenswrapper[4904]: E1205 20:31:24.215448 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a294d371-9e50-48f4-9774-bfb2f8014ea6" containerName="mariadb-account-create-update"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.215457 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="a294d371-9e50-48f4-9774-bfb2f8014ea6" containerName="mariadb-account-create-update"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.215656 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fc955d4-a981-42e4-b327-558d95c0a9c0" containerName="mariadb-account-create-update"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.215678 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c5a51a5-3628-495b-a5c2-f29846ae9eb4" containerName="mariadb-database-create"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.215690 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="c241183e-3f0b-4dc2-88d2-ea990c6dc71d" containerName="config-reloader"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.215712 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="c241183e-3f0b-4dc2-88d2-ea990c6dc71d" containerName="thanos-sidecar"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.215731 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b80a797-4212-4242-81fd-928045b629cd" containerName="swift-ring-rebalance"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.215742 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="a294d371-9e50-48f4-9774-bfb2f8014ea6" containerName="mariadb-account-create-update"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.215760 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4852903-4c6d-47bf-a507-0c1f700d3470" containerName="mariadb-database-create"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.215780 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="346e1889-b00c-462f-8b7f-dbcbc159f6ae" containerName="mariadb-account-create-update"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.215793 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8ba79e6-1b6a-4dd5-b2c5-8b5d2a7264ca" containerName="dnsmasq-dns"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.215802 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0e2f8b9-44dd-4a78-9ce8-8e9f401be67e" containerName="mariadb-database-create"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.215812 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="c241183e-3f0b-4dc2-88d2-ea990c6dc71d" containerName="prometheus"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.217759 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.222763 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.223075 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-c2mnb"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.223189 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.223568 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.223673 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.225511 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.232993 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.237996 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.350587 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/99273c11-759d-4f83-a065-bf008d6a110f-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"99273c11-759d-4f83-a065-bf008d6a110f\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.350641 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/99273c11-759d-4f83-a065-bf008d6a110f-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"99273c11-759d-4f83-a065-bf008d6a110f\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.350719 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/99273c11-759d-4f83-a065-bf008d6a110f-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"99273c11-759d-4f83-a065-bf008d6a110f\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.350781 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/99273c11-759d-4f83-a065-bf008d6a110f-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"99273c11-759d-4f83-a065-bf008d6a110f\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.350860 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/99273c11-759d-4f83-a065-bf008d6a110f-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"99273c11-759d-4f83-a065-bf008d6a110f\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.350915 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b90305fc-6b85-4b0e-958e-59ae1d530558\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b90305fc-6b85-4b0e-958e-59ae1d530558\") pod \"prometheus-metric-storage-0\" (UID: \"99273c11-759d-4f83-a065-bf008d6a110f\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.351039 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xdfl\" (UniqueName: \"kubernetes.io/projected/99273c11-759d-4f83-a065-bf008d6a110f-kube-api-access-9xdfl\") pod \"prometheus-metric-storage-0\" (UID: \"99273c11-759d-4f83-a065-bf008d6a110f\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.351152 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/99273c11-759d-4f83-a065-bf008d6a110f-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"99273c11-759d-4f83-a065-bf008d6a110f\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.351231 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/99273c11-759d-4f83-a065-bf008d6a110f-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"99273c11-759d-4f83-a065-bf008d6a110f\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.351271 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/99273c11-759d-4f83-a065-bf008d6a110f-config\") pod \"prometheus-metric-storage-0\" (UID: \"99273c11-759d-4f83-a065-bf008d6a110f\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.351299 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99273c11-759d-4f83-a065-bf008d6a110f-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"99273c11-759d-4f83-a065-bf008d6a110f\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.453336 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/99273c11-759d-4f83-a065-bf008d6a110f-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"99273c11-759d-4f83-a065-bf008d6a110f\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.455826 4904 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.455862 4904 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b90305fc-6b85-4b0e-958e-59ae1d530558\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b90305fc-6b85-4b0e-958e-59ae1d530558\") pod \"prometheus-metric-storage-0\" (UID: \"99273c11-759d-4f83-a065-bf008d6a110f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4c2522b7d526507cd9f6194376dadc5aee47822a6206b438630738242eaba537/globalmount\"" pod="openstack/prometheus-metric-storage-0"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.456167 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b90305fc-6b85-4b0e-958e-59ae1d530558\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b90305fc-6b85-4b0e-958e-59ae1d530558\") pod \"prometheus-metric-storage-0\" (UID: \"99273c11-759d-4f83-a065-bf008d6a110f\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.456261 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xdfl\" (UniqueName: \"kubernetes.io/projected/99273c11-759d-4f83-a065-bf008d6a110f-kube-api-access-9xdfl\") pod \"prometheus-metric-storage-0\" (UID: \"99273c11-759d-4f83-a065-bf008d6a110f\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.456348 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/99273c11-759d-4f83-a065-bf008d6a110f-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"99273c11-759d-4f83-a065-bf008d6a110f\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.456417 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/99273c11-759d-4f83-a065-bf008d6a110f-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"99273c11-759d-4f83-a065-bf008d6a110f\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.456474 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/99273c11-759d-4f83-a065-bf008d6a110f-config\") pod \"prometheus-metric-storage-0\" (UID: \"99273c11-759d-4f83-a065-bf008d6a110f\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.456516 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99273c11-759d-4f83-a065-bf008d6a110f-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"99273c11-759d-4f83-a065-bf008d6a110f\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.456568 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/99273c11-759d-4f83-a065-bf008d6a110f-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"99273c11-759d-4f83-a065-bf008d6a110f\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.456602 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/99273c11-759d-4f83-a065-bf008d6a110f-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"99273c11-759d-4f83-a065-bf008d6a110f\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.456658 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/99273c11-759d-4f83-a065-bf008d6a110f-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"99273c11-759d-4f83-a065-bf008d6a110f\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.456692 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/99273c11-759d-4f83-a065-bf008d6a110f-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"99273c11-759d-4f83-a065-bf008d6a110f\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.457813 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/99273c11-759d-4f83-a065-bf008d6a110f-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"99273c11-759d-4f83-a065-bf008d6a110f\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.458889 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/99273c11-759d-4f83-a065-bf008d6a110f-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"99273c11-759d-4f83-a065-bf008d6a110f\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.459689 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/99273c11-759d-4f83-a065-bf008d6a110f-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"99273c11-759d-4f83-a065-bf008d6a110f\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.460082 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/99273c11-759d-4f83-a065-bf008d6a110f-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"99273c11-759d-4f83-a065-bf008d6a110f\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.460628 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/99273c11-759d-4f83-a065-bf008d6a110f-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"99273c11-759d-4f83-a065-bf008d6a110f\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.461192 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99273c11-759d-4f83-a065-bf008d6a110f-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"99273c11-759d-4f83-a065-bf008d6a110f\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.461233 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/99273c11-759d-4f83-a065-bf008d6a110f-config\") pod \"prometheus-metric-storage-0\" (UID: \"99273c11-759d-4f83-a065-bf008d6a110f\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.467019 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/99273c11-759d-4f83-a065-bf008d6a110f-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"99273c11-759d-4f83-a065-bf008d6a110f\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.467042 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/99273c11-759d-4f83-a065-bf008d6a110f-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"99273c11-759d-4f83-a065-bf008d6a110f\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.481680 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xdfl\" (UniqueName: \"kubernetes.io/projected/99273c11-759d-4f83-a065-bf008d6a110f-kube-api-access-9xdfl\") pod \"prometheus-metric-storage-0\" (UID: \"99273c11-759d-4f83-a065-bf008d6a110f\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.496881 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b90305fc-6b85-4b0e-958e-59ae1d530558\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b90305fc-6b85-4b0e-958e-59ae1d530558\") pod \"prometheus-metric-storage-0\" (UID: \"99273c11-759d-4f83-a065-bf008d6a110f\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.534633 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.791186 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Dec 05 20:31:24 crc kubenswrapper[4904]: I1205 20:31:24.984426 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Dec 05 20:31:25 crc kubenswrapper[4904]: I1205 20:31:25.152252 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be5ed3b2-bc48-4865-ade5-f7c2e379a1ea","Type":"ContainerStarted","Data":"a981d9ec7e8a7c679a5567b4a1cf9b5f9d4659d3098fed38f2299af127d56d1e"}
Dec 05 20:31:25 crc kubenswrapper[4904]: I1205 20:31:25.539387 4904 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-hlxgq" podUID="41ef8df6-e0e1-45a5-954d-10ce99fa26de" containerName="ovn-controller" probeResult="failure" output=<
Dec 05 20:31:25 crc kubenswrapper[4904]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Dec 05 20:31:25 crc kubenswrapper[4904]: >
Dec 05 20:31:25 crc kubenswrapper[4904]: I1205 20:31:25.616888 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-h7djt"
Dec 05 20:31:25 crc kubenswrapper[4904]: I1205 20:31:25.617025 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-h7djt"
Dec 05 20:31:25 crc kubenswrapper[4904]: I1205 20:31:25.690958 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c241183e-3f0b-4dc2-88d2-ea990c6dc71d" path="/var/lib/kubelet/pods/c241183e-3f0b-4dc2-88d2-ea990c6dc71d/volumes"
Dec 05 20:31:25 crc kubenswrapper[4904]: I1205 20:31:25.808293 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-hlxgq-config-7lh27"]
Dec 05 20:31:25 crc kubenswrapper[4904]: I1205 20:31:25.809714 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hlxgq-config-7lh27"
Dec 05 20:31:25 crc kubenswrapper[4904]: I1205 20:31:25.812080 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Dec 05 20:31:25 crc kubenswrapper[4904]: I1205 20:31:25.816205 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hlxgq-config-7lh27"]
Dec 05 20:31:25 crc kubenswrapper[4904]: I1205 20:31:25.880084 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f23d7afd-986c-4835-93d5-f19f06624f66-var-log-ovn\") pod \"ovn-controller-hlxgq-config-7lh27\" (UID: \"f23d7afd-986c-4835-93d5-f19f06624f66\") " pod="openstack/ovn-controller-hlxgq-config-7lh27"
Dec 05 20:31:25 crc kubenswrapper[4904]: I1205 20:31:25.880161 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f23d7afd-986c-4835-93d5-f19f06624f66-var-run-ovn\") pod \"ovn-controller-hlxgq-config-7lh27\" (UID: \"f23d7afd-986c-4835-93d5-f19f06624f66\") " pod="openstack/ovn-controller-hlxgq-config-7lh27"
Dec 05 20:31:25 crc kubenswrapper[4904]: I1205 20:31:25.880201 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f23d7afd-986c-4835-93d5-f19f06624f66-var-run\") pod \"ovn-controller-hlxgq-config-7lh27\" (UID: \"f23d7afd-986c-4835-93d5-f19f06624f66\") " pod="openstack/ovn-controller-hlxgq-config-7lh27"
Dec 05 20:31:25 crc kubenswrapper[4904]: I1205 20:31:25.880260 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jm28\" (UniqueName: \"kubernetes.io/projected/f23d7afd-986c-4835-93d5-f19f06624f66-kube-api-access-9jm28\") pod \"ovn-controller-hlxgq-config-7lh27\" (UID: \"f23d7afd-986c-4835-93d5-f19f06624f66\") " pod="openstack/ovn-controller-hlxgq-config-7lh27"
Dec 05 20:31:25 crc kubenswrapper[4904]: I1205 20:31:25.880323 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f23d7afd-986c-4835-93d5-f19f06624f66-additional-scripts\") pod \"ovn-controller-hlxgq-config-7lh27\" (UID: \"f23d7afd-986c-4835-93d5-f19f06624f66\") " pod="openstack/ovn-controller-hlxgq-config-7lh27"
Dec 05 20:31:25 crc kubenswrapper[4904]: I1205 20:31:25.880353 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f23d7afd-986c-4835-93d5-f19f06624f66-scripts\") pod \"ovn-controller-hlxgq-config-7lh27\" (UID: \"f23d7afd-986c-4835-93d5-f19f06624f66\") " pod="openstack/ovn-controller-hlxgq-config-7lh27"
Dec 05 20:31:25 crc kubenswrapper[4904]: I1205 20:31:25.982011 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f23d7afd-986c-4835-93d5-f19f06624f66-var-run-ovn\") pod \"ovn-controller-hlxgq-config-7lh27\" (UID: \"f23d7afd-986c-4835-93d5-f19f06624f66\") " pod="openstack/ovn-controller-hlxgq-config-7lh27"
Dec 05 20:31:25 crc kubenswrapper[4904]: I1205 20:31:25.982080 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f23d7afd-986c-4835-93d5-f19f06624f66-var-run\") pod \"ovn-controller-hlxgq-config-7lh27\" (UID: \"f23d7afd-986c-4835-93d5-f19f06624f66\") " pod="openstack/ovn-controller-hlxgq-config-7lh27"
Dec 05 20:31:25 crc kubenswrapper[4904]: I1205 20:31:25.982130 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jm28\" (UniqueName: \"kubernetes.io/projected/f23d7afd-986c-4835-93d5-f19f06624f66-kube-api-access-9jm28\") pod \"ovn-controller-hlxgq-config-7lh27\" (UID: \"f23d7afd-986c-4835-93d5-f19f06624f66\") " pod="openstack/ovn-controller-hlxgq-config-7lh27"
Dec 05 20:31:25 crc kubenswrapper[4904]: I1205 20:31:25.982179 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f23d7afd-986c-4835-93d5-f19f06624f66-additional-scripts\") pod \"ovn-controller-hlxgq-config-7lh27\" (UID: \"f23d7afd-986c-4835-93d5-f19f06624f66\") " pod="openstack/ovn-controller-hlxgq-config-7lh27"
Dec 05 20:31:25 crc kubenswrapper[4904]: I1205 20:31:25.982202 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f23d7afd-986c-4835-93d5-f19f06624f66-scripts\") pod \"ovn-controller-hlxgq-config-7lh27\" (UID: \"f23d7afd-986c-4835-93d5-f19f06624f66\") " pod="openstack/ovn-controller-hlxgq-config-7lh27"
Dec 05 20:31:25 crc kubenswrapper[4904]: I1205 20:31:25.982301 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f23d7afd-986c-4835-93d5-f19f06624f66-var-log-ovn\") pod \"ovn-controller-hlxgq-config-7lh27\" (UID: \"f23d7afd-986c-4835-93d5-f19f06624f66\") " pod="openstack/ovn-controller-hlxgq-config-7lh27"
Dec 05 20:31:25 crc kubenswrapper[4904]: I1205 20:31:25.982587 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f23d7afd-986c-4835-93d5-f19f06624f66-var-log-ovn\") pod \"ovn-controller-hlxgq-config-7lh27\" (UID: \"f23d7afd-986c-4835-93d5-f19f06624f66\") " pod="openstack/ovn-controller-hlxgq-config-7lh27"
Dec 05 20:31:25 crc kubenswrapper[4904]: I1205 20:31:25.982649 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f23d7afd-986c-4835-93d5-f19f06624f66-var-run-ovn\") pod \"ovn-controller-hlxgq-config-7lh27\" (UID: \"f23d7afd-986c-4835-93d5-f19f06624f66\") " pod="openstack/ovn-controller-hlxgq-config-7lh27"
Dec 05 20:31:25 crc kubenswrapper[4904]: I1205 20:31:25.982697 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f23d7afd-986c-4835-93d5-f19f06624f66-var-run\") pod \"ovn-controller-hlxgq-config-7lh27\" (UID: \"f23d7afd-986c-4835-93d5-f19f06624f66\") " pod="openstack/ovn-controller-hlxgq-config-7lh27"
Dec 05 20:31:25 crc kubenswrapper[4904]: I1205 20:31:25.983714 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f23d7afd-986c-4835-93d5-f19f06624f66-additional-scripts\") pod \"ovn-controller-hlxgq-config-7lh27\" (UID: \"f23d7afd-986c-4835-93d5-f19f06624f66\") " pod="openstack/ovn-controller-hlxgq-config-7lh27"
Dec 05 20:31:25 crc kubenswrapper[4904]: I1205 20:31:25.986022 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f23d7afd-986c-4835-93d5-f19f06624f66-scripts\") pod \"ovn-controller-hlxgq-config-7lh27\" (UID: \"f23d7afd-986c-4835-93d5-f19f06624f66\") " pod="openstack/ovn-controller-hlxgq-config-7lh27"
Dec 05 20:31:26 crc kubenswrapper[4904]: I1205 20:31:26.000524 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jm28\" (UniqueName: \"kubernetes.io/projected/f23d7afd-986c-4835-93d5-f19f06624f66-kube-api-access-9jm28\") pod \"ovn-controller-hlxgq-config-7lh27\" (UID: \"f23d7afd-986c-4835-93d5-f19f06624f66\") " pod="openstack/ovn-controller-hlxgq-config-7lh27"
Dec 05 20:31:26 crc kubenswrapper[4904]: I1205 20:31:26.134588 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hlxgq-config-7lh27"
Dec 05 20:31:26 crc kubenswrapper[4904]: W1205 20:31:26.848146 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99273c11_759d_4f83_a065_bf008d6a110f.slice/crio-994c214c3ff01658a0503405b872de1da98a41dd643b14998fe3391201add387 WatchSource:0}: Error finding container 994c214c3ff01658a0503405b872de1da98a41dd643b14998fe3391201add387: Status 404 returned error can't find the container with id 994c214c3ff01658a0503405b872de1da98a41dd643b14998fe3391201add387
Dec 05 20:31:27 crc kubenswrapper[4904]: I1205 20:31:27.177504 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be5ed3b2-bc48-4865-ade5-f7c2e379a1ea","Type":"ContainerStarted","Data":"4ed1e7b606ea7b56118e165105ccecd70b13dca5672418940386258ce72f81de"}
Dec 05 20:31:27 crc kubenswrapper[4904]: I1205 20:31:27.178737 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"99273c11-759d-4f83-a065-bf008d6a110f","Type":"ContainerStarted","Data":"994c214c3ff01658a0503405b872de1da98a41dd643b14998fe3391201add387"}
Dec 05 20:31:27 crc kubenswrapper[4904]: I1205 20:31:27.290954 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hlxgq-config-7lh27"]
Dec 05 20:31:28 crc kubenswrapper[4904]: E1205 20:31:28.077255 4904 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf23d7afd_986c_4835_93d5_f19f06624f66.slice/crio-conmon-5c6250321ba72cb24e5f7c65d6de2e6e678f16ef9ef5b36c9bbe06fcd8b2073e.scope\": RecentStats: unable to find data in memory cache]"
Dec 05 20:31:28 crc kubenswrapper[4904]: I1205 20:31:28.188079 4904 generic.go:334] "Generic (PLEG): container finished" podID="f23d7afd-986c-4835-93d5-f19f06624f66" containerID="5c6250321ba72cb24e5f7c65d6de2e6e678f16ef9ef5b36c9bbe06fcd8b2073e" exitCode=0
Dec 05 20:31:28 crc kubenswrapper[4904]: I1205 20:31:28.188442 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hlxgq-config-7lh27" event={"ID":"f23d7afd-986c-4835-93d5-f19f06624f66","Type":"ContainerDied","Data":"5c6250321ba72cb24e5f7c65d6de2e6e678f16ef9ef5b36c9bbe06fcd8b2073e"}
Dec 05 20:31:28 crc kubenswrapper[4904]: I1205 20:31:28.188471 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hlxgq-config-7lh27" event={"ID":"f23d7afd-986c-4835-93d5-f19f06624f66","Type":"ContainerStarted","Data":"258357f66bf7d7677dd7ecd4234a047c93dcd5dd0ad22d3d79206f4034f7ac41"}
Dec 05 20:31:28 crc kubenswrapper[4904]: I1205 20:31:28.191344 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be5ed3b2-bc48-4865-ade5-f7c2e379a1ea","Type":"ContainerStarted","Data":"ce21dc4c0c7843359d51adb268770a822262ff85cd87d03d424bbbd87ab00ff4"}
Dec 05 20:31:28 crc kubenswrapper[4904]: I1205 20:31:28.191372 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be5ed3b2-bc48-4865-ade5-f7c2e379a1ea","Type":"ContainerStarted","Data":"01ad883d1b09d1b2227957c5917dc1e8454309fcc4d0e7f5e1402a5ee8cbeb05"}
Dec 05 20:31:28 crc kubenswrapper[4904]: I1205 20:31:28.191381 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be5ed3b2-bc48-4865-ade5-f7c2e379a1ea","Type":"ContainerStarted","Data":"b9ae965441a6343e216875aec738c2535ccd2a33f8a2ba833e8157d59299128f"}
Dec 05 20:31:29 crc kubenswrapper[4904]: I1205 20:31:29.863819 4904 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podd85f3a5d-f238-475c-9994-ac5c7fe16490"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podd85f3a5d-f238-475c-9994-ac5c7fe16490] : Timed out while waiting for systemd to remove kubepods-besteffort-podd85f3a5d_f238_475c_9994_ac5c7fe16490.slice"
Dec 05 20:31:30 crc kubenswrapper[4904]: I1205 20:31:30.242316 4904 generic.go:334] "Generic (PLEG): container finished" podID="921f0ddc-4d15-4bfa-9560-7a01eaa3461f" containerID="2b9723d67e3da9257f671f85fca08c79229416b8645d9074df450b11cc9cb52a" exitCode=0
Dec 05 20:31:30 crc kubenswrapper[4904]: I1205 20:31:30.242403 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"921f0ddc-4d15-4bfa-9560-7a01eaa3461f","Type":"ContainerDied","Data":"2b9723d67e3da9257f671f85fca08c79229416b8645d9074df450b11cc9cb52a"}
Dec 05 20:31:30 crc kubenswrapper[4904]: I1205 20:31:30.531603 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-hlxgq"
Dec 05 20:31:32 crc kubenswrapper[4904]: I1205 20:31:32.269786 4904 generic.go:334] "Generic (PLEG): container finished" podID="0caaad94-d02e-43da-bf3b-087a5ec8d2f8" containerID="05867c3143e7e550817d33a46cd977302277bd1126239c8ac6e35fcfc5645f6d" exitCode=0
Dec 05 20:31:32 crc kubenswrapper[4904]: I1205 20:31:32.269928 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"0caaad94-d02e-43da-bf3b-087a5ec8d2f8","Type":"ContainerDied","Data":"05867c3143e7e550817d33a46cd977302277bd1126239c8ac6e35fcfc5645f6d"}
Dec 05 20:31:33 crc kubenswrapper[4904]: I1205 20:31:33.563160 4904 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/ovn-controller-hlxgq-config-7lh27" Dec 05 20:31:33 crc kubenswrapper[4904]: I1205 20:31:33.625349 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f23d7afd-986c-4835-93d5-f19f06624f66-var-run-ovn\") pod \"f23d7afd-986c-4835-93d5-f19f06624f66\" (UID: \"f23d7afd-986c-4835-93d5-f19f06624f66\") " Dec 05 20:31:33 crc kubenswrapper[4904]: I1205 20:31:33.625414 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f23d7afd-986c-4835-93d5-f19f06624f66-var-log-ovn\") pod \"f23d7afd-986c-4835-93d5-f19f06624f66\" (UID: \"f23d7afd-986c-4835-93d5-f19f06624f66\") " Dec 05 20:31:33 crc kubenswrapper[4904]: I1205 20:31:33.625454 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f23d7afd-986c-4835-93d5-f19f06624f66-var-run\") pod \"f23d7afd-986c-4835-93d5-f19f06624f66\" (UID: \"f23d7afd-986c-4835-93d5-f19f06624f66\") " Dec 05 20:31:33 crc kubenswrapper[4904]: I1205 20:31:33.625486 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f23d7afd-986c-4835-93d5-f19f06624f66-additional-scripts\") pod \"f23d7afd-986c-4835-93d5-f19f06624f66\" (UID: \"f23d7afd-986c-4835-93d5-f19f06624f66\") " Dec 05 20:31:33 crc kubenswrapper[4904]: I1205 20:31:33.625492 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f23d7afd-986c-4835-93d5-f19f06624f66-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "f23d7afd-986c-4835-93d5-f19f06624f66" (UID: "f23d7afd-986c-4835-93d5-f19f06624f66"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:31:33 crc kubenswrapper[4904]: I1205 20:31:33.625503 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f23d7afd-986c-4835-93d5-f19f06624f66-scripts\") pod \"f23d7afd-986c-4835-93d5-f19f06624f66\" (UID: \"f23d7afd-986c-4835-93d5-f19f06624f66\") " Dec 05 20:31:33 crc kubenswrapper[4904]: I1205 20:31:33.625492 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f23d7afd-986c-4835-93d5-f19f06624f66-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "f23d7afd-986c-4835-93d5-f19f06624f66" (UID: "f23d7afd-986c-4835-93d5-f19f06624f66"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:31:33 crc kubenswrapper[4904]: I1205 20:31:33.625518 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f23d7afd-986c-4835-93d5-f19f06624f66-var-run" (OuterVolumeSpecName: "var-run") pod "f23d7afd-986c-4835-93d5-f19f06624f66" (UID: "f23d7afd-986c-4835-93d5-f19f06624f66"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:31:33 crc kubenswrapper[4904]: I1205 20:31:33.625528 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jm28\" (UniqueName: \"kubernetes.io/projected/f23d7afd-986c-4835-93d5-f19f06624f66-kube-api-access-9jm28\") pod \"f23d7afd-986c-4835-93d5-f19f06624f66\" (UID: \"f23d7afd-986c-4835-93d5-f19f06624f66\") " Dec 05 20:31:33 crc kubenswrapper[4904]: I1205 20:31:33.625869 4904 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f23d7afd-986c-4835-93d5-f19f06624f66-var-run\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:33 crc kubenswrapper[4904]: I1205 20:31:33.625885 4904 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f23d7afd-986c-4835-93d5-f19f06624f66-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:33 crc kubenswrapper[4904]: I1205 20:31:33.625896 4904 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f23d7afd-986c-4835-93d5-f19f06624f66-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:33 crc kubenswrapper[4904]: I1205 20:31:33.626221 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f23d7afd-986c-4835-93d5-f19f06624f66-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "f23d7afd-986c-4835-93d5-f19f06624f66" (UID: "f23d7afd-986c-4835-93d5-f19f06624f66"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:31:33 crc kubenswrapper[4904]: I1205 20:31:33.626454 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f23d7afd-986c-4835-93d5-f19f06624f66-scripts" (OuterVolumeSpecName: "scripts") pod "f23d7afd-986c-4835-93d5-f19f06624f66" (UID: "f23d7afd-986c-4835-93d5-f19f06624f66"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:31:33 crc kubenswrapper[4904]: I1205 20:31:33.630428 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f23d7afd-986c-4835-93d5-f19f06624f66-kube-api-access-9jm28" (OuterVolumeSpecName: "kube-api-access-9jm28") pod "f23d7afd-986c-4835-93d5-f19f06624f66" (UID: "f23d7afd-986c-4835-93d5-f19f06624f66"). InnerVolumeSpecName "kube-api-access-9jm28". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:31:33 crc kubenswrapper[4904]: I1205 20:31:33.727621 4904 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f23d7afd-986c-4835-93d5-f19f06624f66-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:33 crc kubenswrapper[4904]: I1205 20:31:33.727661 4904 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f23d7afd-986c-4835-93d5-f19f06624f66-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:33 crc kubenswrapper[4904]: I1205 20:31:33.727670 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jm28\" (UniqueName: \"kubernetes.io/projected/f23d7afd-986c-4835-93d5-f19f06624f66-kube-api-access-9jm28\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:34 crc kubenswrapper[4904]: I1205 20:31:34.292520 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"921f0ddc-4d15-4bfa-9560-7a01eaa3461f","Type":"ContainerStarted","Data":"3425d73c996f99cdeb6580db311aae79e48c21975ce2a8b0d132688204f144bb"} Dec 05 20:31:34 crc kubenswrapper[4904]: I1205 20:31:34.292734 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 05 20:31:34 crc kubenswrapper[4904]: I1205 20:31:34.296905 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hlxgq-config-7lh27" Dec 05 20:31:34 crc kubenswrapper[4904]: I1205 20:31:34.296896 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hlxgq-config-7lh27" event={"ID":"f23d7afd-986c-4835-93d5-f19f06624f66","Type":"ContainerDied","Data":"258357f66bf7d7677dd7ecd4234a047c93dcd5dd0ad22d3d79206f4034f7ac41"} Dec 05 20:31:34 crc kubenswrapper[4904]: I1205 20:31:34.297152 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="258357f66bf7d7677dd7ecd4234a047c93dcd5dd0ad22d3d79206f4034f7ac41" Dec 05 20:31:34 crc kubenswrapper[4904]: I1205 20:31:34.301304 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"0caaad94-d02e-43da-bf3b-087a5ec8d2f8","Type":"ContainerStarted","Data":"7fbf7283e3519978029ead47d66f5017681771536d923dd38998a09e64570e64"} Dec 05 20:31:34 crc kubenswrapper[4904]: I1205 20:31:34.301502 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-notifications-server-0" Dec 05 20:31:34 crc kubenswrapper[4904]: I1205 20:31:34.309633 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be5ed3b2-bc48-4865-ade5-f7c2e379a1ea","Type":"ContainerStarted","Data":"38ff0fe02cdb1cca5b410fc03fc0a959aefb14512d307b3bdc7f99a0015fc45c"} Dec 05 20:31:34 crc kubenswrapper[4904]: I1205 20:31:34.309675 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be5ed3b2-bc48-4865-ade5-f7c2e379a1ea","Type":"ContainerStarted","Data":"971533a233d6d0df828798da7308b8509fbb32c3abefb67a56fa421995d7fde6"} Dec 05 20:31:34 crc kubenswrapper[4904]: I1205 20:31:34.309686 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be5ed3b2-bc48-4865-ade5-f7c2e379a1ea","Type":"ContainerStarted","Data":"d6c5346f27c3ac6ed62416ca3cca1b7c5919f9f55fbe9462079064d26ad001fb"} Dec 05 20:31:34 crc kubenswrapper[4904]: I1205 20:31:34.309694 4904 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be5ed3b2-bc48-4865-ade5-f7c2e379a1ea","Type":"ContainerStarted","Data":"83e6fd840198313070f7c8435c6b7d618cae45ec572ba35afba07512f3ea815d"} Dec 05 20:31:34 crc kubenswrapper[4904]: I1205 20:31:34.322872 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=40.847676122 podStartE2EDuration="1m25.322857354s" podCreationTimestamp="2025-12-05 20:30:09 +0000 UTC" firstStartedPulling="2025-12-05 20:30:11.25857944 +0000 UTC m=+1110.069795549" lastFinishedPulling="2025-12-05 20:30:55.733760672 +0000 UTC m=+1154.544976781" observedRunningTime="2025-12-05 20:31:34.318853182 +0000 UTC m=+1193.130069301" watchObservedRunningTime="2025-12-05 20:31:34.322857354 +0000 UTC m=+1193.134073463" Dec 05 20:31:34 crc kubenswrapper[4904]: I1205 20:31:34.348694 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-notifications-server-0" podStartSLOduration=-9223371952.506096 podStartE2EDuration="1m24.348679301s" podCreationTimestamp="2025-12-05 20:30:10 +0000 UTC" firstStartedPulling="2025-12-05 20:30:11.992312625 +0000 UTC m=+1110.803528734" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:31:34.346342976 +0000 UTC m=+1193.157559105" watchObservedRunningTime="2025-12-05 20:31:34.348679301 +0000 UTC m=+1193.159895410" Dec 05 20:31:34 crc kubenswrapper[4904]: I1205 20:31:34.678432 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-hlxgq-config-7lh27"] Dec 05 20:31:34 crc kubenswrapper[4904]: I1205 20:31:34.689261 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-hlxgq-config-7lh27"] Dec 05 20:31:34 crc kubenswrapper[4904]: I1205 20:31:34.778970 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-hlxgq-config-cp55m"] Dec 05 20:31:34 crc kubenswrapper[4904]: E1205 20:31:34.779656 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f23d7afd-986c-4835-93d5-f19f06624f66" containerName="ovn-config" Dec 05 20:31:34 crc kubenswrapper[4904]: I1205 20:31:34.779753 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="f23d7afd-986c-4835-93d5-f19f06624f66" containerName="ovn-config" Dec 05 20:31:34 crc kubenswrapper[4904]: I1205 20:31:34.780029 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="f23d7afd-986c-4835-93d5-f19f06624f66" containerName="ovn-config" Dec 05 20:31:34 crc kubenswrapper[4904]: I1205 20:31:34.781714 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-hlxgq-config-cp55m" Dec 05 20:31:34 crc kubenswrapper[4904]: I1205 20:31:34.783504 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 05 20:31:34 crc kubenswrapper[4904]: I1205 20:31:34.805893 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hlxgq-config-cp55m"] Dec 05 20:31:34 crc kubenswrapper[4904]: I1205 20:31:34.845424 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fcda485e-18b2-4536-a019-8922707d40dc-var-run-ovn\") pod \"ovn-controller-hlxgq-config-cp55m\" (UID: \"fcda485e-18b2-4536-a019-8922707d40dc\") " pod="openstack/ovn-controller-hlxgq-config-cp55m" Dec 05 20:31:34 crc kubenswrapper[4904]: I1205 20:31:34.845527 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fcda485e-18b2-4536-a019-8922707d40dc-var-run\") pod \"ovn-controller-hlxgq-config-cp55m\" (UID: \"fcda485e-18b2-4536-a019-8922707d40dc\") " pod="openstack/ovn-controller-hlxgq-config-cp55m" Dec 05 20:31:34 crc kubenswrapper[4904]: I1205 20:31:34.845578 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fcda485e-18b2-4536-a019-8922707d40dc-additional-scripts\") pod \"ovn-controller-hlxgq-config-cp55m\" (UID: \"fcda485e-18b2-4536-a019-8922707d40dc\") " pod="openstack/ovn-controller-hlxgq-config-cp55m" Dec 05 20:31:34 crc kubenswrapper[4904]: I1205 20:31:34.845605 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fcda485e-18b2-4536-a019-8922707d40dc-scripts\") pod \"ovn-controller-hlxgq-config-cp55m\" (UID: \"fcda485e-18b2-4536-a019-8922707d40dc\") " pod="openstack/ovn-controller-hlxgq-config-cp55m" Dec 05 20:31:34 crc kubenswrapper[4904]: I1205 20:31:34.845649 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5pln\" (UniqueName: \"kubernetes.io/projected/fcda485e-18b2-4536-a019-8922707d40dc-kube-api-access-z5pln\") pod \"ovn-controller-hlxgq-config-cp55m\" (UID: \"fcda485e-18b2-4536-a019-8922707d40dc\") " pod="openstack/ovn-controller-hlxgq-config-cp55m" Dec 05 20:31:34 crc kubenswrapper[4904]: I1205 20:31:34.845719 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fcda485e-18b2-4536-a019-8922707d40dc-var-log-ovn\") pod \"ovn-controller-hlxgq-config-cp55m\" (UID: \"fcda485e-18b2-4536-a019-8922707d40dc\") " pod="openstack/ovn-controller-hlxgq-config-cp55m" Dec 05 20:31:34 crc kubenswrapper[4904]: I1205 20:31:34.947923 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fcda485e-18b2-4536-a019-8922707d40dc-var-run\") pod \"ovn-controller-hlxgq-config-cp55m\" (UID: \"fcda485e-18b2-4536-a019-8922707d40dc\") " pod="openstack/ovn-controller-hlxgq-config-cp55m" Dec 05 20:31:34 crc kubenswrapper[4904]: I1205 20:31:34.948588 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/fcda485e-18b2-4536-a019-8922707d40dc-additional-scripts\") pod \"ovn-controller-hlxgq-config-cp55m\" (UID: \"fcda485e-18b2-4536-a019-8922707d40dc\") " pod="openstack/ovn-controller-hlxgq-config-cp55m" Dec 05 20:31:34 crc kubenswrapper[4904]: I1205 20:31:34.948716 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fcda485e-18b2-4536-a019-8922707d40dc-scripts\") pod \"ovn-controller-hlxgq-config-cp55m\" (UID: \"fcda485e-18b2-4536-a019-8922707d40dc\") " pod="openstack/ovn-controller-hlxgq-config-cp55m" Dec 05 20:31:34 crc kubenswrapper[4904]: I1205 20:31:34.948856 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5pln\" (UniqueName: \"kubernetes.io/projected/fcda485e-18b2-4536-a019-8922707d40dc-kube-api-access-z5pln\") pod \"ovn-controller-hlxgq-config-cp55m\" (UID: \"fcda485e-18b2-4536-a019-8922707d40dc\") " pod="openstack/ovn-controller-hlxgq-config-cp55m" Dec 05 20:31:34 crc kubenswrapper[4904]: I1205 20:31:34.948971 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fcda485e-18b2-4536-a019-8922707d40dc-var-log-ovn\") pod \"ovn-controller-hlxgq-config-cp55m\" (UID: \"fcda485e-18b2-4536-a019-8922707d40dc\") " pod="openstack/ovn-controller-hlxgq-config-cp55m" Dec 05 20:31:34 crc kubenswrapper[4904]: I1205 20:31:34.949046 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fcda485e-18b2-4536-a019-8922707d40dc-var-log-ovn\") pod \"ovn-controller-hlxgq-config-cp55m\" (UID: \"fcda485e-18b2-4536-a019-8922707d40dc\") " pod="openstack/ovn-controller-hlxgq-config-cp55m" Dec 05 20:31:34 crc kubenswrapper[4904]: I1205 20:31:34.948411 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fcda485e-18b2-4536-a019-8922707d40dc-var-run\") pod \"ovn-controller-hlxgq-config-cp55m\" (UID: \"fcda485e-18b2-4536-a019-8922707d40dc\") " pod="openstack/ovn-controller-hlxgq-config-cp55m" Dec 05 20:31:34 crc kubenswrapper[4904]: I1205 20:31:34.949356 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fcda485e-18b2-4536-a019-8922707d40dc-var-run-ovn\") pod \"ovn-controller-hlxgq-config-cp55m\" (UID: \"fcda485e-18b2-4536-a019-8922707d40dc\") " pod="openstack/ovn-controller-hlxgq-config-cp55m" Dec 05 20:31:34 crc kubenswrapper[4904]: I1205 20:31:34.949466 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fcda485e-18b2-4536-a019-8922707d40dc-var-run-ovn\") pod \"ovn-controller-hlxgq-config-cp55m\" (UID: \"fcda485e-18b2-4536-a019-8922707d40dc\") " pod="openstack/ovn-controller-hlxgq-config-cp55m" Dec 05 20:31:34 crc kubenswrapper[4904]: I1205 20:31:34.951262 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fcda485e-18b2-4536-a019-8922707d40dc-scripts\") pod \"ovn-controller-hlxgq-config-cp55m\" (UID: \"fcda485e-18b2-4536-a019-8922707d40dc\") " pod="openstack/ovn-controller-hlxgq-config-cp55m" Dec 05 20:31:34 crc kubenswrapper[4904]: I1205 20:31:34.951775 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/fcda485e-18b2-4536-a019-8922707d40dc-additional-scripts\") pod \"ovn-controller-hlxgq-config-cp55m\" (UID: \"fcda485e-18b2-4536-a019-8922707d40dc\") " pod="openstack/ovn-controller-hlxgq-config-cp55m" Dec 05 20:31:34 crc kubenswrapper[4904]: I1205 20:31:34.970107 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5pln\" (UniqueName: \"kubernetes.io/projected/fcda485e-18b2-4536-a019-8922707d40dc-kube-api-access-z5pln\") pod \"ovn-controller-hlxgq-config-cp55m\" (UID: \"fcda485e-18b2-4536-a019-8922707d40dc\") " pod="openstack/ovn-controller-hlxgq-config-cp55m" Dec 05 20:31:35 crc kubenswrapper[4904]: I1205 20:31:35.105946 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hlxgq-config-cp55m" Dec 05 20:31:35 crc kubenswrapper[4904]: I1205 20:31:35.320880 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"99273c11-759d-4f83-a065-bf008d6a110f","Type":"ContainerStarted","Data":"e3cd937f13d5690f46fe2b55e7bc6ccedc15fe2bd1e6fbaec5c9f90263b1a790"} Dec 05 20:31:35 crc kubenswrapper[4904]: I1205 20:31:35.325694 4904 generic.go:334] "Generic (PLEG): container finished" podID="0ad24986-23a3-4010-8dcf-6778339691c8" containerID="430dbd00dcef1faa04366e06dfa54515d84f826b7421c7f90e8fbe45ef9d7e65" exitCode=0 Dec 05 20:31:35 crc kubenswrapper[4904]: I1205 20:31:35.325763 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0ad24986-23a3-4010-8dcf-6778339691c8","Type":"ContainerDied","Data":"430dbd00dcef1faa04366e06dfa54515d84f826b7421c7f90e8fbe45ef9d7e65"} Dec 05 20:31:35 crc kubenswrapper[4904]: I1205 20:31:35.343503 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be5ed3b2-bc48-4865-ade5-f7c2e379a1ea","Type":"ContainerStarted","Data":"5552b00d111796f14c23ae1574a0dace44237216d751955457f6280a6b953ea6"} Dec 05 20:31:35 crc kubenswrapper[4904]: I1205 20:31:35.447229 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hlxgq-config-cp55m"] Dec 05 20:31:35 crc kubenswrapper[4904]: I1205 20:31:35.696029 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f23d7afd-986c-4835-93d5-f19f06624f66" path="/var/lib/kubelet/pods/f23d7afd-986c-4835-93d5-f19f06624f66/volumes" Dec 05 20:31:36 crc kubenswrapper[4904]: I1205 20:31:36.363856 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be5ed3b2-bc48-4865-ade5-f7c2e379a1ea","Type":"ContainerStarted","Data":"177b15f3968d598b6a0da92017e3c2a800e6eb567bd886d18d4395b0fd20f392"} Dec 05 20:31:36 crc kubenswrapper[4904]: I1205 20:31:36.364215 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be5ed3b2-bc48-4865-ade5-f7c2e379a1ea","Type":"ContainerStarted","Data":"ef7a769e515aa7677bd4546890a95da7e4840d524e7dfda7bc10d879e64c18c5"} Dec 05 20:31:36 crc kubenswrapper[4904]: I1205 20:31:36.364229 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be5ed3b2-bc48-4865-ade5-f7c2e379a1ea","Type":"ContainerStarted","Data":"5c1a5ab9205df1d160c02f3229a9a6fb63b1926503feb0009c21b61754027940"} Dec 05 20:31:36 crc kubenswrapper[4904]: I1205 20:31:36.364240 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"be5ed3b2-bc48-4865-ade5-f7c2e379a1ea","Type":"ContainerStarted","Data":"ec597ba80294b6db32d9f747083ce77b4a41b708b02603e048f230f41112a4cc"} Dec 05 20:31:36 crc kubenswrapper[4904]: I1205 20:31:36.367199 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0ad24986-23a3-4010-8dcf-6778339691c8","Type":"ContainerStarted","Data":"aaad2dce99bbd67b7a9457b86cd6991da7a24cd85acd1e2bf6f24866edcfded8"} Dec 05 20:31:36 crc kubenswrapper[4904]: I1205 20:31:36.368512 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:31:36 crc kubenswrapper[4904]: I1205 20:31:36.371987 4904 generic.go:334] "Generic (PLEG): container finished" podID="fcda485e-18b2-4536-a019-8922707d40dc" containerID="cd624dd8dedee8ad675e84e8bb0260b665f03b36bb5f85caaf69b04d1a8bffcf" exitCode=0 Dec 05 20:31:36 crc kubenswrapper[4904]: I1205 20:31:36.372203 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hlxgq-config-cp55m" event={"ID":"fcda485e-18b2-4536-a019-8922707d40dc","Type":"ContainerDied","Data":"cd624dd8dedee8ad675e84e8bb0260b665f03b36bb5f85caaf69b04d1a8bffcf"} Dec 05 20:31:36 crc kubenswrapper[4904]: I1205 20:31:36.372259 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hlxgq-config-cp55m" event={"ID":"fcda485e-18b2-4536-a019-8922707d40dc","Type":"ContainerStarted","Data":"e8e899b7af756980deb6da0eacde9ea4ccf56a2a4b19cffbaff89b5f5c667456"} Dec 05 20:31:36 crc kubenswrapper[4904]: I1205 20:31:36.429054 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=-9223371949.425745 podStartE2EDuration="1m27.42903127s" podCreationTimestamp="2025-12-05 20:30:09 +0000 UTC" firstStartedPulling="2025-12-05 20:30:11.633787079 +0000 UTC m=+1110.445003188" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:31:36.420292927 +0000 UTC m=+1195.231509066" watchObservedRunningTime="2025-12-05 20:31:36.42903127 +0000 UTC m=+1195.240247389" Dec 05 20:31:37 crc kubenswrapper[4904]: I1205 20:31:37.388005 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be5ed3b2-bc48-4865-ade5-f7c2e379a1ea","Type":"ContainerStarted","Data":"85338f349709c4c06b5ea9a21550c95c9426ef4e37364f4969c94cf1348c4fb5"} Dec 05 20:31:37 crc kubenswrapper[4904]: I1205 20:31:37.388890 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be5ed3b2-bc48-4865-ade5-f7c2e379a1ea","Type":"ContainerStarted","Data":"187a1a6bf7c045dca1577d28e804e192402272eaf3fe01dc5837ca1cd5fbd685"} Dec 05 20:31:37 crc kubenswrapper[4904]: I1205 20:31:37.430477 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.267168946 podStartE2EDuration="30.430457869s" podCreationTimestamp="2025-12-05 20:31:07 +0000 UTC" firstStartedPulling="2025-12-05 20:31:24.792322941 +0000 UTC m=+1183.603539060" lastFinishedPulling="2025-12-05 20:31:34.955611874 +0000 UTC m=+1193.766827983" observedRunningTime="2025-12-05 20:31:37.423626689 +0000 UTC m=+1196.234842808" watchObservedRunningTime="2025-12-05 20:31:37.430457869 +0000 UTC m=+1196.241673978" Dec 05 20:31:37 crc kubenswrapper[4904]: I1205 20:31:37.725893 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54498d5797-97l4v"] Dec 05 20:31:37 crc kubenswrapper[4904]: I1205 
20:31:37.727892 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54498d5797-97l4v" Dec 05 20:31:37 crc kubenswrapper[4904]: I1205 20:31:37.737706 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54498d5797-97l4v"] Dec 05 20:31:37 crc kubenswrapper[4904]: I1205 20:31:37.738534 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 05 20:31:37 crc kubenswrapper[4904]: I1205 20:31:37.768730 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hlxgq-config-cp55m" Dec 05 20:31:37 crc kubenswrapper[4904]: I1205 20:31:37.809210 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90af75ae-3389-4c14-a59d-eaaeee34e58f-config\") pod \"dnsmasq-dns-54498d5797-97l4v\" (UID: \"90af75ae-3389-4c14-a59d-eaaeee34e58f\") " pod="openstack/dnsmasq-dns-54498d5797-97l4v" Dec 05 20:31:37 crc kubenswrapper[4904]: I1205 20:31:37.809295 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90af75ae-3389-4c14-a59d-eaaeee34e58f-dns-svc\") pod \"dnsmasq-dns-54498d5797-97l4v\" (UID: \"90af75ae-3389-4c14-a59d-eaaeee34e58f\") " pod="openstack/dnsmasq-dns-54498d5797-97l4v" Dec 05 20:31:37 crc kubenswrapper[4904]: I1205 20:31:37.809373 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90af75ae-3389-4c14-a59d-eaaeee34e58f-ovsdbserver-nb\") pod \"dnsmasq-dns-54498d5797-97l4v\" (UID: \"90af75ae-3389-4c14-a59d-eaaeee34e58f\") " pod="openstack/dnsmasq-dns-54498d5797-97l4v" Dec 05 20:31:37 crc kubenswrapper[4904]: I1205 20:31:37.809407 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5kgf\" (UniqueName: \"kubernetes.io/projected/90af75ae-3389-4c14-a59d-eaaeee34e58f-kube-api-access-q5kgf\") pod \"dnsmasq-dns-54498d5797-97l4v\" (UID: \"90af75ae-3389-4c14-a59d-eaaeee34e58f\") " pod="openstack/dnsmasq-dns-54498d5797-97l4v" Dec 05 20:31:37 crc kubenswrapper[4904]: I1205 20:31:37.809492 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90af75ae-3389-4c14-a59d-eaaeee34e58f-dns-swift-storage-0\") pod \"dnsmasq-dns-54498d5797-97l4v\" (UID: \"90af75ae-3389-4c14-a59d-eaaeee34e58f\") " pod="openstack/dnsmasq-dns-54498d5797-97l4v" Dec 05 20:31:37 crc kubenswrapper[4904]: I1205 20:31:37.809513 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90af75ae-3389-4c14-a59d-eaaeee34e58f-ovsdbserver-sb\") pod \"dnsmasq-dns-54498d5797-97l4v\" (UID: \"90af75ae-3389-4c14-a59d-eaaeee34e58f\") " pod="openstack/dnsmasq-dns-54498d5797-97l4v" Dec 05 20:31:37 crc kubenswrapper[4904]: I1205 20:31:37.910912 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fcda485e-18b2-4536-a019-8922707d40dc-var-log-ovn\") pod \"fcda485e-18b2-4536-a019-8922707d40dc\" (UID: \"fcda485e-18b2-4536-a019-8922707d40dc\") " Dec 05 20:31:37 crc kubenswrapper[4904]: I1205 20:31:37.910988 4904 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fcda485e-18b2-4536-a019-8922707d40dc-additional-scripts\") pod \"fcda485e-18b2-4536-a019-8922707d40dc\" (UID: \"fcda485e-18b2-4536-a019-8922707d40dc\") " Dec 05 20:31:37 crc kubenswrapper[4904]: I1205 20:31:37.911026 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fcda485e-18b2-4536-a019-8922707d40dc-var-run-ovn\") pod \"fcda485e-18b2-4536-a019-8922707d40dc\" (UID: \"fcda485e-18b2-4536-a019-8922707d40dc\") " Dec 05 20:31:37 crc kubenswrapper[4904]: I1205 20:31:37.911091 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fcda485e-18b2-4536-a019-8922707d40dc-var-run\") pod \"fcda485e-18b2-4536-a019-8922707d40dc\" (UID: \"fcda485e-18b2-4536-a019-8922707d40dc\") " Dec 05 20:31:37 crc kubenswrapper[4904]: I1205 20:31:37.911116 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5pln\" (UniqueName: \"kubernetes.io/projected/fcda485e-18b2-4536-a019-8922707d40dc-kube-api-access-z5pln\") pod \"fcda485e-18b2-4536-a019-8922707d40dc\" (UID: \"fcda485e-18b2-4536-a019-8922707d40dc\") " Dec 05 20:31:37 crc kubenswrapper[4904]: I1205 20:31:37.911136 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fcda485e-18b2-4536-a019-8922707d40dc-scripts\") pod \"fcda485e-18b2-4536-a019-8922707d40dc\" (UID: \"fcda485e-18b2-4536-a019-8922707d40dc\") " Dec 05 20:31:37 crc kubenswrapper[4904]: I1205 20:31:37.911120 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fcda485e-18b2-4536-a019-8922707d40dc-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "fcda485e-18b2-4536-a019-8922707d40dc" (UID: "fcda485e-18b2-4536-a019-8922707d40dc"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:31:37 crc kubenswrapper[4904]: I1205 20:31:37.911219 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fcda485e-18b2-4536-a019-8922707d40dc-var-run" (OuterVolumeSpecName: "var-run") pod "fcda485e-18b2-4536-a019-8922707d40dc" (UID: "fcda485e-18b2-4536-a019-8922707d40dc"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:31:37 crc kubenswrapper[4904]: I1205 20:31:37.911253 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fcda485e-18b2-4536-a019-8922707d40dc-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "fcda485e-18b2-4536-a019-8922707d40dc" (UID: "fcda485e-18b2-4536-a019-8922707d40dc"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:31:37 crc kubenswrapper[4904]: I1205 20:31:37.911410 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90af75ae-3389-4c14-a59d-eaaeee34e58f-dns-svc\") pod \"dnsmasq-dns-54498d5797-97l4v\" (UID: \"90af75ae-3389-4c14-a59d-eaaeee34e58f\") " pod="openstack/dnsmasq-dns-54498d5797-97l4v" Dec 05 20:31:37 crc kubenswrapper[4904]: I1205 20:31:37.911519 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90af75ae-3389-4c14-a59d-eaaeee34e58f-ovsdbserver-nb\") pod \"dnsmasq-dns-54498d5797-97l4v\" (UID: \"90af75ae-3389-4c14-a59d-eaaeee34e58f\") " pod="openstack/dnsmasq-dns-54498d5797-97l4v" Dec 05 20:31:37 crc kubenswrapper[4904]: I1205 20:31:37.911556 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5kgf\" (UniqueName: \"kubernetes.io/projected/90af75ae-3389-4c14-a59d-eaaeee34e58f-kube-api-access-q5kgf\") pod \"dnsmasq-dns-54498d5797-97l4v\" (UID: \"90af75ae-3389-4c14-a59d-eaaeee34e58f\") " pod="openstack/dnsmasq-dns-54498d5797-97l4v" Dec 05 20:31:37 crc kubenswrapper[4904]: I1205 20:31:37.911610 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90af75ae-3389-4c14-a59d-eaaeee34e58f-ovsdbserver-sb\") pod \"dnsmasq-dns-54498d5797-97l4v\" (UID: \"90af75ae-3389-4c14-a59d-eaaeee34e58f\") " pod="openstack/dnsmasq-dns-54498d5797-97l4v" Dec 05 20:31:37 crc kubenswrapper[4904]: I1205 20:31:37.911633 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90af75ae-3389-4c14-a59d-eaaeee34e58f-dns-swift-storage-0\") pod \"dnsmasq-dns-54498d5797-97l4v\" (UID: \"90af75ae-3389-4c14-a59d-eaaeee34e58f\") " pod="openstack/dnsmasq-dns-54498d5797-97l4v" Dec 05 20:31:37 crc kubenswrapper[4904]: I1205 20:31:37.911682 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90af75ae-3389-4c14-a59d-eaaeee34e58f-config\") pod \"dnsmasq-dns-54498d5797-97l4v\" (UID: \"90af75ae-3389-4c14-a59d-eaaeee34e58f\") " pod="openstack/dnsmasq-dns-54498d5797-97l4v" Dec 05 20:31:37 crc kubenswrapper[4904]: I1205 20:31:37.911752 4904 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fcda485e-18b2-4536-a019-8922707d40dc-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:37 crc kubenswrapper[4904]: I1205 20:31:37.911770 4904 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fcda485e-18b2-4536-a019-8922707d40dc-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:37 crc kubenswrapper[4904]: I1205 20:31:37.911779 4904 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fcda485e-18b2-4536-a019-8922707d40dc-var-run\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:37 crc kubenswrapper[4904]: I1205 20:31:37.911954 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcda485e-18b2-4536-a019-8922707d40dc-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "fcda485e-18b2-4536-a019-8922707d40dc" (UID: "fcda485e-18b2-4536-a019-8922707d40dc"). 
InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:31:37 crc kubenswrapper[4904]: I1205 20:31:37.912698 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90af75ae-3389-4c14-a59d-eaaeee34e58f-config\") pod \"dnsmasq-dns-54498d5797-97l4v\" (UID: \"90af75ae-3389-4c14-a59d-eaaeee34e58f\") " pod="openstack/dnsmasq-dns-54498d5797-97l4v" Dec 05 20:31:37 crc kubenswrapper[4904]: I1205 20:31:37.913389 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcda485e-18b2-4536-a019-8922707d40dc-scripts" (OuterVolumeSpecName: "scripts") pod "fcda485e-18b2-4536-a019-8922707d40dc" (UID: "fcda485e-18b2-4536-a019-8922707d40dc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:31:37 crc kubenswrapper[4904]: I1205 20:31:37.913588 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90af75ae-3389-4c14-a59d-eaaeee34e58f-ovsdbserver-sb\") pod \"dnsmasq-dns-54498d5797-97l4v\" (UID: \"90af75ae-3389-4c14-a59d-eaaeee34e58f\") " pod="openstack/dnsmasq-dns-54498d5797-97l4v" Dec 05 20:31:37 crc kubenswrapper[4904]: I1205 20:31:37.913653 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90af75ae-3389-4c14-a59d-eaaeee34e58f-dns-svc\") pod \"dnsmasq-dns-54498d5797-97l4v\" (UID: \"90af75ae-3389-4c14-a59d-eaaeee34e58f\") " pod="openstack/dnsmasq-dns-54498d5797-97l4v" Dec 05 20:31:37 crc kubenswrapper[4904]: I1205 20:31:37.913748 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90af75ae-3389-4c14-a59d-eaaeee34e58f-dns-swift-storage-0\") pod \"dnsmasq-dns-54498d5797-97l4v\" (UID: \"90af75ae-3389-4c14-a59d-eaaeee34e58f\") " pod="openstack/dnsmasq-dns-54498d5797-97l4v" Dec 05 20:31:37 crc kubenswrapper[4904]: I1205 20:31:37.913765 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90af75ae-3389-4c14-a59d-eaaeee34e58f-ovsdbserver-nb\") pod \"dnsmasq-dns-54498d5797-97l4v\" (UID: \"90af75ae-3389-4c14-a59d-eaaeee34e58f\") " pod="openstack/dnsmasq-dns-54498d5797-97l4v" Dec 05 20:31:37 crc kubenswrapper[4904]: I1205 20:31:37.919944 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcda485e-18b2-4536-a019-8922707d40dc-kube-api-access-z5pln" (OuterVolumeSpecName: "kube-api-access-z5pln") pod "fcda485e-18b2-4536-a019-8922707d40dc" (UID: "fcda485e-18b2-4536-a019-8922707d40dc"). InnerVolumeSpecName "kube-api-access-z5pln". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:31:37 crc kubenswrapper[4904]: I1205 20:31:37.933626 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5kgf\" (UniqueName: \"kubernetes.io/projected/90af75ae-3389-4c14-a59d-eaaeee34e58f-kube-api-access-q5kgf\") pod \"dnsmasq-dns-54498d5797-97l4v\" (UID: \"90af75ae-3389-4c14-a59d-eaaeee34e58f\") " pod="openstack/dnsmasq-dns-54498d5797-97l4v" Dec 05 20:31:38 crc kubenswrapper[4904]: I1205 20:31:38.013439 4904 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fcda485e-18b2-4536-a019-8922707d40dc-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:38 crc kubenswrapper[4904]: I1205 20:31:38.013490 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5pln\" (UniqueName: \"kubernetes.io/projected/fcda485e-18b2-4536-a019-8922707d40dc-kube-api-access-z5pln\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:38 crc kubenswrapper[4904]: I1205 20:31:38.013506 4904 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fcda485e-18b2-4536-a019-8922707d40dc-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:38 crc kubenswrapper[4904]: I1205 20:31:38.080639 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54498d5797-97l4v" Dec 05 20:31:38 crc kubenswrapper[4904]: I1205 20:31:38.397315 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hlxgq-config-cp55m" Dec 05 20:31:38 crc kubenswrapper[4904]: I1205 20:31:38.397648 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hlxgq-config-cp55m" event={"ID":"fcda485e-18b2-4536-a019-8922707d40dc","Type":"ContainerDied","Data":"e8e899b7af756980deb6da0eacde9ea4ccf56a2a4b19cffbaff89b5f5c667456"} Dec 05 20:31:38 crc kubenswrapper[4904]: I1205 20:31:38.397682 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8e899b7af756980deb6da0eacde9ea4ccf56a2a4b19cffbaff89b5f5c667456" Dec 05 20:31:38 crc kubenswrapper[4904]: I1205 20:31:38.581877 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54498d5797-97l4v"] Dec 05 20:31:38 crc kubenswrapper[4904]: I1205 20:31:38.838421 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-hlxgq-config-cp55m"] Dec 05 20:31:38 crc kubenswrapper[4904]: I1205 20:31:38.846008 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-hlxgq-config-cp55m"] Dec 05 20:31:39 crc kubenswrapper[4904]: I1205 20:31:39.406928 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54498d5797-97l4v" event={"ID":"90af75ae-3389-4c14-a59d-eaaeee34e58f","Type":"ContainerStarted","Data":"eca5bb8b29bf1e606bc87c6928a3ad4ac7f410f8c08fe1946df761c22fd6da30"} Dec 05 20:31:39 crc kubenswrapper[4904]: I1205 20:31:39.406981 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54498d5797-97l4v" event={"ID":"90af75ae-3389-4c14-a59d-eaaeee34e58f","Type":"ContainerStarted","Data":"35b85f2fd588b4e9a2793141145513387c22ef9d1c186ed537d131aad8d2aae1"} Dec 05 20:31:39 crc kubenswrapper[4904]: I1205 20:31:39.693245 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcda485e-18b2-4536-a019-8922707d40dc" 
path="/var/lib/kubelet/pods/fcda485e-18b2-4536-a019-8922707d40dc/volumes" Dec 05 20:31:40 crc kubenswrapper[4904]: I1205 20:31:40.417130 4904 generic.go:334] "Generic (PLEG): container finished" podID="90af75ae-3389-4c14-a59d-eaaeee34e58f" containerID="eca5bb8b29bf1e606bc87c6928a3ad4ac7f410f8c08fe1946df761c22fd6da30" exitCode=0 Dec 05 20:31:40 crc kubenswrapper[4904]: I1205 20:31:40.417182 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54498d5797-97l4v" event={"ID":"90af75ae-3389-4c14-a59d-eaaeee34e58f","Type":"ContainerDied","Data":"eca5bb8b29bf1e606bc87c6928a3ad4ac7f410f8c08fe1946df761c22fd6da30"} Dec 05 20:31:41 crc kubenswrapper[4904]: I1205 20:31:41.427012 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54498d5797-97l4v" event={"ID":"90af75ae-3389-4c14-a59d-eaaeee34e58f","Type":"ContainerStarted","Data":"063c158c88ef755b139fba63fbb815a2f18013203e130151b85cfe74fa9a296c"} Dec 05 20:31:41 crc kubenswrapper[4904]: I1205 20:31:41.427832 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54498d5797-97l4v" Dec 05 20:31:41 crc kubenswrapper[4904]: I1205 20:31:41.449592 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54498d5797-97l4v" podStartSLOduration=4.449552195 podStartE2EDuration="4.449552195s" podCreationTimestamp="2025-12-05 20:31:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:31:41.444483585 +0000 UTC m=+1200.255699694" watchObservedRunningTime="2025-12-05 20:31:41.449552195 +0000 UTC m=+1200.260768314" Dec 05 20:31:43 crc kubenswrapper[4904]: I1205 20:31:43.446904 4904 generic.go:334] "Generic (PLEG): container finished" podID="99273c11-759d-4f83-a065-bf008d6a110f" containerID="e3cd937f13d5690f46fe2b55e7bc6ccedc15fe2bd1e6fbaec5c9f90263b1a790" exitCode=0 Dec 05 20:31:43 crc kubenswrapper[4904]: I1205 20:31:43.448280 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"99273c11-759d-4f83-a065-bf008d6a110f","Type":"ContainerDied","Data":"e3cd937f13d5690f46fe2b55e7bc6ccedc15fe2bd1e6fbaec5c9f90263b1a790"} Dec 05 20:31:44 crc kubenswrapper[4904]: I1205 20:31:44.459717 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"99273c11-759d-4f83-a065-bf008d6a110f","Type":"ContainerStarted","Data":"62b153cfe8e2648d9a8acf0e163629b0c1ab91fdeb7886ad25610a75e9f4efc8"} Dec 05 20:31:47 crc kubenswrapper[4904]: I1205 20:31:47.491106 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"99273c11-759d-4f83-a065-bf008d6a110f","Type":"ContainerStarted","Data":"57dbf94a2709259dac03b7ed9a3a2a2403a945d2f55b8f46f07fa8bc146febbf"} Dec 05 20:31:47 crc kubenswrapper[4904]: I1205 20:31:47.491813 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"99273c11-759d-4f83-a065-bf008d6a110f","Type":"ContainerStarted","Data":"7cbc7bacd9571c9f748aea81fd8c39c95848ff4193b76c21641d0a7598e7c4e1"} Dec 05 20:31:47 crc kubenswrapper[4904]: I1205 20:31:47.539286 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=23.539223589 podStartE2EDuration="23.539223589s" podCreationTimestamp="2025-12-05 20:31:24 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:31:47.522289639 +0000 UTC m=+1206.333505788" watchObservedRunningTime="2025-12-05 20:31:47.539223589 +0000 UTC m=+1206.350439738" Dec 05 20:31:48 crc kubenswrapper[4904]: I1205 20:31:48.082300 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-54498d5797-97l4v" Dec 05 20:31:48 crc kubenswrapper[4904]: I1205 20:31:48.153275 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d4d549fbf-m45vw"] Dec 05 20:31:48 crc kubenswrapper[4904]: I1205 20:31:48.153540 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d4d549fbf-m45vw" podUID="e2a3f1f6-8ec4-4b73-94a1-78f12808fe55" containerName="dnsmasq-dns" containerID="cri-o://35f4a9689c718072f5071f4ffac0118023242067b5cf212b6fb449cf01fb3a74" gracePeriod=10 Dec 05 20:31:48 crc kubenswrapper[4904]: I1205 20:31:48.504330 4904 generic.go:334] "Generic (PLEG): container finished" podID="e2a3f1f6-8ec4-4b73-94a1-78f12808fe55" containerID="35f4a9689c718072f5071f4ffac0118023242067b5cf212b6fb449cf01fb3a74" exitCode=0 Dec 05 20:31:48 crc kubenswrapper[4904]: I1205 20:31:48.504698 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d4d549fbf-m45vw" event={"ID":"e2a3f1f6-8ec4-4b73-94a1-78f12808fe55","Type":"ContainerDied","Data":"35f4a9689c718072f5071f4ffac0118023242067b5cf212b6fb449cf01fb3a74"} Dec 05 20:31:49 crc kubenswrapper[4904]: I1205 20:31:49.092856 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d4d549fbf-m45vw" Dec 05 20:31:49 crc kubenswrapper[4904]: I1205 20:31:49.214920 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2a3f1f6-8ec4-4b73-94a1-78f12808fe55-config\") pod \"e2a3f1f6-8ec4-4b73-94a1-78f12808fe55\" (UID: \"e2a3f1f6-8ec4-4b73-94a1-78f12808fe55\") " Dec 05 20:31:49 crc kubenswrapper[4904]: I1205 20:31:49.214961 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e2a3f1f6-8ec4-4b73-94a1-78f12808fe55-ovsdbserver-sb\") pod \"e2a3f1f6-8ec4-4b73-94a1-78f12808fe55\" (UID: \"e2a3f1f6-8ec4-4b73-94a1-78f12808fe55\") " Dec 05 20:31:49 crc kubenswrapper[4904]: I1205 20:31:49.215713 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2a3f1f6-8ec4-4b73-94a1-78f12808fe55-ovsdbserver-nb\") pod \"e2a3f1f6-8ec4-4b73-94a1-78f12808fe55\" (UID: \"e2a3f1f6-8ec4-4b73-94a1-78f12808fe55\") " Dec 05 20:31:49 crc kubenswrapper[4904]: I1205 20:31:49.215796 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vm79l\" (UniqueName: \"kubernetes.io/projected/e2a3f1f6-8ec4-4b73-94a1-78f12808fe55-kube-api-access-vm79l\") pod \"e2a3f1f6-8ec4-4b73-94a1-78f12808fe55\" (UID: \"e2a3f1f6-8ec4-4b73-94a1-78f12808fe55\") " Dec 05 20:31:49 crc kubenswrapper[4904]: I1205 20:31:49.216136 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2a3f1f6-8ec4-4b73-94a1-78f12808fe55-dns-svc\") pod \"e2a3f1f6-8ec4-4b73-94a1-78f12808fe55\" (UID: \"e2a3f1f6-8ec4-4b73-94a1-78f12808fe55\") " Dec 05 20:31:49 crc kubenswrapper[4904]: I1205 20:31:49.234360 4904 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2a3f1f6-8ec4-4b73-94a1-78f12808fe55-kube-api-access-vm79l" (OuterVolumeSpecName: "kube-api-access-vm79l") pod "e2a3f1f6-8ec4-4b73-94a1-78f12808fe55" (UID: "e2a3f1f6-8ec4-4b73-94a1-78f12808fe55"). InnerVolumeSpecName "kube-api-access-vm79l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:31:49 crc kubenswrapper[4904]: I1205 20:31:49.264261 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2a3f1f6-8ec4-4b73-94a1-78f12808fe55-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e2a3f1f6-8ec4-4b73-94a1-78f12808fe55" (UID: "e2a3f1f6-8ec4-4b73-94a1-78f12808fe55"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:31:49 crc kubenswrapper[4904]: I1205 20:31:49.268349 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2a3f1f6-8ec4-4b73-94a1-78f12808fe55-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e2a3f1f6-8ec4-4b73-94a1-78f12808fe55" (UID: "e2a3f1f6-8ec4-4b73-94a1-78f12808fe55"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:31:49 crc kubenswrapper[4904]: I1205 20:31:49.275645 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2a3f1f6-8ec4-4b73-94a1-78f12808fe55-config" (OuterVolumeSpecName: "config") pod "e2a3f1f6-8ec4-4b73-94a1-78f12808fe55" (UID: "e2a3f1f6-8ec4-4b73-94a1-78f12808fe55"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:31:49 crc kubenswrapper[4904]: I1205 20:31:49.277094 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2a3f1f6-8ec4-4b73-94a1-78f12808fe55-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e2a3f1f6-8ec4-4b73-94a1-78f12808fe55" (UID: "e2a3f1f6-8ec4-4b73-94a1-78f12808fe55"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:31:49 crc kubenswrapper[4904]: I1205 20:31:49.318290 4904 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2a3f1f6-8ec4-4b73-94a1-78f12808fe55-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:49 crc kubenswrapper[4904]: I1205 20:31:49.318323 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vm79l\" (UniqueName: \"kubernetes.io/projected/e2a3f1f6-8ec4-4b73-94a1-78f12808fe55-kube-api-access-vm79l\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:49 crc kubenswrapper[4904]: I1205 20:31:49.318344 4904 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2a3f1f6-8ec4-4b73-94a1-78f12808fe55-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:49 crc kubenswrapper[4904]: I1205 20:31:49.318354 4904 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2a3f1f6-8ec4-4b73-94a1-78f12808fe55-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:49 crc kubenswrapper[4904]: I1205 20:31:49.318362 4904 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e2a3f1f6-8ec4-4b73-94a1-78f12808fe55-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:49 crc kubenswrapper[4904]: I1205 20:31:49.513471 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d4d549fbf-m45vw" event={"ID":"e2a3f1f6-8ec4-4b73-94a1-78f12808fe55","Type":"ContainerDied","Data":"6a318139a8255f79de5c859bc16282894c91042acdd7072586ffb99c61e75470"} Dec 05 20:31:49 crc kubenswrapper[4904]: I1205 20:31:49.513509 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d4d549fbf-m45vw" Dec 05 20:31:49 crc kubenswrapper[4904]: I1205 20:31:49.513551 4904 scope.go:117] "RemoveContainer" containerID="35f4a9689c718072f5071f4ffac0118023242067b5cf212b6fb449cf01fb3a74" Dec 05 20:31:49 crc kubenswrapper[4904]: I1205 20:31:49.535618 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 05 20:31:49 crc kubenswrapper[4904]: I1205 20:31:49.548598 4904 scope.go:117] "RemoveContainer" containerID="f172fe57cf479e24045d628f4b8a29122b29ed742dc96f3af2f99b250799ad36" Dec 05 20:31:49 crc kubenswrapper[4904]: I1205 20:31:49.551380 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d4d549fbf-m45vw"] Dec 05 20:31:49 crc kubenswrapper[4904]: I1205 20:31:49.558015 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d4d549fbf-m45vw"] Dec 05 20:31:49 crc kubenswrapper[4904]: I1205 20:31:49.691995 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2a3f1f6-8ec4-4b73-94a1-78f12808fe55" path="/var/lib/kubelet/pods/e2a3f1f6-8ec4-4b73-94a1-78f12808fe55/volumes" Dec 05 20:31:50 crc kubenswrapper[4904]: I1205 20:31:50.670377 4904 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="921f0ddc-4d15-4bfa-9560-7a01eaa3461f" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.104:5671: connect: connection refused" Dec 05 20:31:50 crc kubenswrapper[4904]: I1205 20:31:50.982602 4904 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="0ad24986-23a3-4010-8dcf-6778339691c8" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.105:5671: connect: connection refused" Dec 05 20:31:51 crc kubenswrapper[4904]: I1205 20:31:51.369619 4904 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-notifications-server-0" podUID="0caaad94-d02e-43da-bf3b-087a5ec8d2f8" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.106:5671: connect: connection refused" Dec 05 20:31:54 crc kubenswrapper[4904]: I1205 20:31:54.535024 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 05 20:31:54 crc kubenswrapper[4904]: I1205 20:31:54.542927 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 05 20:31:54 crc kubenswrapper[4904]: I1205 20:31:54.591347 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 05 20:31:59 crc kubenswrapper[4904]: I1205 20:31:59.956365 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:31:59 crc kubenswrapper[4904]: I1205 20:31:59.958286 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:32:00 crc kubenswrapper[4904]: I1205 20:32:00.670784 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/rabbitmq-server-0" Dec 05 20:32:00 crc kubenswrapper[4904]: I1205 20:32:00.981276 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.156646 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-6wgsf"] Dec 05 20:32:01 crc kubenswrapper[4904]: E1205 20:32:01.156973 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcda485e-18b2-4536-a019-8922707d40dc" containerName="ovn-config" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.156989 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcda485e-18b2-4536-a019-8922707d40dc" containerName="ovn-config" Dec 05 20:32:01 crc kubenswrapper[4904]: E1205 20:32:01.157012 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2a3f1f6-8ec4-4b73-94a1-78f12808fe55" containerName="dnsmasq-dns" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.157020 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2a3f1f6-8ec4-4b73-94a1-78f12808fe55" containerName="dnsmasq-dns" Dec 05 20:32:01 crc kubenswrapper[4904]: E1205 20:32:01.157032 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2a3f1f6-8ec4-4b73-94a1-78f12808fe55" containerName="init" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.157038 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2a3f1f6-8ec4-4b73-94a1-78f12808fe55" containerName="init" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.157215 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcda485e-18b2-4536-a019-8922707d40dc" containerName="ovn-config" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.157229 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2a3f1f6-8ec4-4b73-94a1-78f12808fe55" containerName="dnsmasq-dns" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.157837 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-6wgsf" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.175808 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-6wgsf"] Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.228968 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14b02093-4f8b-4c58-be23-1bcda58307c2-operator-scripts\") pod \"barbican-db-create-6wgsf\" (UID: \"14b02093-4f8b-4c58-be23-1bcda58307c2\") " pod="openstack/barbican-db-create-6wgsf" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.229042 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr4s9\" (UniqueName: \"kubernetes.io/projected/14b02093-4f8b-4c58-be23-1bcda58307c2-kube-api-access-cr4s9\") pod \"barbican-db-create-6wgsf\" (UID: \"14b02093-4f8b-4c58-be23-1bcda58307c2\") " pod="openstack/barbican-db-create-6wgsf" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.250115 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-npdbj"] Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.251203 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-npdbj" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.263357 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-7b45-account-create-update-dzcld"] Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.264487 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-7b45-account-create-update-dzcld" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.268327 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.270027 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-npdbj"] Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.316308 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-7b45-account-create-update-dzcld"] Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.331793 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c42e4b0-e039-4945-8c9c-fe12766434bd-operator-scripts\") pod \"cinder-db-create-npdbj\" (UID: \"5c42e4b0-e039-4945-8c9c-fe12766434bd\") " pod="openstack/cinder-db-create-npdbj" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.331860 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14b02093-4f8b-4c58-be23-1bcda58307c2-operator-scripts\") pod \"barbican-db-create-6wgsf\" (UID: \"14b02093-4f8b-4c58-be23-1bcda58307c2\") " pod="openstack/barbican-db-create-6wgsf" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.331908 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr4s9\" (UniqueName: \"kubernetes.io/projected/14b02093-4f8b-4c58-be23-1bcda58307c2-kube-api-access-cr4s9\") pod \"barbican-db-create-6wgsf\" (UID: \"14b02093-4f8b-4c58-be23-1bcda58307c2\") " pod="openstack/barbican-db-create-6wgsf" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.331959 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk2h2\" (UniqueName: \"kubernetes.io/projected/0451215c-c2fd-4113-b130-948cde7a8537-kube-api-access-bk2h2\") pod \"barbican-7b45-account-create-update-dzcld\" (UID: \"0451215c-c2fd-4113-b130-948cde7a8537\") " pod="openstack/barbican-7b45-account-create-update-dzcld" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.332002 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqlvz\" (UniqueName: \"kubernetes.io/projected/5c42e4b0-e039-4945-8c9c-fe12766434bd-kube-api-access-qqlvz\") pod \"cinder-db-create-npdbj\" (UID: \"5c42e4b0-e039-4945-8c9c-fe12766434bd\") " pod="openstack/cinder-db-create-npdbj" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.332021 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0451215c-c2fd-4113-b130-948cde7a8537-operator-scripts\") pod \"barbican-7b45-account-create-update-dzcld\" (UID: \"0451215c-c2fd-4113-b130-948cde7a8537\") " pod="openstack/barbican-7b45-account-create-update-dzcld" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.334778 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14b02093-4f8b-4c58-be23-1bcda58307c2-operator-scripts\") pod \"barbican-db-create-6wgsf\" (UID: \"14b02093-4f8b-4c58-be23-1bcda58307c2\") " pod="openstack/barbican-db-create-6wgsf" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.358821 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr4s9\" (UniqueName: \"kubernetes.io/projected/14b02093-4f8b-4c58-be23-1bcda58307c2-kube-api-access-cr4s9\") pod \"barbican-db-create-6wgsf\" (UID: \"14b02093-4f8b-4c58-be23-1bcda58307c2\") " pod="openstack/barbican-db-create-6wgsf" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.364632 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-2bc2-account-create-update-f95zs"] Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.365585 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2bc2-account-create-update-f95zs" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.369207 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-notifications-server-0" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.374954 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.394017 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2bc2-account-create-update-f95zs"] Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.435350 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqfxc\" (UniqueName: \"kubernetes.io/projected/b6e3a4d4-d72b-49db-92c9-214e39f84632-kube-api-access-xqfxc\") pod \"cinder-2bc2-account-create-update-f95zs\" (UID: \"b6e3a4d4-d72b-49db-92c9-214e39f84632\") " pod="openstack/cinder-2bc2-account-create-update-f95zs" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.435422 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk2h2\" (UniqueName: \"kubernetes.io/projected/0451215c-c2fd-4113-b130-948cde7a8537-kube-api-access-bk2h2\") pod \"barbican-7b45-account-create-update-dzcld\" (UID: \"0451215c-c2fd-4113-b130-948cde7a8537\") " pod="openstack/barbican-7b45-account-create-update-dzcld" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.435503 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqlvz\" (UniqueName: \"kubernetes.io/projected/5c42e4b0-e039-4945-8c9c-fe12766434bd-kube-api-access-qqlvz\") pod \"cinder-db-create-npdbj\" (UID: \"5c42e4b0-e039-4945-8c9c-fe12766434bd\") " pod="openstack/cinder-db-create-npdbj" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.435524 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0451215c-c2fd-4113-b130-948cde7a8537-operator-scripts\") pod \"barbican-7b45-account-create-update-dzcld\" (UID: \"0451215c-c2fd-4113-b130-948cde7a8537\") " pod="openstack/barbican-7b45-account-create-update-dzcld" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.435559 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c42e4b0-e039-4945-8c9c-fe12766434bd-operator-scripts\") pod \"cinder-db-create-npdbj\" (UID: \"5c42e4b0-e039-4945-8c9c-fe12766434bd\") " 
pod="openstack/cinder-db-create-npdbj" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.435633 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6e3a4d4-d72b-49db-92c9-214e39f84632-operator-scripts\") pod \"cinder-2bc2-account-create-update-f95zs\" (UID: \"b6e3a4d4-d72b-49db-92c9-214e39f84632\") " pod="openstack/cinder-2bc2-account-create-update-f95zs" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.436290 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0451215c-c2fd-4113-b130-948cde7a8537-operator-scripts\") pod \"barbican-7b45-account-create-update-dzcld\" (UID: \"0451215c-c2fd-4113-b130-948cde7a8537\") " pod="openstack/barbican-7b45-account-create-update-dzcld" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.436631 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c42e4b0-e039-4945-8c9c-fe12766434bd-operator-scripts\") pod \"cinder-db-create-npdbj\" (UID: \"5c42e4b0-e039-4945-8c9c-fe12766434bd\") " pod="openstack/cinder-db-create-npdbj" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.457864 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk2h2\" (UniqueName: \"kubernetes.io/projected/0451215c-c2fd-4113-b130-948cde7a8537-kube-api-access-bk2h2\") pod \"barbican-7b45-account-create-update-dzcld\" (UID: \"0451215c-c2fd-4113-b130-948cde7a8537\") " pod="openstack/barbican-7b45-account-create-update-dzcld" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.465560 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqlvz\" (UniqueName: \"kubernetes.io/projected/5c42e4b0-e039-4945-8c9c-fe12766434bd-kube-api-access-qqlvz\") pod \"cinder-db-create-npdbj\" (UID: \"5c42e4b0-e039-4945-8c9c-fe12766434bd\") " pod="openstack/cinder-db-create-npdbj" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.479698 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-6wgsf" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.536527 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqfxc\" (UniqueName: \"kubernetes.io/projected/b6e3a4d4-d72b-49db-92c9-214e39f84632-kube-api-access-xqfxc\") pod \"cinder-2bc2-account-create-update-f95zs\" (UID: \"b6e3a4d4-d72b-49db-92c9-214e39f84632\") " pod="openstack/cinder-2bc2-account-create-update-f95zs" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.536682 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6e3a4d4-d72b-49db-92c9-214e39f84632-operator-scripts\") pod \"cinder-2bc2-account-create-update-f95zs\" (UID: \"b6e3a4d4-d72b-49db-92c9-214e39f84632\") " pod="openstack/cinder-2bc2-account-create-update-f95zs" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.537385 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6e3a4d4-d72b-49db-92c9-214e39f84632-operator-scripts\") pod \"cinder-2bc2-account-create-update-f95zs\" (UID: \"b6e3a4d4-d72b-49db-92c9-214e39f84632\") " pod="openstack/cinder-2bc2-account-create-update-f95zs" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.565470 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqfxc\" (UniqueName: \"kubernetes.io/projected/b6e3a4d4-d72b-49db-92c9-214e39f84632-kube-api-access-xqfxc\") pod \"cinder-2bc2-account-create-update-f95zs\" (UID: \"b6e3a4d4-d72b-49db-92c9-214e39f84632\") " pod="openstack/cinder-2bc2-account-create-update-f95zs" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.569652 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-npdbj" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.583992 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-7b45-account-create-update-dzcld" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.704804 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-6474s"] Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.705800 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-6474s" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.707899 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.708232 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.708433 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.710624 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-ddf7h" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.714467 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-6474s"] Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.726117 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-2bc2-account-create-update-f95zs" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.739767 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e323a682-f130-41c5-b97b-b7cd6ab4aecf-combined-ca-bundle\") pod \"keystone-db-sync-6474s\" (UID: \"e323a682-f130-41c5-b97b-b7cd6ab4aecf\") " pod="openstack/keystone-db-sync-6474s" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.739952 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwc8p\" (UniqueName: \"kubernetes.io/projected/e323a682-f130-41c5-b97b-b7cd6ab4aecf-kube-api-access-lwc8p\") pod \"keystone-db-sync-6474s\" (UID: \"e323a682-f130-41c5-b97b-b7cd6ab4aecf\") " pod="openstack/keystone-db-sync-6474s" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.739999 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e323a682-f130-41c5-b97b-b7cd6ab4aecf-config-data\") pod \"keystone-db-sync-6474s\" (UID: \"e323a682-f130-41c5-b97b-b7cd6ab4aecf\") " pod="openstack/keystone-db-sync-6474s" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.842189 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e323a682-f130-41c5-b97b-b7cd6ab4aecf-combined-ca-bundle\") pod \"keystone-db-sync-6474s\" (UID: \"e323a682-f130-41c5-b97b-b7cd6ab4aecf\") " pod="openstack/keystone-db-sync-6474s" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.842364 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwc8p\" (UniqueName: \"kubernetes.io/projected/e323a682-f130-41c5-b97b-b7cd6ab4aecf-kube-api-access-lwc8p\") pod \"keystone-db-sync-6474s\" (UID: \"e323a682-f130-41c5-b97b-b7cd6ab4aecf\") " pod="openstack/keystone-db-sync-6474s" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.842405 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e323a682-f130-41c5-b97b-b7cd6ab4aecf-config-data\") pod \"keystone-db-sync-6474s\" (UID: \"e323a682-f130-41c5-b97b-b7cd6ab4aecf\") " pod="openstack/keystone-db-sync-6474s" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.848520 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e323a682-f130-41c5-b97b-b7cd6ab4aecf-combined-ca-bundle\") pod \"keystone-db-sync-6474s\" (UID: \"e323a682-f130-41c5-b97b-b7cd6ab4aecf\") " pod="openstack/keystone-db-sync-6474s" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.850668 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e323a682-f130-41c5-b97b-b7cd6ab4aecf-config-data\") pod \"keystone-db-sync-6474s\" (UID: \"e323a682-f130-41c5-b97b-b7cd6ab4aecf\") " pod="openstack/keystone-db-sync-6474s" Dec 05 20:32:01 crc kubenswrapper[4904]: I1205 20:32:01.871276 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwc8p\" (UniqueName: \"kubernetes.io/projected/e323a682-f130-41c5-b97b-b7cd6ab4aecf-kube-api-access-lwc8p\") pod \"keystone-db-sync-6474s\" (UID: \"e323a682-f130-41c5-b97b-b7cd6ab4aecf\") " pod="openstack/keystone-db-sync-6474s" Dec 05 20:32:02 
crc kubenswrapper[4904]: I1205 20:32:02.037563 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-6474s" Dec 05 20:32:02 crc kubenswrapper[4904]: I1205 20:32:02.088164 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-6wgsf"] Dec 05 20:32:02 crc kubenswrapper[4904]: W1205 20:32:02.089276 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14b02093_4f8b_4c58_be23_1bcda58307c2.slice/crio-912ee2bec0bf5b8a00232be9051c5eddda1011ff097e3521a089d93380855b45 WatchSource:0}: Error finding container 912ee2bec0bf5b8a00232be9051c5eddda1011ff097e3521a089d93380855b45: Status 404 returned error can't find the container with id 912ee2bec0bf5b8a00232be9051c5eddda1011ff097e3521a089d93380855b45 Dec 05 20:32:02 crc kubenswrapper[4904]: I1205 20:32:02.156674 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-npdbj"] Dec 05 20:32:02 crc kubenswrapper[4904]: I1205 20:32:02.227360 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-7b45-account-create-update-dzcld"] Dec 05 20:32:02 crc kubenswrapper[4904]: W1205 20:32:02.243323 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0451215c_c2fd_4113_b130_948cde7a8537.slice/crio-2b93a90f285e7dcf157e5dfc804d81af6ed24a1c6acf47b7ca8c74c6a6135f2c WatchSource:0}: Error finding container 2b93a90f285e7dcf157e5dfc804d81af6ed24a1c6acf47b7ca8c74c6a6135f2c: Status 404 returned error can't find the container with id 2b93a90f285e7dcf157e5dfc804d81af6ed24a1c6acf47b7ca8c74c6a6135f2c Dec 05 20:32:02 crc kubenswrapper[4904]: I1205 20:32:02.307948 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2bc2-account-create-update-f95zs"] Dec 05 20:32:02 crc kubenswrapper[4904]: I1205 20:32:02.340197 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-6474s"] Dec 05 20:32:02 crc kubenswrapper[4904]: W1205 20:32:02.365725 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode323a682_f130_41c5_b97b_b7cd6ab4aecf.slice/crio-1cc188c9ab604db095fcf8b868c5a2a56bbfb2edbc6b69a50141208b82bdf72b WatchSource:0}: Error finding container 1cc188c9ab604db095fcf8b868c5a2a56bbfb2edbc6b69a50141208b82bdf72b: Status 404 returned error can't find the container with id 1cc188c9ab604db095fcf8b868c5a2a56bbfb2edbc6b69a50141208b82bdf72b Dec 05 20:32:02 crc kubenswrapper[4904]: I1205 20:32:02.687704 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2bc2-account-create-update-f95zs" event={"ID":"b6e3a4d4-d72b-49db-92c9-214e39f84632","Type":"ContainerStarted","Data":"34be13a802bce6e27c100c3085b8c02a27e009f51795636b75fdc8a933e5113e"} Dec 05 20:32:02 crc kubenswrapper[4904]: I1205 20:32:02.688045 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2bc2-account-create-update-f95zs" event={"ID":"b6e3a4d4-d72b-49db-92c9-214e39f84632","Type":"ContainerStarted","Data":"c0e7ee832acc592c870d277fd865f3fca891cd25b1d2f33973678fc6ee35dded"} Dec 05 20:32:02 crc kubenswrapper[4904]: I1205 20:32:02.691816 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-7b45-account-create-update-dzcld" 
event={"ID":"0451215c-c2fd-4113-b130-948cde7a8537","Type":"ContainerStarted","Data":"4ba2d394450df831d2dfaf242c743a8a1c320f6800383242e9a5c000be4e77a1"} Dec 05 20:32:02 crc kubenswrapper[4904]: I1205 20:32:02.691876 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-7b45-account-create-update-dzcld" event={"ID":"0451215c-c2fd-4113-b130-948cde7a8537","Type":"ContainerStarted","Data":"2b93a90f285e7dcf157e5dfc804d81af6ed24a1c6acf47b7ca8c74c6a6135f2c"} Dec 05 20:32:02 crc kubenswrapper[4904]: I1205 20:32:02.696444 4904 generic.go:334] "Generic (PLEG): container finished" podID="14b02093-4f8b-4c58-be23-1bcda58307c2" containerID="c58b4b306172ccd251ea711a6b0ed7c9e1e33fd1fe64512a2b185fae9f1733f4" exitCode=0 Dec 05 20:32:02 crc kubenswrapper[4904]: I1205 20:32:02.696529 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6wgsf" event={"ID":"14b02093-4f8b-4c58-be23-1bcda58307c2","Type":"ContainerDied","Data":"c58b4b306172ccd251ea711a6b0ed7c9e1e33fd1fe64512a2b185fae9f1733f4"} Dec 05 20:32:02 crc kubenswrapper[4904]: I1205 20:32:02.696562 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6wgsf" event={"ID":"14b02093-4f8b-4c58-be23-1bcda58307c2","Type":"ContainerStarted","Data":"912ee2bec0bf5b8a00232be9051c5eddda1011ff097e3521a089d93380855b45"} Dec 05 20:32:02 crc kubenswrapper[4904]: I1205 20:32:02.698778 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-6474s" event={"ID":"e323a682-f130-41c5-b97b-b7cd6ab4aecf","Type":"ContainerStarted","Data":"1cc188c9ab604db095fcf8b868c5a2a56bbfb2edbc6b69a50141208b82bdf72b"} Dec 05 20:32:02 crc kubenswrapper[4904]: I1205 20:32:02.706472 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-npdbj" event={"ID":"5c42e4b0-e039-4945-8c9c-fe12766434bd","Type":"ContainerStarted","Data":"18b66a1404cd78a9d1ffb1091d77c00d4e63a4385bf007a23b25c8661cf9e341"} Dec 05 20:32:02 crc kubenswrapper[4904]: I1205 20:32:02.706530 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-npdbj" event={"ID":"5c42e4b0-e039-4945-8c9c-fe12766434bd","Type":"ContainerStarted","Data":"03e1f8c5dc1d8e8bf715f92ac3d88d1b6b99ce20d02e145e8d82163027d2e90c"} Dec 05 20:32:02 crc kubenswrapper[4904]: I1205 20:32:02.708756 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-2bc2-account-create-update-f95zs" podStartSLOduration=1.7087383809999999 podStartE2EDuration="1.708738381s" podCreationTimestamp="2025-12-05 20:32:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:32:02.702730404 +0000 UTC m=+1221.513946533" watchObservedRunningTime="2025-12-05 20:32:02.708738381 +0000 UTC m=+1221.519954490" Dec 05 20:32:02 crc kubenswrapper[4904]: I1205 20:32:02.723105 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-7b45-account-create-update-dzcld" podStartSLOduration=1.72308168 podStartE2EDuration="1.72308168s" podCreationTimestamp="2025-12-05 20:32:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:32:02.719196162 +0000 UTC m=+1221.530412281" watchObservedRunningTime="2025-12-05 20:32:02.72308168 +0000 UTC m=+1221.534297789" Dec 05 20:32:02 crc kubenswrapper[4904]: I1205 20:32:02.764498 4904 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/cinder-db-create-npdbj" podStartSLOduration=1.7644823889999999 podStartE2EDuration="1.764482389s" podCreationTimestamp="2025-12-05 20:32:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:32:02.761881547 +0000 UTC m=+1221.573097666" watchObservedRunningTime="2025-12-05 20:32:02.764482389 +0000 UTC m=+1221.575698498" Dec 05 20:32:03 crc kubenswrapper[4904]: I1205 20:32:03.712723 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-sync-dkcsq"] Dec 05 20:32:03 crc kubenswrapper[4904]: I1205 20:32:03.713797 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-dkcsq" Dec 05 20:32:03 crc kubenswrapper[4904]: I1205 20:32:03.716847 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-config-data" Dec 05 20:32:03 crc kubenswrapper[4904]: I1205 20:32:03.717119 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-w4g9x" Dec 05 20:32:03 crc kubenswrapper[4904]: I1205 20:32:03.722313 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-dkcsq"] Dec 05 20:32:03 crc kubenswrapper[4904]: I1205 20:32:03.736221 4904 generic.go:334] "Generic (PLEG): container finished" podID="0451215c-c2fd-4113-b130-948cde7a8537" containerID="4ba2d394450df831d2dfaf242c743a8a1c320f6800383242e9a5c000be4e77a1" exitCode=0 Dec 05 20:32:03 crc kubenswrapper[4904]: I1205 20:32:03.736280 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-7b45-account-create-update-dzcld" event={"ID":"0451215c-c2fd-4113-b130-948cde7a8537","Type":"ContainerDied","Data":"4ba2d394450df831d2dfaf242c743a8a1c320f6800383242e9a5c000be4e77a1"} Dec 05 20:32:03 crc kubenswrapper[4904]: I1205 20:32:03.739266 4904 generic.go:334] "Generic (PLEG): container finished" podID="5c42e4b0-e039-4945-8c9c-fe12766434bd" containerID="18b66a1404cd78a9d1ffb1091d77c00d4e63a4385bf007a23b25c8661cf9e341" exitCode=0 Dec 05 20:32:03 crc kubenswrapper[4904]: I1205 20:32:03.739321 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-npdbj" event={"ID":"5c42e4b0-e039-4945-8c9c-fe12766434bd","Type":"ContainerDied","Data":"18b66a1404cd78a9d1ffb1091d77c00d4e63a4385bf007a23b25c8661cf9e341"} Dec 05 20:32:03 crc kubenswrapper[4904]: I1205 20:32:03.741020 4904 generic.go:334] "Generic (PLEG): container finished" podID="b6e3a4d4-d72b-49db-92c9-214e39f84632" containerID="34be13a802bce6e27c100c3085b8c02a27e009f51795636b75fdc8a933e5113e" exitCode=0 Dec 05 20:32:03 crc kubenswrapper[4904]: I1205 20:32:03.741232 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2bc2-account-create-update-f95zs" event={"ID":"b6e3a4d4-d72b-49db-92c9-214e39f84632","Type":"ContainerDied","Data":"34be13a802bce6e27c100c3085b8c02a27e009f51795636b75fdc8a933e5113e"} Dec 05 20:32:03 crc kubenswrapper[4904]: I1205 20:32:03.783383 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c8de0c-093e-49ca-b80a-cb990b546a50-combined-ca-bundle\") pod \"watcher-db-sync-dkcsq\" (UID: \"73c8de0c-093e-49ca-b80a-cb990b546a50\") " pod="openstack/watcher-db-sync-dkcsq" Dec 05 20:32:03 crc kubenswrapper[4904]: I1205 20:32:03.783441 4904 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwt5l\" (UniqueName: \"kubernetes.io/projected/73c8de0c-093e-49ca-b80a-cb990b546a50-kube-api-access-zwt5l\") pod \"watcher-db-sync-dkcsq\" (UID: \"73c8de0c-093e-49ca-b80a-cb990b546a50\") " pod="openstack/watcher-db-sync-dkcsq" Dec 05 20:32:03 crc kubenswrapper[4904]: I1205 20:32:03.783460 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/73c8de0c-093e-49ca-b80a-cb990b546a50-db-sync-config-data\") pod \"watcher-db-sync-dkcsq\" (UID: \"73c8de0c-093e-49ca-b80a-cb990b546a50\") " pod="openstack/watcher-db-sync-dkcsq" Dec 05 20:32:03 crc kubenswrapper[4904]: I1205 20:32:03.783480 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73c8de0c-093e-49ca-b80a-cb990b546a50-config-data\") pod \"watcher-db-sync-dkcsq\" (UID: \"73c8de0c-093e-49ca-b80a-cb990b546a50\") " pod="openstack/watcher-db-sync-dkcsq" Dec 05 20:32:03 crc kubenswrapper[4904]: I1205 20:32:03.806005 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-hgvtd"] Dec 05 20:32:03 crc kubenswrapper[4904]: I1205 20:32:03.807019 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-hgvtd" Dec 05 20:32:03 crc kubenswrapper[4904]: I1205 20:32:03.836784 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-hgvtd"] Dec 05 20:32:03 crc kubenswrapper[4904]: I1205 20:32:03.885760 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78120578-1e01-44dd-b0ed-908e6d0103df-operator-scripts\") pod \"glance-db-create-hgvtd\" (UID: \"78120578-1e01-44dd-b0ed-908e6d0103df\") " pod="openstack/glance-db-create-hgvtd" Dec 05 20:32:03 crc kubenswrapper[4904]: I1205 20:32:03.885856 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c8de0c-093e-49ca-b80a-cb990b546a50-combined-ca-bundle\") pod \"watcher-db-sync-dkcsq\" (UID: \"73c8de0c-093e-49ca-b80a-cb990b546a50\") " pod="openstack/watcher-db-sync-dkcsq" Dec 05 20:32:03 crc kubenswrapper[4904]: I1205 20:32:03.885879 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvzdv\" (UniqueName: \"kubernetes.io/projected/78120578-1e01-44dd-b0ed-908e6d0103df-kube-api-access-dvzdv\") pod \"glance-db-create-hgvtd\" (UID: \"78120578-1e01-44dd-b0ed-908e6d0103df\") " pod="openstack/glance-db-create-hgvtd" Dec 05 20:32:03 crc kubenswrapper[4904]: I1205 20:32:03.885918 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwt5l\" (UniqueName: \"kubernetes.io/projected/73c8de0c-093e-49ca-b80a-cb990b546a50-kube-api-access-zwt5l\") pod \"watcher-db-sync-dkcsq\" (UID: \"73c8de0c-093e-49ca-b80a-cb990b546a50\") " pod="openstack/watcher-db-sync-dkcsq" Dec 05 20:32:03 crc kubenswrapper[4904]: I1205 20:32:03.885960 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/73c8de0c-093e-49ca-b80a-cb990b546a50-db-sync-config-data\") pod \"watcher-db-sync-dkcsq\" (UID: \"73c8de0c-093e-49ca-b80a-cb990b546a50\") " pod="openstack/watcher-db-sync-dkcsq" 
Dec 05 20:32:03 crc kubenswrapper[4904]: I1205 20:32:03.885988 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73c8de0c-093e-49ca-b80a-cb990b546a50-config-data\") pod \"watcher-db-sync-dkcsq\" (UID: \"73c8de0c-093e-49ca-b80a-cb990b546a50\") " pod="openstack/watcher-db-sync-dkcsq" Dec 05 20:32:03 crc kubenswrapper[4904]: I1205 20:32:03.887252 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-9154-account-create-update-cmzk9"] Dec 05 20:32:03 crc kubenswrapper[4904]: I1205 20:32:03.888698 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9154-account-create-update-cmzk9" Dec 05 20:32:03 crc kubenswrapper[4904]: I1205 20:32:03.895604 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 05 20:32:03 crc kubenswrapper[4904]: I1205 20:32:03.896041 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/73c8de0c-093e-49ca-b80a-cb990b546a50-db-sync-config-data\") pod \"watcher-db-sync-dkcsq\" (UID: \"73c8de0c-093e-49ca-b80a-cb990b546a50\") " pod="openstack/watcher-db-sync-dkcsq" Dec 05 20:32:03 crc kubenswrapper[4904]: I1205 20:32:03.897370 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73c8de0c-093e-49ca-b80a-cb990b546a50-config-data\") pod \"watcher-db-sync-dkcsq\" (UID: \"73c8de0c-093e-49ca-b80a-cb990b546a50\") " pod="openstack/watcher-db-sync-dkcsq" Dec 05 20:32:03 crc kubenswrapper[4904]: I1205 20:32:03.919031 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c8de0c-093e-49ca-b80a-cb990b546a50-combined-ca-bundle\") pod \"watcher-db-sync-dkcsq\" (UID: \"73c8de0c-093e-49ca-b80a-cb990b546a50\") " pod="openstack/watcher-db-sync-dkcsq" Dec 05 20:32:03 crc kubenswrapper[4904]: I1205 20:32:03.944637 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwt5l\" (UniqueName: \"kubernetes.io/projected/73c8de0c-093e-49ca-b80a-cb990b546a50-kube-api-access-zwt5l\") pod \"watcher-db-sync-dkcsq\" (UID: \"73c8de0c-093e-49ca-b80a-cb990b546a50\") " pod="openstack/watcher-db-sync-dkcsq" Dec 05 20:32:03 crc kubenswrapper[4904]: I1205 20:32:03.949272 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-9154-account-create-update-cmzk9"] Dec 05 20:32:03 crc kubenswrapper[4904]: I1205 20:32:03.999354 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fcc402d2-1cfe-4db2-a189-3f376e530162-operator-scripts\") pod \"glance-9154-account-create-update-cmzk9\" (UID: \"fcc402d2-1cfe-4db2-a189-3f376e530162\") " pod="openstack/glance-9154-account-create-update-cmzk9" Dec 05 20:32:03 crc kubenswrapper[4904]: I1205 20:32:03.999412 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pfsl\" (UniqueName: \"kubernetes.io/projected/fcc402d2-1cfe-4db2-a189-3f376e530162-kube-api-access-8pfsl\") pod \"glance-9154-account-create-update-cmzk9\" (UID: \"fcc402d2-1cfe-4db2-a189-3f376e530162\") " pod="openstack/glance-9154-account-create-update-cmzk9" Dec 05 20:32:03 crc kubenswrapper[4904]: I1205 20:32:03.999477 4904 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78120578-1e01-44dd-b0ed-908e6d0103df-operator-scripts\") pod \"glance-db-create-hgvtd\" (UID: \"78120578-1e01-44dd-b0ed-908e6d0103df\") " pod="openstack/glance-db-create-hgvtd" Dec 05 20:32:03 crc kubenswrapper[4904]: I1205 20:32:03.999556 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvzdv\" (UniqueName: \"kubernetes.io/projected/78120578-1e01-44dd-b0ed-908e6d0103df-kube-api-access-dvzdv\") pod \"glance-db-create-hgvtd\" (UID: \"78120578-1e01-44dd-b0ed-908e6d0103df\") " pod="openstack/glance-db-create-hgvtd" Dec 05 20:32:04 crc kubenswrapper[4904]: I1205 20:32:04.001612 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78120578-1e01-44dd-b0ed-908e6d0103df-operator-scripts\") pod \"glance-db-create-hgvtd\" (UID: \"78120578-1e01-44dd-b0ed-908e6d0103df\") " pod="openstack/glance-db-create-hgvtd" Dec 05 20:32:04 crc kubenswrapper[4904]: I1205 20:32:04.022995 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-g6b9b"] Dec 05 20:32:04 crc kubenswrapper[4904]: I1205 20:32:04.025601 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-g6b9b" Dec 05 20:32:04 crc kubenswrapper[4904]: I1205 20:32:04.035412 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-g6b9b"] Dec 05 20:32:04 crc kubenswrapper[4904]: I1205 20:32:04.043575 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvzdv\" (UniqueName: \"kubernetes.io/projected/78120578-1e01-44dd-b0ed-908e6d0103df-kube-api-access-dvzdv\") pod \"glance-db-create-hgvtd\" (UID: \"78120578-1e01-44dd-b0ed-908e6d0103df\") " pod="openstack/glance-db-create-hgvtd" Dec 05 20:32:04 crc kubenswrapper[4904]: I1205 20:32:04.089346 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-dkcsq" Dec 05 20:32:04 crc kubenswrapper[4904]: I1205 20:32:04.100739 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fcc402d2-1cfe-4db2-a189-3f376e530162-operator-scripts\") pod \"glance-9154-account-create-update-cmzk9\" (UID: \"fcc402d2-1cfe-4db2-a189-3f376e530162\") " pod="openstack/glance-9154-account-create-update-cmzk9" Dec 05 20:32:04 crc kubenswrapper[4904]: I1205 20:32:04.100781 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pfsl\" (UniqueName: \"kubernetes.io/projected/fcc402d2-1cfe-4db2-a189-3f376e530162-kube-api-access-8pfsl\") pod \"glance-9154-account-create-update-cmzk9\" (UID: \"fcc402d2-1cfe-4db2-a189-3f376e530162\") " pod="openstack/glance-9154-account-create-update-cmzk9" Dec 05 20:32:04 crc kubenswrapper[4904]: I1205 20:32:04.100829 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dcc2-account-create-update-6b8b8"] Dec 05 20:32:04 crc kubenswrapper[4904]: I1205 20:32:04.100843 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnpjv\" (UniqueName: \"kubernetes.io/projected/f23caf25-58e4-47ac-aa1f-a49e5df687c2-kube-api-access-lnpjv\") pod \"neutron-db-create-g6b9b\" (UID: \"f23caf25-58e4-47ac-aa1f-a49e5df687c2\") " pod="openstack/neutron-db-create-g6b9b" Dec 05 20:32:04 crc kubenswrapper[4904]: I1205 20:32:04.101358 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f23caf25-58e4-47ac-aa1f-a49e5df687c2-operator-scripts\") pod \"neutron-db-create-g6b9b\" (UID: \"f23caf25-58e4-47ac-aa1f-a49e5df687c2\") " pod="openstack/neutron-db-create-g6b9b" Dec 05 20:32:04 crc kubenswrapper[4904]: I1205 20:32:04.101497 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fcc402d2-1cfe-4db2-a189-3f376e530162-operator-scripts\") pod \"glance-9154-account-create-update-cmzk9\" (UID: \"fcc402d2-1cfe-4db2-a189-3f376e530162\") " pod="openstack/glance-9154-account-create-update-cmzk9" Dec 05 20:32:04 crc kubenswrapper[4904]: I1205 20:32:04.102766 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dcc2-account-create-update-6b8b8" Dec 05 20:32:04 crc kubenswrapper[4904]: I1205 20:32:04.105338 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 05 20:32:04 crc kubenswrapper[4904]: I1205 20:32:04.119137 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pfsl\" (UniqueName: \"kubernetes.io/projected/fcc402d2-1cfe-4db2-a189-3f376e530162-kube-api-access-8pfsl\") pod \"glance-9154-account-create-update-cmzk9\" (UID: \"fcc402d2-1cfe-4db2-a189-3f376e530162\") " pod="openstack/glance-9154-account-create-update-cmzk9" Dec 05 20:32:04 crc kubenswrapper[4904]: I1205 20:32:04.121844 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dcc2-account-create-update-6b8b8"] Dec 05 20:32:04 crc kubenswrapper[4904]: I1205 20:32:04.142861 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-hgvtd" Dec 05 20:32:04 crc kubenswrapper[4904]: I1205 20:32:04.203169 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfffc\" (UniqueName: \"kubernetes.io/projected/d40e91b3-9613-4d07-9830-5ec0279bfae3-kube-api-access-jfffc\") pod \"neutron-dcc2-account-create-update-6b8b8\" (UID: \"d40e91b3-9613-4d07-9830-5ec0279bfae3\") " pod="openstack/neutron-dcc2-account-create-update-6b8b8" Dec 05 20:32:04 crc kubenswrapper[4904]: I1205 20:32:04.203338 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnpjv\" (UniqueName: \"kubernetes.io/projected/f23caf25-58e4-47ac-aa1f-a49e5df687c2-kube-api-access-lnpjv\") pod \"neutron-db-create-g6b9b\" (UID: \"f23caf25-58e4-47ac-aa1f-a49e5df687c2\") " pod="openstack/neutron-db-create-g6b9b" Dec 05 20:32:04 crc kubenswrapper[4904]: I1205 20:32:04.203439 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f23caf25-58e4-47ac-aa1f-a49e5df687c2-operator-scripts\") pod \"neutron-db-create-g6b9b\" (UID: \"f23caf25-58e4-47ac-aa1f-a49e5df687c2\") " pod="openstack/neutron-db-create-g6b9b" Dec 05 20:32:04 crc kubenswrapper[4904]: I1205 20:32:04.203585 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d40e91b3-9613-4d07-9830-5ec0279bfae3-operator-scripts\") pod \"neutron-dcc2-account-create-update-6b8b8\" (UID: \"d40e91b3-9613-4d07-9830-5ec0279bfae3\") " pod="openstack/neutron-dcc2-account-create-update-6b8b8" Dec 05 20:32:04 crc kubenswrapper[4904]: I1205 20:32:04.204275 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f23caf25-58e4-47ac-aa1f-a49e5df687c2-operator-scripts\") pod \"neutron-db-create-g6b9b\" (UID: \"f23caf25-58e4-47ac-aa1f-a49e5df687c2\") " pod="openstack/neutron-db-create-g6b9b" Dec 05 20:32:04 crc kubenswrapper[4904]: I1205 20:32:04.224804 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-6wgsf" Dec 05 20:32:04 crc kubenswrapper[4904]: I1205 20:32:04.224857 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnpjv\" (UniqueName: \"kubernetes.io/projected/f23caf25-58e4-47ac-aa1f-a49e5df687c2-kube-api-access-lnpjv\") pod \"neutron-db-create-g6b9b\" (UID: \"f23caf25-58e4-47ac-aa1f-a49e5df687c2\") " pod="openstack/neutron-db-create-g6b9b" Dec 05 20:32:04 crc kubenswrapper[4904]: I1205 20:32:04.304589 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cr4s9\" (UniqueName: \"kubernetes.io/projected/14b02093-4f8b-4c58-be23-1bcda58307c2-kube-api-access-cr4s9\") pod \"14b02093-4f8b-4c58-be23-1bcda58307c2\" (UID: \"14b02093-4f8b-4c58-be23-1bcda58307c2\") " Dec 05 20:32:04 crc kubenswrapper[4904]: I1205 20:32:04.304789 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14b02093-4f8b-4c58-be23-1bcda58307c2-operator-scripts\") pod \"14b02093-4f8b-4c58-be23-1bcda58307c2\" (UID: \"14b02093-4f8b-4c58-be23-1bcda58307c2\") " Dec 05 20:32:04 crc kubenswrapper[4904]: I1205 20:32:04.305318 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d40e91b3-9613-4d07-9830-5ec0279bfae3-operator-scripts\") pod \"neutron-dcc2-account-create-update-6b8b8\" (UID: \"d40e91b3-9613-4d07-9830-5ec0279bfae3\") " pod="openstack/neutron-dcc2-account-create-update-6b8b8" Dec 05 20:32:04 crc kubenswrapper[4904]: I1205 20:32:04.305374 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfffc\" (UniqueName: \"kubernetes.io/projected/d40e91b3-9613-4d07-9830-5ec0279bfae3-kube-api-access-jfffc\") pod \"neutron-dcc2-account-create-update-6b8b8\" (UID: \"d40e91b3-9613-4d07-9830-5ec0279bfae3\") " pod="openstack/neutron-dcc2-account-create-update-6b8b8" Dec 05 20:32:04 crc kubenswrapper[4904]: I1205 20:32:04.306483 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d40e91b3-9613-4d07-9830-5ec0279bfae3-operator-scripts\") pod \"neutron-dcc2-account-create-update-6b8b8\" (UID: \"d40e91b3-9613-4d07-9830-5ec0279bfae3\") " pod="openstack/neutron-dcc2-account-create-update-6b8b8" Dec 05 20:32:04 crc kubenswrapper[4904]: I1205 20:32:04.307017 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14b02093-4f8b-4c58-be23-1bcda58307c2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "14b02093-4f8b-4c58-be23-1bcda58307c2" (UID: "14b02093-4f8b-4c58-be23-1bcda58307c2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:32:04 crc kubenswrapper[4904]: I1205 20:32:04.318747 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14b02093-4f8b-4c58-be23-1bcda58307c2-kube-api-access-cr4s9" (OuterVolumeSpecName: "kube-api-access-cr4s9") pod "14b02093-4f8b-4c58-be23-1bcda58307c2" (UID: "14b02093-4f8b-4c58-be23-1bcda58307c2"). InnerVolumeSpecName "kube-api-access-cr4s9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:32:04 crc kubenswrapper[4904]: I1205 20:32:04.323296 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-9154-account-create-update-cmzk9" Dec 05 20:32:04 crc kubenswrapper[4904]: I1205 20:32:04.332537 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfffc\" (UniqueName: \"kubernetes.io/projected/d40e91b3-9613-4d07-9830-5ec0279bfae3-kube-api-access-jfffc\") pod \"neutron-dcc2-account-create-update-6b8b8\" (UID: \"d40e91b3-9613-4d07-9830-5ec0279bfae3\") " pod="openstack/neutron-dcc2-account-create-update-6b8b8" Dec 05 20:32:04 crc kubenswrapper[4904]: I1205 20:32:04.372464 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-g6b9b" Dec 05 20:32:04 crc kubenswrapper[4904]: I1205 20:32:04.407484 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cr4s9\" (UniqueName: \"kubernetes.io/projected/14b02093-4f8b-4c58-be23-1bcda58307c2-kube-api-access-cr4s9\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:04 crc kubenswrapper[4904]: I1205 20:32:04.407523 4904 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14b02093-4f8b-4c58-be23-1bcda58307c2-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:04 crc kubenswrapper[4904]: I1205 20:32:04.513920 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dcc2-account-create-update-6b8b8" Dec 05 20:32:04 crc kubenswrapper[4904]: I1205 20:32:04.593783 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-dkcsq"] Dec 05 20:32:04 crc kubenswrapper[4904]: W1205 20:32:04.619080 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73c8de0c_093e_49ca_b80a_cb990b546a50.slice/crio-3ae8a4de4e1bafde3b3ad5a614e35cab9537a11d3c33149b7a69ff675e88e1bb WatchSource:0}: Error finding container 3ae8a4de4e1bafde3b3ad5a614e35cab9537a11d3c33149b7a69ff675e88e1bb: Status 404 returned error can't find the container with id 3ae8a4de4e1bafde3b3ad5a614e35cab9537a11d3c33149b7a69ff675e88e1bb Dec 05 20:32:04 crc kubenswrapper[4904]: I1205 20:32:04.686646 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-hgvtd"] Dec 05 20:32:04 crc kubenswrapper[4904]: W1205 20:32:04.689280 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78120578_1e01_44dd_b0ed_908e6d0103df.slice/crio-6cfadcaa5058d957574fafe01c4dfabe440a3ba52416c4fbd02c2b69ae42cbcb WatchSource:0}: Error finding container 6cfadcaa5058d957574fafe01c4dfabe440a3ba52416c4fbd02c2b69ae42cbcb: Status 404 returned error can't find the container with id 6cfadcaa5058d957574fafe01c4dfabe440a3ba52416c4fbd02c2b69ae42cbcb Dec 05 20:32:04 crc kubenswrapper[4904]: I1205 20:32:04.764634 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6wgsf" event={"ID":"14b02093-4f8b-4c58-be23-1bcda58307c2","Type":"ContainerDied","Data":"912ee2bec0bf5b8a00232be9051c5eddda1011ff097e3521a089d93380855b45"} Dec 05 20:32:04 crc kubenswrapper[4904]: I1205 20:32:04.764683 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="912ee2bec0bf5b8a00232be9051c5eddda1011ff097e3521a089d93380855b45" Dec 05 20:32:04 crc kubenswrapper[4904]: I1205 20:32:04.764756 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-6wgsf" Dec 05 20:32:04 crc kubenswrapper[4904]: I1205 20:32:04.773152 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-dkcsq" event={"ID":"73c8de0c-093e-49ca-b80a-cb990b546a50","Type":"ContainerStarted","Data":"3ae8a4de4e1bafde3b3ad5a614e35cab9537a11d3c33149b7a69ff675e88e1bb"} Dec 05 20:32:04 crc kubenswrapper[4904]: I1205 20:32:04.782003 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-hgvtd" event={"ID":"78120578-1e01-44dd-b0ed-908e6d0103df","Type":"ContainerStarted","Data":"6cfadcaa5058d957574fafe01c4dfabe440a3ba52416c4fbd02c2b69ae42cbcb"} Dec 05 20:32:04 crc kubenswrapper[4904]: I1205 20:32:04.808637 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-9154-account-create-update-cmzk9"] Dec 05 20:32:04 crc kubenswrapper[4904]: I1205 20:32:04.912320 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-g6b9b"] Dec 05 20:32:05 crc kubenswrapper[4904]: I1205 20:32:05.045319 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dcc2-account-create-update-6b8b8"] Dec 05 20:32:05 crc kubenswrapper[4904]: I1205 20:32:05.792591 4904 generic.go:334] "Generic (PLEG): container finished" podID="78120578-1e01-44dd-b0ed-908e6d0103df" containerID="14bf7a210112021292eda1a3b7e95d9a478e35bfab32dcb8b7d7695ebdfb3e88" exitCode=0 Dec 05 20:32:05 crc kubenswrapper[4904]: I1205 20:32:05.792676 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-hgvtd" event={"ID":"78120578-1e01-44dd-b0ed-908e6d0103df","Type":"ContainerDied","Data":"14bf7a210112021292eda1a3b7e95d9a478e35bfab32dcb8b7d7695ebdfb3e88"} Dec 05 20:32:08 crc kubenswrapper[4904]: W1205 20:32:08.747360 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf23caf25_58e4_47ac_aa1f_a49e5df687c2.slice/crio-ce74bdcc6559286a7cdb3e8f650fb8354ded0a97b3ea5366016f497aa9004b8f WatchSource:0}: Error finding container ce74bdcc6559286a7cdb3e8f650fb8354ded0a97b3ea5366016f497aa9004b8f: Status 404 returned error can't find the container with id ce74bdcc6559286a7cdb3e8f650fb8354ded0a97b3ea5366016f497aa9004b8f Dec 05 20:32:08 crc kubenswrapper[4904]: I1205 20:32:08.819706 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-g6b9b" event={"ID":"f23caf25-58e4-47ac-aa1f-a49e5df687c2","Type":"ContainerStarted","Data":"ce74bdcc6559286a7cdb3e8f650fb8354ded0a97b3ea5366016f497aa9004b8f"} Dec 05 20:32:11 crc kubenswrapper[4904]: W1205 20:32:11.808565 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd40e91b3_9613_4d07_9830_5ec0279bfae3.slice/crio-ed6fbe0ae6c7a79e97f75e0ea1085eea2e57d5ceb8309168892f9a161796f3d9 WatchSource:0}: Error finding container ed6fbe0ae6c7a79e97f75e0ea1085eea2e57d5ceb8309168892f9a161796f3d9: Status 404 returned error can't find the container with id ed6fbe0ae6c7a79e97f75e0ea1085eea2e57d5ceb8309168892f9a161796f3d9 Dec 05 20:32:11 crc kubenswrapper[4904]: W1205 20:32:11.810072 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfcc402d2_1cfe_4db2_a189_3f376e530162.slice/crio-e77f578396babafd445aeb76e7f8f8472567c9c6f16f4cafaf93c94c5adf221e WatchSource:0}: Error finding container 
e77f578396babafd445aeb76e7f8f8472567c9c6f16f4cafaf93c94c5adf221e: Status 404 returned error can't find the container with id e77f578396babafd445aeb76e7f8f8472567c9c6f16f4cafaf93c94c5adf221e Dec 05 20:32:11 crc kubenswrapper[4904]: I1205 20:32:11.849526 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9154-account-create-update-cmzk9" event={"ID":"fcc402d2-1cfe-4db2-a189-3f376e530162","Type":"ContainerStarted","Data":"e77f578396babafd445aeb76e7f8f8472567c9c6f16f4cafaf93c94c5adf221e"} Dec 05 20:32:11 crc kubenswrapper[4904]: I1205 20:32:11.852838 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dcc2-account-create-update-6b8b8" event={"ID":"d40e91b3-9613-4d07-9830-5ec0279bfae3","Type":"ContainerStarted","Data":"ed6fbe0ae6c7a79e97f75e0ea1085eea2e57d5ceb8309168892f9a161796f3d9"} Dec 05 20:32:11 crc kubenswrapper[4904]: I1205 20:32:11.855228 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2bc2-account-create-update-f95zs" event={"ID":"b6e3a4d4-d72b-49db-92c9-214e39f84632","Type":"ContainerDied","Data":"c0e7ee832acc592c870d277fd865f3fca891cd25b1d2f33973678fc6ee35dded"} Dec 05 20:32:11 crc kubenswrapper[4904]: I1205 20:32:11.855272 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0e7ee832acc592c870d277fd865f3fca891cd25b1d2f33973678fc6ee35dded" Dec 05 20:32:11 crc kubenswrapper[4904]: I1205 20:32:11.857670 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-7b45-account-create-update-dzcld" event={"ID":"0451215c-c2fd-4113-b130-948cde7a8537","Type":"ContainerDied","Data":"2b93a90f285e7dcf157e5dfc804d81af6ed24a1c6acf47b7ca8c74c6a6135f2c"} Dec 05 20:32:11 crc kubenswrapper[4904]: I1205 20:32:11.857713 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b93a90f285e7dcf157e5dfc804d81af6ed24a1c6acf47b7ca8c74c6a6135f2c" Dec 05 20:32:11 crc kubenswrapper[4904]: I1205 20:32:11.859516 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-hgvtd" event={"ID":"78120578-1e01-44dd-b0ed-908e6d0103df","Type":"ContainerDied","Data":"6cfadcaa5058d957574fafe01c4dfabe440a3ba52416c4fbd02c2b69ae42cbcb"} Dec 05 20:32:11 crc kubenswrapper[4904]: I1205 20:32:11.859542 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cfadcaa5058d957574fafe01c4dfabe440a3ba52416c4fbd02c2b69ae42cbcb" Dec 05 20:32:11 crc kubenswrapper[4904]: I1205 20:32:11.861343 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-npdbj" event={"ID":"5c42e4b0-e039-4945-8c9c-fe12766434bd","Type":"ContainerDied","Data":"03e1f8c5dc1d8e8bf715f92ac3d88d1b6b99ce20d02e145e8d82163027d2e90c"} Dec 05 20:32:11 crc kubenswrapper[4904]: I1205 20:32:11.861377 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03e1f8c5dc1d8e8bf715f92ac3d88d1b6b99ce20d02e145e8d82163027d2e90c" Dec 05 20:32:11 crc kubenswrapper[4904]: I1205 20:32:11.954867 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-npdbj" Dec 05 20:32:11 crc kubenswrapper[4904]: I1205 20:32:11.961619 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2bc2-account-create-update-f95zs" Dec 05 20:32:11 crc kubenswrapper[4904]: I1205 20:32:11.977637 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-7b45-account-create-update-dzcld" Dec 05 20:32:11 crc kubenswrapper[4904]: I1205 20:32:11.986090 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-hgvtd" Dec 05 20:32:12 crc kubenswrapper[4904]: I1205 20:32:12.059966 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78120578-1e01-44dd-b0ed-908e6d0103df-operator-scripts\") pod \"78120578-1e01-44dd-b0ed-908e6d0103df\" (UID: \"78120578-1e01-44dd-b0ed-908e6d0103df\") " Dec 05 20:32:12 crc kubenswrapper[4904]: I1205 20:32:12.060009 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqfxc\" (UniqueName: \"kubernetes.io/projected/b6e3a4d4-d72b-49db-92c9-214e39f84632-kube-api-access-xqfxc\") pod \"b6e3a4d4-d72b-49db-92c9-214e39f84632\" (UID: \"b6e3a4d4-d72b-49db-92c9-214e39f84632\") " Dec 05 20:32:12 crc kubenswrapper[4904]: I1205 20:32:12.060029 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6e3a4d4-d72b-49db-92c9-214e39f84632-operator-scripts\") pod \"b6e3a4d4-d72b-49db-92c9-214e39f84632\" (UID: \"b6e3a4d4-d72b-49db-92c9-214e39f84632\") " Dec 05 20:32:12 crc kubenswrapper[4904]: I1205 20:32:12.060046 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c42e4b0-e039-4945-8c9c-fe12766434bd-operator-scripts\") pod \"5c42e4b0-e039-4945-8c9c-fe12766434bd\" (UID: \"5c42e4b0-e039-4945-8c9c-fe12766434bd\") " Dec 05 20:32:12 crc kubenswrapper[4904]: I1205 20:32:12.060106 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqlvz\" (UniqueName: \"kubernetes.io/projected/5c42e4b0-e039-4945-8c9c-fe12766434bd-kube-api-access-qqlvz\") pod \"5c42e4b0-e039-4945-8c9c-fe12766434bd\" (UID: \"5c42e4b0-e039-4945-8c9c-fe12766434bd\") " Dec 05 20:32:12 crc kubenswrapper[4904]: I1205 20:32:12.060135 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvzdv\" (UniqueName: \"kubernetes.io/projected/78120578-1e01-44dd-b0ed-908e6d0103df-kube-api-access-dvzdv\") pod \"78120578-1e01-44dd-b0ed-908e6d0103df\" (UID: \"78120578-1e01-44dd-b0ed-908e6d0103df\") " Dec 05 20:32:12 crc kubenswrapper[4904]: I1205 20:32:12.060155 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bk2h2\" (UniqueName: \"kubernetes.io/projected/0451215c-c2fd-4113-b130-948cde7a8537-kube-api-access-bk2h2\") pod \"0451215c-c2fd-4113-b130-948cde7a8537\" (UID: \"0451215c-c2fd-4113-b130-948cde7a8537\") " Dec 05 20:32:12 crc kubenswrapper[4904]: I1205 20:32:12.060179 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0451215c-c2fd-4113-b130-948cde7a8537-operator-scripts\") pod \"0451215c-c2fd-4113-b130-948cde7a8537\" (UID: \"0451215c-c2fd-4113-b130-948cde7a8537\") " Dec 05 20:32:12 crc kubenswrapper[4904]: I1205 20:32:12.060847 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0451215c-c2fd-4113-b130-948cde7a8537-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0451215c-c2fd-4113-b130-948cde7a8537" (UID: "0451215c-c2fd-4113-b130-948cde7a8537"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:32:12 crc kubenswrapper[4904]: I1205 20:32:12.060847 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c42e4b0-e039-4945-8c9c-fe12766434bd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5c42e4b0-e039-4945-8c9c-fe12766434bd" (UID: "5c42e4b0-e039-4945-8c9c-fe12766434bd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:32:12 crc kubenswrapper[4904]: I1205 20:32:12.061308 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6e3a4d4-d72b-49db-92c9-214e39f84632-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b6e3a4d4-d72b-49db-92c9-214e39f84632" (UID: "b6e3a4d4-d72b-49db-92c9-214e39f84632"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:32:12 crc kubenswrapper[4904]: I1205 20:32:12.061803 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78120578-1e01-44dd-b0ed-908e6d0103df-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "78120578-1e01-44dd-b0ed-908e6d0103df" (UID: "78120578-1e01-44dd-b0ed-908e6d0103df"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:32:12 crc kubenswrapper[4904]: I1205 20:32:12.066229 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0451215c-c2fd-4113-b130-948cde7a8537-kube-api-access-bk2h2" (OuterVolumeSpecName: "kube-api-access-bk2h2") pod "0451215c-c2fd-4113-b130-948cde7a8537" (UID: "0451215c-c2fd-4113-b130-948cde7a8537"). InnerVolumeSpecName "kube-api-access-bk2h2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:32:12 crc kubenswrapper[4904]: I1205 20:32:12.066550 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6e3a4d4-d72b-49db-92c9-214e39f84632-kube-api-access-xqfxc" (OuterVolumeSpecName: "kube-api-access-xqfxc") pod "b6e3a4d4-d72b-49db-92c9-214e39f84632" (UID: "b6e3a4d4-d72b-49db-92c9-214e39f84632"). InnerVolumeSpecName "kube-api-access-xqfxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:32:12 crc kubenswrapper[4904]: I1205 20:32:12.068708 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c42e4b0-e039-4945-8c9c-fe12766434bd-kube-api-access-qqlvz" (OuterVolumeSpecName: "kube-api-access-qqlvz") pod "5c42e4b0-e039-4945-8c9c-fe12766434bd" (UID: "5c42e4b0-e039-4945-8c9c-fe12766434bd"). InnerVolumeSpecName "kube-api-access-qqlvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:32:12 crc kubenswrapper[4904]: I1205 20:32:12.068797 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78120578-1e01-44dd-b0ed-908e6d0103df-kube-api-access-dvzdv" (OuterVolumeSpecName: "kube-api-access-dvzdv") pod "78120578-1e01-44dd-b0ed-908e6d0103df" (UID: "78120578-1e01-44dd-b0ed-908e6d0103df"). InnerVolumeSpecName "kube-api-access-dvzdv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:32:12 crc kubenswrapper[4904]: I1205 20:32:12.162451 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqfxc\" (UniqueName: \"kubernetes.io/projected/b6e3a4d4-d72b-49db-92c9-214e39f84632-kube-api-access-xqfxc\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:12 crc kubenswrapper[4904]: I1205 20:32:12.162800 4904 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6e3a4d4-d72b-49db-92c9-214e39f84632-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:12 crc kubenswrapper[4904]: I1205 20:32:12.162815 4904 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c42e4b0-e039-4945-8c9c-fe12766434bd-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:12 crc kubenswrapper[4904]: I1205 20:32:12.162826 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqlvz\" (UniqueName: \"kubernetes.io/projected/5c42e4b0-e039-4945-8c9c-fe12766434bd-kube-api-access-qqlvz\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:12 crc kubenswrapper[4904]: I1205 20:32:12.162837 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvzdv\" (UniqueName: \"kubernetes.io/projected/78120578-1e01-44dd-b0ed-908e6d0103df-kube-api-access-dvzdv\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:12 crc kubenswrapper[4904]: I1205 20:32:12.162846 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bk2h2\" (UniqueName: \"kubernetes.io/projected/0451215c-c2fd-4113-b130-948cde7a8537-kube-api-access-bk2h2\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:12 crc kubenswrapper[4904]: I1205 20:32:12.162854 4904 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0451215c-c2fd-4113-b130-948cde7a8537-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:12 crc kubenswrapper[4904]: I1205 20:32:12.162862 4904 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78120578-1e01-44dd-b0ed-908e6d0103df-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:12 crc kubenswrapper[4904]: I1205 20:32:12.873235 4904 generic.go:334] "Generic (PLEG): container finished" podID="f23caf25-58e4-47ac-aa1f-a49e5df687c2" containerID="bfd86e47d10357a634ac3b5ae1cdb7655dbad01965dfd1a6d945f1b97e4c27e2" exitCode=0 Dec 05 20:32:12 crc kubenswrapper[4904]: I1205 20:32:12.873307 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-g6b9b" event={"ID":"f23caf25-58e4-47ac-aa1f-a49e5df687c2","Type":"ContainerDied","Data":"bfd86e47d10357a634ac3b5ae1cdb7655dbad01965dfd1a6d945f1b97e4c27e2"} Dec 05 20:32:12 crc kubenswrapper[4904]: I1205 20:32:12.874821 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-6474s" event={"ID":"e323a682-f130-41c5-b97b-b7cd6ab4aecf","Type":"ContainerStarted","Data":"3d5bca8cafc334fa0ef45643833874577b34af2e5400587d21e0a38e326bc937"} Dec 05 20:32:12 crc kubenswrapper[4904]: I1205 20:32:12.877764 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-dkcsq" event={"ID":"73c8de0c-093e-49ca-b80a-cb990b546a50","Type":"ContainerStarted","Data":"25a406ebf7dbf4413fbcbc4dbbf6abc7b563f1092458be1a59a7f5bafa31ac27"} Dec 05 20:32:12 crc kubenswrapper[4904]: I1205 20:32:12.878992 4904 generic.go:334] "Generic (PLEG): 
container finished" podID="fcc402d2-1cfe-4db2-a189-3f376e530162" containerID="5c93fb6698dc26948b69efc50059e9b2b946fec830fbe9895bb1185c5bcc06ba" exitCode=0 Dec 05 20:32:12 crc kubenswrapper[4904]: I1205 20:32:12.879045 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9154-account-create-update-cmzk9" event={"ID":"fcc402d2-1cfe-4db2-a189-3f376e530162","Type":"ContainerDied","Data":"5c93fb6698dc26948b69efc50059e9b2b946fec830fbe9895bb1185c5bcc06ba"} Dec 05 20:32:12 crc kubenswrapper[4904]: I1205 20:32:12.880383 4904 generic.go:334] "Generic (PLEG): container finished" podID="d40e91b3-9613-4d07-9830-5ec0279bfae3" containerID="ebf0c792ade215724a97fd3925d07c9598ca6c0ff5d45f9a735449411bd4811f" exitCode=0 Dec 05 20:32:12 crc kubenswrapper[4904]: I1205 20:32:12.880437 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-7b45-account-create-update-dzcld" Dec 05 20:32:12 crc kubenswrapper[4904]: I1205 20:32:12.883653 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dcc2-account-create-update-6b8b8" event={"ID":"d40e91b3-9613-4d07-9830-5ec0279bfae3","Type":"ContainerDied","Data":"ebf0c792ade215724a97fd3925d07c9598ca6c0ff5d45f9a735449411bd4811f"} Dec 05 20:32:12 crc kubenswrapper[4904]: I1205 20:32:12.883786 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-hgvtd" Dec 05 20:32:12 crc kubenswrapper[4904]: I1205 20:32:12.883833 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2bc2-account-create-update-f95zs" Dec 05 20:32:12 crc kubenswrapper[4904]: I1205 20:32:12.883847 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-npdbj" Dec 05 20:32:12 crc kubenswrapper[4904]: I1205 20:32:12.926290 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-6474s" podStartSLOduration=1.884193394 podStartE2EDuration="11.926273123s" podCreationTimestamp="2025-12-05 20:32:01 +0000 UTC" firstStartedPulling="2025-12-05 20:32:02.371357373 +0000 UTC m=+1221.182573482" lastFinishedPulling="2025-12-05 20:32:12.413437102 +0000 UTC m=+1231.224653211" observedRunningTime="2025-12-05 20:32:12.918783725 +0000 UTC m=+1231.729999844" watchObservedRunningTime="2025-12-05 20:32:12.926273123 +0000 UTC m=+1231.737489232" Dec 05 20:32:12 crc kubenswrapper[4904]: I1205 20:32:12.969968 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-sync-dkcsq" podStartSLOduration=2.139883862 podStartE2EDuration="9.969949795s" podCreationTimestamp="2025-12-05 20:32:03 +0000 UTC" firstStartedPulling="2025-12-05 20:32:04.621349533 +0000 UTC m=+1223.432565642" lastFinishedPulling="2025-12-05 20:32:12.451415456 +0000 UTC m=+1231.262631575" observedRunningTime="2025-12-05 20:32:12.966928161 +0000 UTC m=+1231.778144280" watchObservedRunningTime="2025-12-05 20:32:12.969949795 +0000 UTC m=+1231.781165914" Dec 05 20:32:16 crc kubenswrapper[4904]: I1205 20:32:14.214890 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-9154-account-create-update-cmzk9" Dec 05 20:32:16 crc kubenswrapper[4904]: I1205 20:32:14.314797 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fcc402d2-1cfe-4db2-a189-3f376e530162-operator-scripts\") pod \"fcc402d2-1cfe-4db2-a189-3f376e530162\" (UID: \"fcc402d2-1cfe-4db2-a189-3f376e530162\") " Dec 05 20:32:16 crc kubenswrapper[4904]: I1205 20:32:14.314871 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pfsl\" (UniqueName: \"kubernetes.io/projected/fcc402d2-1cfe-4db2-a189-3f376e530162-kube-api-access-8pfsl\") pod \"fcc402d2-1cfe-4db2-a189-3f376e530162\" (UID: \"fcc402d2-1cfe-4db2-a189-3f376e530162\") " Dec 05 20:32:16 crc kubenswrapper[4904]: I1205 20:32:14.315261 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcc402d2-1cfe-4db2-a189-3f376e530162-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fcc402d2-1cfe-4db2-a189-3f376e530162" (UID: "fcc402d2-1cfe-4db2-a189-3f376e530162"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:32:16 crc kubenswrapper[4904]: I1205 20:32:14.320188 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcc402d2-1cfe-4db2-a189-3f376e530162-kube-api-access-8pfsl" (OuterVolumeSpecName: "kube-api-access-8pfsl") pod "fcc402d2-1cfe-4db2-a189-3f376e530162" (UID: "fcc402d2-1cfe-4db2-a189-3f376e530162"). InnerVolumeSpecName "kube-api-access-8pfsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:32:16 crc kubenswrapper[4904]: I1205 20:32:14.375009 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dcc2-account-create-update-6b8b8" Dec 05 20:32:16 crc kubenswrapper[4904]: I1205 20:32:14.380238 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-g6b9b" Dec 05 20:32:16 crc kubenswrapper[4904]: I1205 20:32:14.415858 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f23caf25-58e4-47ac-aa1f-a49e5df687c2-operator-scripts\") pod \"f23caf25-58e4-47ac-aa1f-a49e5df687c2\" (UID: \"f23caf25-58e4-47ac-aa1f-a49e5df687c2\") " Dec 05 20:32:16 crc kubenswrapper[4904]: I1205 20:32:14.415983 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d40e91b3-9613-4d07-9830-5ec0279bfae3-operator-scripts\") pod \"d40e91b3-9613-4d07-9830-5ec0279bfae3\" (UID: \"d40e91b3-9613-4d07-9830-5ec0279bfae3\") " Dec 05 20:32:16 crc kubenswrapper[4904]: I1205 20:32:14.416071 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnpjv\" (UniqueName: \"kubernetes.io/projected/f23caf25-58e4-47ac-aa1f-a49e5df687c2-kube-api-access-lnpjv\") pod \"f23caf25-58e4-47ac-aa1f-a49e5df687c2\" (UID: \"f23caf25-58e4-47ac-aa1f-a49e5df687c2\") " Dec 05 20:32:16 crc kubenswrapper[4904]: I1205 20:32:14.416142 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfffc\" (UniqueName: \"kubernetes.io/projected/d40e91b3-9613-4d07-9830-5ec0279bfae3-kube-api-access-jfffc\") pod \"d40e91b3-9613-4d07-9830-5ec0279bfae3\" (UID: \"d40e91b3-9613-4d07-9830-5ec0279bfae3\") " Dec 05 20:32:16 crc kubenswrapper[4904]: I1205 20:32:14.416559 4904 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fcc402d2-1cfe-4db2-a189-3f376e530162-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:16 crc kubenswrapper[4904]: I1205 20:32:14.416578 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pfsl\" (UniqueName: \"kubernetes.io/projected/fcc402d2-1cfe-4db2-a189-3f376e530162-kube-api-access-8pfsl\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:16 crc kubenswrapper[4904]: I1205 20:32:14.416580 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d40e91b3-9613-4d07-9830-5ec0279bfae3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d40e91b3-9613-4d07-9830-5ec0279bfae3" (UID: "d40e91b3-9613-4d07-9830-5ec0279bfae3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:32:16 crc kubenswrapper[4904]: I1205 20:32:14.416689 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f23caf25-58e4-47ac-aa1f-a49e5df687c2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f23caf25-58e4-47ac-aa1f-a49e5df687c2" (UID: "f23caf25-58e4-47ac-aa1f-a49e5df687c2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:32:16 crc kubenswrapper[4904]: I1205 20:32:14.419285 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f23caf25-58e4-47ac-aa1f-a49e5df687c2-kube-api-access-lnpjv" (OuterVolumeSpecName: "kube-api-access-lnpjv") pod "f23caf25-58e4-47ac-aa1f-a49e5df687c2" (UID: "f23caf25-58e4-47ac-aa1f-a49e5df687c2"). InnerVolumeSpecName "kube-api-access-lnpjv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:32:16 crc kubenswrapper[4904]: I1205 20:32:14.419945 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d40e91b3-9613-4d07-9830-5ec0279bfae3-kube-api-access-jfffc" (OuterVolumeSpecName: "kube-api-access-jfffc") pod "d40e91b3-9613-4d07-9830-5ec0279bfae3" (UID: "d40e91b3-9613-4d07-9830-5ec0279bfae3"). InnerVolumeSpecName "kube-api-access-jfffc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:32:16 crc kubenswrapper[4904]: I1205 20:32:14.517805 4904 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d40e91b3-9613-4d07-9830-5ec0279bfae3-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:16 crc kubenswrapper[4904]: I1205 20:32:14.517833 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnpjv\" (UniqueName: \"kubernetes.io/projected/f23caf25-58e4-47ac-aa1f-a49e5df687c2-kube-api-access-lnpjv\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:16 crc kubenswrapper[4904]: I1205 20:32:14.517844 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfffc\" (UniqueName: \"kubernetes.io/projected/d40e91b3-9613-4d07-9830-5ec0279bfae3-kube-api-access-jfffc\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:16 crc kubenswrapper[4904]: I1205 20:32:14.517852 4904 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f23caf25-58e4-47ac-aa1f-a49e5df687c2-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:16 crc kubenswrapper[4904]: I1205 20:32:14.917524 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-g6b9b" event={"ID":"f23caf25-58e4-47ac-aa1f-a49e5df687c2","Type":"ContainerDied","Data":"ce74bdcc6559286a7cdb3e8f650fb8354ded0a97b3ea5366016f497aa9004b8f"} Dec 05 20:32:16 crc kubenswrapper[4904]: I1205 20:32:14.917552 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-g6b9b" Dec 05 20:32:16 crc kubenswrapper[4904]: I1205 20:32:14.917585 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce74bdcc6559286a7cdb3e8f650fb8354ded0a97b3ea5366016f497aa9004b8f" Dec 05 20:32:16 crc kubenswrapper[4904]: I1205 20:32:14.919934 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9154-account-create-update-cmzk9" event={"ID":"fcc402d2-1cfe-4db2-a189-3f376e530162","Type":"ContainerDied","Data":"e77f578396babafd445aeb76e7f8f8472567c9c6f16f4cafaf93c94c5adf221e"} Dec 05 20:32:16 crc kubenswrapper[4904]: I1205 20:32:14.919969 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e77f578396babafd445aeb76e7f8f8472567c9c6f16f4cafaf93c94c5adf221e" Dec 05 20:32:16 crc kubenswrapper[4904]: I1205 20:32:14.919985 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-9154-account-create-update-cmzk9" Dec 05 20:32:16 crc kubenswrapper[4904]: I1205 20:32:14.922552 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dcc2-account-create-update-6b8b8" event={"ID":"d40e91b3-9613-4d07-9830-5ec0279bfae3","Type":"ContainerDied","Data":"ed6fbe0ae6c7a79e97f75e0ea1085eea2e57d5ceb8309168892f9a161796f3d9"} Dec 05 20:32:16 crc kubenswrapper[4904]: I1205 20:32:14.922578 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed6fbe0ae6c7a79e97f75e0ea1085eea2e57d5ceb8309168892f9a161796f3d9" Dec 05 20:32:16 crc kubenswrapper[4904]: I1205 20:32:14.922649 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dcc2-account-create-update-6b8b8" Dec 05 20:32:17 crc kubenswrapper[4904]: I1205 20:32:17.995223 4904 generic.go:334] "Generic (PLEG): container finished" podID="73c8de0c-093e-49ca-b80a-cb990b546a50" containerID="25a406ebf7dbf4413fbcbc4dbbf6abc7b563f1092458be1a59a7f5bafa31ac27" exitCode=0 Dec 05 20:32:17 crc kubenswrapper[4904]: I1205 20:32:17.995331 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-dkcsq" event={"ID":"73c8de0c-093e-49ca-b80a-cb990b546a50","Type":"ContainerDied","Data":"25a406ebf7dbf4413fbcbc4dbbf6abc7b563f1092458be1a59a7f5bafa31ac27"} Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.010159 4904 generic.go:334] "Generic (PLEG): container finished" podID="e323a682-f130-41c5-b97b-b7cd6ab4aecf" containerID="3d5bca8cafc334fa0ef45643833874577b34af2e5400587d21e0a38e326bc937" exitCode=0 Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.010354 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-6474s" event={"ID":"e323a682-f130-41c5-b97b-b7cd6ab4aecf","Type":"ContainerDied","Data":"3d5bca8cafc334fa0ef45643833874577b34af2e5400587d21e0a38e326bc937"} Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.093623 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-gtfpr"] Dec 05 20:32:19 crc kubenswrapper[4904]: E1205 20:32:19.094241 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f23caf25-58e4-47ac-aa1f-a49e5df687c2" containerName="mariadb-database-create" Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.094257 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="f23caf25-58e4-47ac-aa1f-a49e5df687c2" containerName="mariadb-database-create" Dec 05 20:32:19 crc kubenswrapper[4904]: E1205 20:32:19.094275 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c42e4b0-e039-4945-8c9c-fe12766434bd" containerName="mariadb-database-create" Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.094284 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c42e4b0-e039-4945-8c9c-fe12766434bd" containerName="mariadb-database-create" Dec 05 20:32:19 crc kubenswrapper[4904]: E1205 20:32:19.094297 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d40e91b3-9613-4d07-9830-5ec0279bfae3" containerName="mariadb-account-create-update" Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.094304 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="d40e91b3-9613-4d07-9830-5ec0279bfae3" containerName="mariadb-account-create-update" Dec 05 20:32:19 crc kubenswrapper[4904]: E1205 20:32:19.094323 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0451215c-c2fd-4113-b130-948cde7a8537" 
containerName="mariadb-account-create-update" Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.094330 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="0451215c-c2fd-4113-b130-948cde7a8537" containerName="mariadb-account-create-update" Dec 05 20:32:19 crc kubenswrapper[4904]: E1205 20:32:19.094355 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6e3a4d4-d72b-49db-92c9-214e39f84632" containerName="mariadb-account-create-update" Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.094361 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6e3a4d4-d72b-49db-92c9-214e39f84632" containerName="mariadb-account-create-update" Dec 05 20:32:19 crc kubenswrapper[4904]: E1205 20:32:19.094401 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78120578-1e01-44dd-b0ed-908e6d0103df" containerName="mariadb-database-create" Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.094407 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="78120578-1e01-44dd-b0ed-908e6d0103df" containerName="mariadb-database-create" Dec 05 20:32:19 crc kubenswrapper[4904]: E1205 20:32:19.094419 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14b02093-4f8b-4c58-be23-1bcda58307c2" containerName="mariadb-database-create" Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.094424 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="14b02093-4f8b-4c58-be23-1bcda58307c2" containerName="mariadb-database-create" Dec 05 20:32:19 crc kubenswrapper[4904]: E1205 20:32:19.094434 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcc402d2-1cfe-4db2-a189-3f376e530162" containerName="mariadb-account-create-update" Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.094440 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcc402d2-1cfe-4db2-a189-3f376e530162" containerName="mariadb-account-create-update" Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.094590 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="f23caf25-58e4-47ac-aa1f-a49e5df687c2" containerName="mariadb-database-create" Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.094603 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="0451215c-c2fd-4113-b130-948cde7a8537" containerName="mariadb-account-create-update" Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.094611 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6e3a4d4-d72b-49db-92c9-214e39f84632" containerName="mariadb-account-create-update" Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.094619 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="d40e91b3-9613-4d07-9830-5ec0279bfae3" containerName="mariadb-account-create-update" Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.094629 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcc402d2-1cfe-4db2-a189-3f376e530162" containerName="mariadb-account-create-update" Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.094639 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="14b02093-4f8b-4c58-be23-1bcda58307c2" containerName="mariadb-database-create" Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.094646 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="78120578-1e01-44dd-b0ed-908e6d0103df" containerName="mariadb-database-create" Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.094654 4904 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="5c42e4b0-e039-4945-8c9c-fe12766434bd" containerName="mariadb-database-create" Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.095351 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-gtfpr" Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.101565 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.101990 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-rwqht" Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.104459 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-gtfpr"] Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.249798 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b-config-data\") pod \"glance-db-sync-gtfpr\" (UID: \"5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b\") " pod="openstack/glance-db-sync-gtfpr" Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.250027 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdvjm\" (UniqueName: \"kubernetes.io/projected/5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b-kube-api-access-hdvjm\") pod \"glance-db-sync-gtfpr\" (UID: \"5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b\") " pod="openstack/glance-db-sync-gtfpr" Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.250146 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b-db-sync-config-data\") pod \"glance-db-sync-gtfpr\" (UID: \"5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b\") " pod="openstack/glance-db-sync-gtfpr" Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.250169 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b-combined-ca-bundle\") pod \"glance-db-sync-gtfpr\" (UID: \"5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b\") " pod="openstack/glance-db-sync-gtfpr" Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.351183 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdvjm\" (UniqueName: \"kubernetes.io/projected/5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b-kube-api-access-hdvjm\") pod \"glance-db-sync-gtfpr\" (UID: \"5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b\") " pod="openstack/glance-db-sync-gtfpr" Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.351262 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b-db-sync-config-data\") pod \"glance-db-sync-gtfpr\" (UID: \"5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b\") " pod="openstack/glance-db-sync-gtfpr" Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.351290 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b-combined-ca-bundle\") pod \"glance-db-sync-gtfpr\" (UID: \"5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b\") " pod="openstack/glance-db-sync-gtfpr" Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.351341 4904 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b-config-data\") pod \"glance-db-sync-gtfpr\" (UID: \"5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b\") " pod="openstack/glance-db-sync-gtfpr" Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.357863 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b-combined-ca-bundle\") pod \"glance-db-sync-gtfpr\" (UID: \"5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b\") " pod="openstack/glance-db-sync-gtfpr" Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.357872 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b-db-sync-config-data\") pod \"glance-db-sync-gtfpr\" (UID: \"5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b\") " pod="openstack/glance-db-sync-gtfpr" Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.358223 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b-config-data\") pod \"glance-db-sync-gtfpr\" (UID: \"5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b\") " pod="openstack/glance-db-sync-gtfpr" Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.370707 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdvjm\" (UniqueName: \"kubernetes.io/projected/5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b-kube-api-access-hdvjm\") pod \"glance-db-sync-gtfpr\" (UID: \"5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b\") " pod="openstack/glance-db-sync-gtfpr" Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.438799 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-dkcsq" Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.455833 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-gtfpr" Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.556622 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/73c8de0c-093e-49ca-b80a-cb990b546a50-db-sync-config-data\") pod \"73c8de0c-093e-49ca-b80a-cb990b546a50\" (UID: \"73c8de0c-093e-49ca-b80a-cb990b546a50\") " Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.556748 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73c8de0c-093e-49ca-b80a-cb990b546a50-config-data\") pod \"73c8de0c-093e-49ca-b80a-cb990b546a50\" (UID: \"73c8de0c-093e-49ca-b80a-cb990b546a50\") " Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.556823 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwt5l\" (UniqueName: \"kubernetes.io/projected/73c8de0c-093e-49ca-b80a-cb990b546a50-kube-api-access-zwt5l\") pod \"73c8de0c-093e-49ca-b80a-cb990b546a50\" (UID: \"73c8de0c-093e-49ca-b80a-cb990b546a50\") " Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.556881 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c8de0c-093e-49ca-b80a-cb990b546a50-combined-ca-bundle\") pod \"73c8de0c-093e-49ca-b80a-cb990b546a50\" (UID: \"73c8de0c-093e-49ca-b80a-cb990b546a50\") " Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.565696 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73c8de0c-093e-49ca-b80a-cb990b546a50-kube-api-access-zwt5l" (OuterVolumeSpecName: "kube-api-access-zwt5l") pod "73c8de0c-093e-49ca-b80a-cb990b546a50" (UID: "73c8de0c-093e-49ca-b80a-cb990b546a50"). InnerVolumeSpecName "kube-api-access-zwt5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.565732 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73c8de0c-093e-49ca-b80a-cb990b546a50-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "73c8de0c-093e-49ca-b80a-cb990b546a50" (UID: "73c8de0c-093e-49ca-b80a-cb990b546a50"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.592617 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73c8de0c-093e-49ca-b80a-cb990b546a50-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73c8de0c-093e-49ca-b80a-cb990b546a50" (UID: "73c8de0c-093e-49ca-b80a-cb990b546a50"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.634922 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73c8de0c-093e-49ca-b80a-cb990b546a50-config-data" (OuterVolumeSpecName: "config-data") pod "73c8de0c-093e-49ca-b80a-cb990b546a50" (UID: "73c8de0c-093e-49ca-b80a-cb990b546a50"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.659186 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73c8de0c-093e-49ca-b80a-cb990b546a50-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.659229 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwt5l\" (UniqueName: \"kubernetes.io/projected/73c8de0c-093e-49ca-b80a-cb990b546a50-kube-api-access-zwt5l\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.659243 4904 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c8de0c-093e-49ca-b80a-cb990b546a50-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:19 crc kubenswrapper[4904]: I1205 20:32:19.659254 4904 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/73c8de0c-093e-49ca-b80a-cb990b546a50-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:20 crc kubenswrapper[4904]: I1205 20:32:20.018094 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-dkcsq" event={"ID":"73c8de0c-093e-49ca-b80a-cb990b546a50","Type":"ContainerDied","Data":"3ae8a4de4e1bafde3b3ad5a614e35cab9537a11d3c33149b7a69ff675e88e1bb"} Dec 05 20:32:20 crc kubenswrapper[4904]: I1205 20:32:20.018446 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ae8a4de4e1bafde3b3ad5a614e35cab9537a11d3c33149b7a69ff675e88e1bb" Dec 05 20:32:20 crc kubenswrapper[4904]: I1205 20:32:20.018110 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-dkcsq" Dec 05 20:32:20 crc kubenswrapper[4904]: I1205 20:32:20.219599 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-gtfpr"] Dec 05 20:32:20 crc kubenswrapper[4904]: I1205 20:32:20.291038 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-6474s" Dec 05 20:32:20 crc kubenswrapper[4904]: I1205 20:32:20.473752 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwc8p\" (UniqueName: \"kubernetes.io/projected/e323a682-f130-41c5-b97b-b7cd6ab4aecf-kube-api-access-lwc8p\") pod \"e323a682-f130-41c5-b97b-b7cd6ab4aecf\" (UID: \"e323a682-f130-41c5-b97b-b7cd6ab4aecf\") " Dec 05 20:32:20 crc kubenswrapper[4904]: I1205 20:32:20.473937 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e323a682-f130-41c5-b97b-b7cd6ab4aecf-config-data\") pod \"e323a682-f130-41c5-b97b-b7cd6ab4aecf\" (UID: \"e323a682-f130-41c5-b97b-b7cd6ab4aecf\") " Dec 05 20:32:20 crc kubenswrapper[4904]: I1205 20:32:20.473983 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e323a682-f130-41c5-b97b-b7cd6ab4aecf-combined-ca-bundle\") pod \"e323a682-f130-41c5-b97b-b7cd6ab4aecf\" (UID: \"e323a682-f130-41c5-b97b-b7cd6ab4aecf\") " Dec 05 20:32:20 crc kubenswrapper[4904]: I1205 20:32:20.485229 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e323a682-f130-41c5-b97b-b7cd6ab4aecf-kube-api-access-lwc8p" (OuterVolumeSpecName: "kube-api-access-lwc8p") pod "e323a682-f130-41c5-b97b-b7cd6ab4aecf" (UID: "e323a682-f130-41c5-b97b-b7cd6ab4aecf"). InnerVolumeSpecName "kube-api-access-lwc8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:32:20 crc kubenswrapper[4904]: I1205 20:32:20.512810 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e323a682-f130-41c5-b97b-b7cd6ab4aecf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e323a682-f130-41c5-b97b-b7cd6ab4aecf" (UID: "e323a682-f130-41c5-b97b-b7cd6ab4aecf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:32:20 crc kubenswrapper[4904]: I1205 20:32:20.529813 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e323a682-f130-41c5-b97b-b7cd6ab4aecf-config-data" (OuterVolumeSpecName: "config-data") pod "e323a682-f130-41c5-b97b-b7cd6ab4aecf" (UID: "e323a682-f130-41c5-b97b-b7cd6ab4aecf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:32:20 crc kubenswrapper[4904]: I1205 20:32:20.575773 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e323a682-f130-41c5-b97b-b7cd6ab4aecf-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:20 crc kubenswrapper[4904]: I1205 20:32:20.575960 4904 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e323a682-f130-41c5-b97b-b7cd6ab4aecf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:20 crc kubenswrapper[4904]: I1205 20:32:20.576014 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwc8p\" (UniqueName: \"kubernetes.io/projected/e323a682-f130-41c5-b97b-b7cd6ab4aecf-kube-api-access-lwc8p\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.034955 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-gtfpr" event={"ID":"5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b","Type":"ContainerStarted","Data":"d9300f073ef334c06995edf5d98d302ca5d641357ac18a7c9a0fcf737d59bbb1"} Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.037164 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-6474s" event={"ID":"e323a682-f130-41c5-b97b-b7cd6ab4aecf","Type":"ContainerDied","Data":"1cc188c9ab604db095fcf8b868c5a2a56bbfb2edbc6b69a50141208b82bdf72b"} Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.037190 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cc188c9ab604db095fcf8b868c5a2a56bbfb2edbc6b69a50141208b82bdf72b" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.037238 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-6474s" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.292811 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5cc95f9cc5-pvn9n"] Dec 05 20:32:21 crc kubenswrapper[4904]: E1205 20:32:21.293257 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e323a682-f130-41c5-b97b-b7cd6ab4aecf" containerName="keystone-db-sync" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.293270 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="e323a682-f130-41c5-b97b-b7cd6ab4aecf" containerName="keystone-db-sync" Dec 05 20:32:21 crc kubenswrapper[4904]: E1205 20:32:21.293302 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73c8de0c-093e-49ca-b80a-cb990b546a50" containerName="watcher-db-sync" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.293308 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="73c8de0c-093e-49ca-b80a-cb990b546a50" containerName="watcher-db-sync" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.293478 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="73c8de0c-093e-49ca-b80a-cb990b546a50" containerName="watcher-db-sync" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.293492 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="e323a682-f130-41c5-b97b-b7cd6ab4aecf" containerName="keystone-db-sync" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.294504 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cc95f9cc5-pvn9n" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.327514 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cc95f9cc5-pvn9n"] Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.392565 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/927c6571-a9d1-45af-b5de-44dea68ee507-dns-svc\") pod \"dnsmasq-dns-5cc95f9cc5-pvn9n\" (UID: \"927c6571-a9d1-45af-b5de-44dea68ee507\") " pod="openstack/dnsmasq-dns-5cc95f9cc5-pvn9n" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.392647 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/927c6571-a9d1-45af-b5de-44dea68ee507-ovsdbserver-nb\") pod \"dnsmasq-dns-5cc95f9cc5-pvn9n\" (UID: \"927c6571-a9d1-45af-b5de-44dea68ee507\") " pod="openstack/dnsmasq-dns-5cc95f9cc5-pvn9n" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.392679 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q2wm\" (UniqueName: \"kubernetes.io/projected/927c6571-a9d1-45af-b5de-44dea68ee507-kube-api-access-5q2wm\") pod \"dnsmasq-dns-5cc95f9cc5-pvn9n\" (UID: \"927c6571-a9d1-45af-b5de-44dea68ee507\") " pod="openstack/dnsmasq-dns-5cc95f9cc5-pvn9n" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.392708 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/927c6571-a9d1-45af-b5de-44dea68ee507-ovsdbserver-sb\") pod \"dnsmasq-dns-5cc95f9cc5-pvn9n\" (UID: \"927c6571-a9d1-45af-b5de-44dea68ee507\") " pod="openstack/dnsmasq-dns-5cc95f9cc5-pvn9n" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.392745 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/927c6571-a9d1-45af-b5de-44dea68ee507-dns-swift-storage-0\") pod \"dnsmasq-dns-5cc95f9cc5-pvn9n\" (UID: \"927c6571-a9d1-45af-b5de-44dea68ee507\") " pod="openstack/dnsmasq-dns-5cc95f9cc5-pvn9n" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.392777 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/927c6571-a9d1-45af-b5de-44dea68ee507-config\") pod \"dnsmasq-dns-5cc95f9cc5-pvn9n\" (UID: \"927c6571-a9d1-45af-b5de-44dea68ee507\") " pod="openstack/dnsmasq-dns-5cc95f9cc5-pvn9n" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.399286 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-gjvc9"] Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.400713 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-gjvc9" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.408574 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.408729 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.408787 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.409034 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-ddf7h" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.409167 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.449732 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-gjvc9"] Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.494976 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/927c6571-a9d1-45af-b5de-44dea68ee507-config\") pod \"dnsmasq-dns-5cc95f9cc5-pvn9n\" (UID: \"927c6571-a9d1-45af-b5de-44dea68ee507\") " pod="openstack/dnsmasq-dns-5cc95f9cc5-pvn9n" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.495261 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd-scripts\") pod \"keystone-bootstrap-gjvc9\" (UID: \"f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd\") " pod="openstack/keystone-bootstrap-gjvc9" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.495305 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd-config-data\") pod \"keystone-bootstrap-gjvc9\" (UID: \"f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd\") " pod="openstack/keystone-bootstrap-gjvc9" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.495328 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/927c6571-a9d1-45af-b5de-44dea68ee507-ovsdbserver-nb\") pod \"dnsmasq-dns-5cc95f9cc5-pvn9n\" (UID: \"927c6571-a9d1-45af-b5de-44dea68ee507\") " pod="openstack/dnsmasq-dns-5cc95f9cc5-pvn9n" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.495343 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/927c6571-a9d1-45af-b5de-44dea68ee507-dns-svc\") pod \"dnsmasq-dns-5cc95f9cc5-pvn9n\" (UID: \"927c6571-a9d1-45af-b5de-44dea68ee507\") " pod="openstack/dnsmasq-dns-5cc95f9cc5-pvn9n" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.495361 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd-combined-ca-bundle\") pod \"keystone-bootstrap-gjvc9\" (UID: \"f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd\") " pod="openstack/keystone-bootstrap-gjvc9" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.495375 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5nls\" 
(UniqueName: \"kubernetes.io/projected/f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd-kube-api-access-s5nls\") pod \"keystone-bootstrap-gjvc9\" (UID: \"f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd\") " pod="openstack/keystone-bootstrap-gjvc9" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.495393 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd-credential-keys\") pod \"keystone-bootstrap-gjvc9\" (UID: \"f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd\") " pod="openstack/keystone-bootstrap-gjvc9" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.495414 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5q2wm\" (UniqueName: \"kubernetes.io/projected/927c6571-a9d1-45af-b5de-44dea68ee507-kube-api-access-5q2wm\") pod \"dnsmasq-dns-5cc95f9cc5-pvn9n\" (UID: \"927c6571-a9d1-45af-b5de-44dea68ee507\") " pod="openstack/dnsmasq-dns-5cc95f9cc5-pvn9n" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.495435 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/927c6571-a9d1-45af-b5de-44dea68ee507-ovsdbserver-sb\") pod \"dnsmasq-dns-5cc95f9cc5-pvn9n\" (UID: \"927c6571-a9d1-45af-b5de-44dea68ee507\") " pod="openstack/dnsmasq-dns-5cc95f9cc5-pvn9n" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.495465 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/927c6571-a9d1-45af-b5de-44dea68ee507-dns-swift-storage-0\") pod \"dnsmasq-dns-5cc95f9cc5-pvn9n\" (UID: \"927c6571-a9d1-45af-b5de-44dea68ee507\") " pod="openstack/dnsmasq-dns-5cc95f9cc5-pvn9n" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.495489 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd-fernet-keys\") pod \"keystone-bootstrap-gjvc9\" (UID: \"f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd\") " pod="openstack/keystone-bootstrap-gjvc9" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.496462 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/927c6571-a9d1-45af-b5de-44dea68ee507-config\") pod \"dnsmasq-dns-5cc95f9cc5-pvn9n\" (UID: \"927c6571-a9d1-45af-b5de-44dea68ee507\") " pod="openstack/dnsmasq-dns-5cc95f9cc5-pvn9n" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.497026 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/927c6571-a9d1-45af-b5de-44dea68ee507-ovsdbserver-nb\") pod \"dnsmasq-dns-5cc95f9cc5-pvn9n\" (UID: \"927c6571-a9d1-45af-b5de-44dea68ee507\") " pod="openstack/dnsmasq-dns-5cc95f9cc5-pvn9n" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.497579 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/927c6571-a9d1-45af-b5de-44dea68ee507-dns-svc\") pod \"dnsmasq-dns-5cc95f9cc5-pvn9n\" (UID: \"927c6571-a9d1-45af-b5de-44dea68ee507\") " pod="openstack/dnsmasq-dns-5cc95f9cc5-pvn9n" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.497910 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/927c6571-a9d1-45af-b5de-44dea68ee507-ovsdbserver-sb\") pod \"dnsmasq-dns-5cc95f9cc5-pvn9n\" (UID: \"927c6571-a9d1-45af-b5de-44dea68ee507\") " pod="openstack/dnsmasq-dns-5cc95f9cc5-pvn9n" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.498158 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/927c6571-a9d1-45af-b5de-44dea68ee507-dns-swift-storage-0\") pod \"dnsmasq-dns-5cc95f9cc5-pvn9n\" (UID: \"927c6571-a9d1-45af-b5de-44dea68ee507\") " pod="openstack/dnsmasq-dns-5cc95f9cc5-pvn9n" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.528599 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.529699 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.536392 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-w4g9x" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.536531 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.552716 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.553048 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q2wm\" (UniqueName: \"kubernetes.io/projected/927c6571-a9d1-45af-b5de-44dea68ee507-kube-api-access-5q2wm\") pod \"dnsmasq-dns-5cc95f9cc5-pvn9n\" (UID: \"927c6571-a9d1-45af-b5de-44dea68ee507\") " pod="openstack/dnsmasq-dns-5cc95f9cc5-pvn9n" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.596428 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd-scripts\") pod \"keystone-bootstrap-gjvc9\" (UID: \"f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd\") " pod="openstack/keystone-bootstrap-gjvc9" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.596485 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd-config-data\") pod \"keystone-bootstrap-gjvc9\" (UID: \"f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd\") " pod="openstack/keystone-bootstrap-gjvc9" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.596513 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd-combined-ca-bundle\") pod \"keystone-bootstrap-gjvc9\" (UID: \"f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd\") " pod="openstack/keystone-bootstrap-gjvc9" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.596530 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5nls\" (UniqueName: \"kubernetes.io/projected/f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd-kube-api-access-s5nls\") pod \"keystone-bootstrap-gjvc9\" (UID: \"f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd\") " pod="openstack/keystone-bootstrap-gjvc9" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.596547 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd-credential-keys\") pod \"keystone-bootstrap-gjvc9\" (UID: \"f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd\") " pod="openstack/keystone-bootstrap-gjvc9" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.596603 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd-fernet-keys\") pod \"keystone-bootstrap-gjvc9\" (UID: \"f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd\") " pod="openstack/keystone-bootstrap-gjvc9" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.625955 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd-fernet-keys\") pod \"keystone-bootstrap-gjvc9\" (UID: \"f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd\") " pod="openstack/keystone-bootstrap-gjvc9" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.626559 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd-config-data\") pod \"keystone-bootstrap-gjvc9\" (UID: \"f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd\") " pod="openstack/keystone-bootstrap-gjvc9" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.626689 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd-credential-keys\") pod \"keystone-bootstrap-gjvc9\" (UID: \"f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd\") " pod="openstack/keystone-bootstrap-gjvc9" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.628495 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd-scripts\") pod \"keystone-bootstrap-gjvc9\" (UID: \"f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd\") " pod="openstack/keystone-bootstrap-gjvc9" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.632475 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cc95f9cc5-pvn9n" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.643931 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd-combined-ca-bundle\") pod \"keystone-bootstrap-gjvc9\" (UID: \"f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd\") " pod="openstack/keystone-bootstrap-gjvc9" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.648734 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5nls\" (UniqueName: \"kubernetes.io/projected/f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd-kube-api-access-s5nls\") pod \"keystone-bootstrap-gjvc9\" (UID: \"f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd\") " pod="openstack/keystone-bootstrap-gjvc9" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.698887 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8acb38c5-f0b5-46e9-8cb7-085d0b12b16d-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"8acb38c5-f0b5-46e9-8cb7-085d0b12b16d\") " pod="openstack/watcher-applier-0" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.698931 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbb45\" (UniqueName: \"kubernetes.io/projected/8acb38c5-f0b5-46e9-8cb7-085d0b12b16d-kube-api-access-mbb45\") pod \"watcher-applier-0\" (UID: \"8acb38c5-f0b5-46e9-8cb7-085d0b12b16d\") " pod="openstack/watcher-applier-0" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.698959 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8acb38c5-f0b5-46e9-8cb7-085d0b12b16d-logs\") pod \"watcher-applier-0\" (UID: \"8acb38c5-f0b5-46e9-8cb7-085d0b12b16d\") " pod="openstack/watcher-applier-0" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.699007 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8acb38c5-f0b5-46e9-8cb7-085d0b12b16d-config-data\") pod \"watcher-applier-0\" (UID: \"8acb38c5-f0b5-46e9-8cb7-085d0b12b16d\") " pod="openstack/watcher-applier-0" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.741499 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-gjvc9" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.744986 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.745927 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.746004 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.760302 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-547f6dfd9c-wt8l4"] Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.761679 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-547f6dfd9c-wt8l4" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.767375 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.777327 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-pjlzn"] Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.778446 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-pjlzn" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.781079 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-pjlzn"] Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.786027 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-x6lfb" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.786077 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.786130 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.786082 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.793477 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.793910 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.794386 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-bpllb" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.800019 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8698908a-a9e4-46c2-9855-2a1db92b1d75-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"8698908a-a9e4-46c2-9855-2a1db92b1d75\") " pod="openstack/watcher-decision-engine-0" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.800094 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8698908a-a9e4-46c2-9855-2a1db92b1d75-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"8698908a-a9e4-46c2-9855-2a1db92b1d75\") " pod="openstack/watcher-decision-engine-0" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.800123 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8acb38c5-f0b5-46e9-8cb7-085d0b12b16d-config-data\") pod \"watcher-applier-0\" (UID: \"8acb38c5-f0b5-46e9-8cb7-085d0b12b16d\") " pod="openstack/watcher-applier-0" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.800188 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8698908a-a9e4-46c2-9855-2a1db92b1d75-logs\") pod \"watcher-decision-engine-0\" (UID: \"8698908a-a9e4-46c2-9855-2a1db92b1d75\") " pod="openstack/watcher-decision-engine-0" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.800213 4904 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-528kg\" (UniqueName: \"kubernetes.io/projected/8698908a-a9e4-46c2-9855-2a1db92b1d75-kube-api-access-528kg\") pod \"watcher-decision-engine-0\" (UID: \"8698908a-a9e4-46c2-9855-2a1db92b1d75\") " pod="openstack/watcher-decision-engine-0" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.800275 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8698908a-a9e4-46c2-9855-2a1db92b1d75-config-data\") pod \"watcher-decision-engine-0\" (UID: \"8698908a-a9e4-46c2-9855-2a1db92b1d75\") " pod="openstack/watcher-decision-engine-0" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.800303 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8acb38c5-f0b5-46e9-8cb7-085d0b12b16d-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"8acb38c5-f0b5-46e9-8cb7-085d0b12b16d\") " pod="openstack/watcher-applier-0" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.800330 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbb45\" (UniqueName: \"kubernetes.io/projected/8acb38c5-f0b5-46e9-8cb7-085d0b12b16d-kube-api-access-mbb45\") pod \"watcher-applier-0\" (UID: \"8acb38c5-f0b5-46e9-8cb7-085d0b12b16d\") " pod="openstack/watcher-applier-0" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.800379 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8acb38c5-f0b5-46e9-8cb7-085d0b12b16d-logs\") pod \"watcher-applier-0\" (UID: \"8acb38c5-f0b5-46e9-8cb7-085d0b12b16d\") " pod="openstack/watcher-applier-0" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.805394 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8acb38c5-f0b5-46e9-8cb7-085d0b12b16d-config-data\") pod \"watcher-applier-0\" (UID: \"8acb38c5-f0b5-46e9-8cb7-085d0b12b16d\") " pod="openstack/watcher-applier-0" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.805804 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-547f6dfd9c-wt8l4"] Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.807918 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8acb38c5-f0b5-46e9-8cb7-085d0b12b16d-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"8acb38c5-f0b5-46e9-8cb7-085d0b12b16d\") " pod="openstack/watcher-applier-0" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.808749 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8acb38c5-f0b5-46e9-8cb7-085d0b12b16d-logs\") pod \"watcher-applier-0\" (UID: \"8acb38c5-f0b5-46e9-8cb7-085d0b12b16d\") " pod="openstack/watcher-applier-0" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.842573 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-k6l4d"] Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.853751 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-k6l4d" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.882484 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-m695p" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.882917 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.890129 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbb45\" (UniqueName: \"kubernetes.io/projected/8acb38c5-f0b5-46e9-8cb7-085d0b12b16d-kube-api-access-mbb45\") pod \"watcher-applier-0\" (UID: \"8acb38c5-f0b5-46e9-8cb7-085d0b12b16d\") " pod="openstack/watcher-applier-0" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.893839 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-wh6jz"] Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.895140 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-wh6jz" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.901315 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b72d1aa8-3933-4153-89ac-a4ffe0667268-combined-ca-bundle\") pod \"barbican-db-sync-k6l4d\" (UID: \"b72d1aa8-3933-4153-89ac-a4ffe0667268\") " pod="openstack/barbican-db-sync-k6l4d" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.901365 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8698908a-a9e4-46c2-9855-2a1db92b1d75-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"8698908a-a9e4-46c2-9855-2a1db92b1d75\") " pod="openstack/watcher-decision-engine-0" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.901387 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vgsp\" (UniqueName: \"kubernetes.io/projected/b72d1aa8-3933-4153-89ac-a4ffe0667268-kube-api-access-6vgsp\") pod \"barbican-db-sync-k6l4d\" (UID: \"b72d1aa8-3933-4153-89ac-a4ffe0667268\") " pod="openstack/barbican-db-sync-k6l4d" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.901415 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8698908a-a9e4-46c2-9855-2a1db92b1d75-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"8698908a-a9e4-46c2-9855-2a1db92b1d75\") " pod="openstack/watcher-decision-engine-0" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.901450 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/76209519-a745-4eab-9d5d-c330ffb29191-config\") pod \"neutron-db-sync-pjlzn\" (UID: \"76209519-a745-4eab-9d5d-c330ffb29191\") " pod="openstack/neutron-db-sync-pjlzn" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.901467 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxvs5\" (UniqueName: \"kubernetes.io/projected/561a2fa1-4349-438a-9bfd-8c8c6e3e3a73-kube-api-access-bxvs5\") pod \"horizon-547f6dfd9c-wt8l4\" (UID: \"561a2fa1-4349-438a-9bfd-8c8c6e3e3a73\") " pod="openstack/horizon-547f6dfd9c-wt8l4" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.901498 
4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/561a2fa1-4349-438a-9bfd-8c8c6e3e3a73-scripts\") pod \"horizon-547f6dfd9c-wt8l4\" (UID: \"561a2fa1-4349-438a-9bfd-8c8c6e3e3a73\") " pod="openstack/horizon-547f6dfd9c-wt8l4" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.901525 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8698908a-a9e4-46c2-9855-2a1db92b1d75-logs\") pod \"watcher-decision-engine-0\" (UID: \"8698908a-a9e4-46c2-9855-2a1db92b1d75\") " pod="openstack/watcher-decision-engine-0" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.901544 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-528kg\" (UniqueName: \"kubernetes.io/projected/8698908a-a9e4-46c2-9855-2a1db92b1d75-kube-api-access-528kg\") pod \"watcher-decision-engine-0\" (UID: \"8698908a-a9e4-46c2-9855-2a1db92b1d75\") " pod="openstack/watcher-decision-engine-0" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.901572 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b72d1aa8-3933-4153-89ac-a4ffe0667268-db-sync-config-data\") pod \"barbican-db-sync-k6l4d\" (UID: \"b72d1aa8-3933-4153-89ac-a4ffe0667268\") " pod="openstack/barbican-db-sync-k6l4d" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.901610 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8698908a-a9e4-46c2-9855-2a1db92b1d75-config-data\") pod \"watcher-decision-engine-0\" (UID: \"8698908a-a9e4-46c2-9855-2a1db92b1d75\") " pod="openstack/watcher-decision-engine-0" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.901632 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb2fs\" (UniqueName: \"kubernetes.io/projected/76209519-a745-4eab-9d5d-c330ffb29191-kube-api-access-zb2fs\") pod \"neutron-db-sync-pjlzn\" (UID: \"76209519-a745-4eab-9d5d-c330ffb29191\") " pod="openstack/neutron-db-sync-pjlzn" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.901656 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/561a2fa1-4349-438a-9bfd-8c8c6e3e3a73-config-data\") pod \"horizon-547f6dfd9c-wt8l4\" (UID: \"561a2fa1-4349-438a-9bfd-8c8c6e3e3a73\") " pod="openstack/horizon-547f6dfd9c-wt8l4" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.901713 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76209519-a745-4eab-9d5d-c330ffb29191-combined-ca-bundle\") pod \"neutron-db-sync-pjlzn\" (UID: \"76209519-a745-4eab-9d5d-c330ffb29191\") " pod="openstack/neutron-db-sync-pjlzn" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.901735 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/561a2fa1-4349-438a-9bfd-8c8c6e3e3a73-logs\") pod \"horizon-547f6dfd9c-wt8l4\" (UID: \"561a2fa1-4349-438a-9bfd-8c8c6e3e3a73\") " pod="openstack/horizon-547f6dfd9c-wt8l4" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.901773 4904 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/561a2fa1-4349-438a-9bfd-8c8c6e3e3a73-horizon-secret-key\") pod \"horizon-547f6dfd9c-wt8l4\" (UID: \"561a2fa1-4349-438a-9bfd-8c8c6e3e3a73\") " pod="openstack/horizon-547f6dfd9c-wt8l4" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.901949 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-k6l4d"] Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.902673 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8698908a-a9e4-46c2-9855-2a1db92b1d75-logs\") pod \"watcher-decision-engine-0\" (UID: \"8698908a-a9e4-46c2-9855-2a1db92b1d75\") " pod="openstack/watcher-decision-engine-0" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.906724 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.906986 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.907203 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-wx7jh" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.907778 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8698908a-a9e4-46c2-9855-2a1db92b1d75-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"8698908a-a9e4-46c2-9855-2a1db92b1d75\") " pod="openstack/watcher-decision-engine-0" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.908876 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8698908a-a9e4-46c2-9855-2a1db92b1d75-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"8698908a-a9e4-46c2-9855-2a1db92b1d75\") " pod="openstack/watcher-decision-engine-0" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.918223 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8698908a-a9e4-46c2-9855-2a1db92b1d75-config-data\") pod \"watcher-decision-engine-0\" (UID: \"8698908a-a9e4-46c2-9855-2a1db92b1d75\") " pod="openstack/watcher-decision-engine-0" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.942417 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-528kg\" (UniqueName: \"kubernetes.io/projected/8698908a-a9e4-46c2-9855-2a1db92b1d75-kube-api-access-528kg\") pod \"watcher-decision-engine-0\" (UID: \"8698908a-a9e4-46c2-9855-2a1db92b1d75\") " pod="openstack/watcher-decision-engine-0" Dec 05 20:32:21 crc kubenswrapper[4904]: I1205 20:32:21.978087 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:21.999483 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.006820 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.007257 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.008648 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b72d1aa8-3933-4153-89ac-a4ffe0667268-combined-ca-bundle\") pod \"barbican-db-sync-k6l4d\" (UID: \"b72d1aa8-3933-4153-89ac-a4ffe0667268\") " pod="openstack/barbican-db-sync-k6l4d" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.008750 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f0a41473-f3de-440f-89be-9fddf77f6148-etc-machine-id\") pod \"cinder-db-sync-wh6jz\" (UID: \"f0a41473-f3de-440f-89be-9fddf77f6148\") " pod="openstack/cinder-db-sync-wh6jz" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.008817 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vgsp\" (UniqueName: \"kubernetes.io/projected/b72d1aa8-3933-4153-89ac-a4ffe0667268-kube-api-access-6vgsp\") pod \"barbican-db-sync-k6l4d\" (UID: \"b72d1aa8-3933-4153-89ac-a4ffe0667268\") " pod="openstack/barbican-db-sync-k6l4d" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.008881 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0a41473-f3de-440f-89be-9fddf77f6148-config-data\") pod \"cinder-db-sync-wh6jz\" (UID: \"f0a41473-f3de-440f-89be-9fddf77f6148\") " pod="openstack/cinder-db-sync-wh6jz" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.008996 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxvs5\" (UniqueName: \"kubernetes.io/projected/561a2fa1-4349-438a-9bfd-8c8c6e3e3a73-kube-api-access-bxvs5\") pod \"horizon-547f6dfd9c-wt8l4\" (UID: \"561a2fa1-4349-438a-9bfd-8c8c6e3e3a73\") " pod="openstack/horizon-547f6dfd9c-wt8l4" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.009076 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/76209519-a745-4eab-9d5d-c330ffb29191-config\") pod \"neutron-db-sync-pjlzn\" (UID: \"76209519-a745-4eab-9d5d-c330ffb29191\") " pod="openstack/neutron-db-sync-pjlzn" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.009159 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/561a2fa1-4349-438a-9bfd-8c8c6e3e3a73-scripts\") pod \"horizon-547f6dfd9c-wt8l4\" (UID: \"561a2fa1-4349-438a-9bfd-8c8c6e3e3a73\") " pod="openstack/horizon-547f6dfd9c-wt8l4" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.009244 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b72d1aa8-3933-4153-89ac-a4ffe0667268-db-sync-config-data\") pod \"barbican-db-sync-k6l4d\" (UID: \"b72d1aa8-3933-4153-89ac-a4ffe0667268\") " pod="openstack/barbican-db-sync-k6l4d" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.009313 4904 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0a41473-f3de-440f-89be-9fddf77f6148-scripts\") pod \"cinder-db-sync-wh6jz\" (UID: \"f0a41473-f3de-440f-89be-9fddf77f6148\") " pod="openstack/cinder-db-sync-wh6jz" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.009401 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb2fs\" (UniqueName: \"kubernetes.io/projected/76209519-a745-4eab-9d5d-c330ffb29191-kube-api-access-zb2fs\") pod \"neutron-db-sync-pjlzn\" (UID: \"76209519-a745-4eab-9d5d-c330ffb29191\") " pod="openstack/neutron-db-sync-pjlzn" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.009470 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/561a2fa1-4349-438a-9bfd-8c8c6e3e3a73-config-data\") pod \"horizon-547f6dfd9c-wt8l4\" (UID: \"561a2fa1-4349-438a-9bfd-8c8c6e3e3a73\") " pod="openstack/horizon-547f6dfd9c-wt8l4" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.009542 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76209519-a745-4eab-9d5d-c330ffb29191-combined-ca-bundle\") pod \"neutron-db-sync-pjlzn\" (UID: \"76209519-a745-4eab-9d5d-c330ffb29191\") " pod="openstack/neutron-db-sync-pjlzn" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.009611 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/561a2fa1-4349-438a-9bfd-8c8c6e3e3a73-logs\") pod \"horizon-547f6dfd9c-wt8l4\" (UID: \"561a2fa1-4349-438a-9bfd-8c8c6e3e3a73\") " pod="openstack/horizon-547f6dfd9c-wt8l4" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.009690 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rztw\" (UniqueName: \"kubernetes.io/projected/f0a41473-f3de-440f-89be-9fddf77f6148-kube-api-access-9rztw\") pod \"cinder-db-sync-wh6jz\" (UID: \"f0a41473-f3de-440f-89be-9fddf77f6148\") " pod="openstack/cinder-db-sync-wh6jz" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.009768 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0a41473-f3de-440f-89be-9fddf77f6148-combined-ca-bundle\") pod \"cinder-db-sync-wh6jz\" (UID: \"f0a41473-f3de-440f-89be-9fddf77f6148\") " pod="openstack/cinder-db-sync-wh6jz" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.009836 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f0a41473-f3de-440f-89be-9fddf77f6148-db-sync-config-data\") pod \"cinder-db-sync-wh6jz\" (UID: \"f0a41473-f3de-440f-89be-9fddf77f6148\") " pod="openstack/cinder-db-sync-wh6jz" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.009909 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/561a2fa1-4349-438a-9bfd-8c8c6e3e3a73-horizon-secret-key\") pod \"horizon-547f6dfd9c-wt8l4\" (UID: \"561a2fa1-4349-438a-9bfd-8c8c6e3e3a73\") " pod="openstack/horizon-547f6dfd9c-wt8l4" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.014607 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/561a2fa1-4349-438a-9bfd-8c8c6e3e3a73-horizon-secret-key\") pod \"horizon-547f6dfd9c-wt8l4\" (UID: \"561a2fa1-4349-438a-9bfd-8c8c6e3e3a73\") " pod="openstack/horizon-547f6dfd9c-wt8l4" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.020452 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/561a2fa1-4349-438a-9bfd-8c8c6e3e3a73-scripts\") pod \"horizon-547f6dfd9c-wt8l4\" (UID: \"561a2fa1-4349-438a-9bfd-8c8c6e3e3a73\") " pod="openstack/horizon-547f6dfd9c-wt8l4" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.028493 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b72d1aa8-3933-4153-89ac-a4ffe0667268-db-sync-config-data\") pod \"barbican-db-sync-k6l4d\" (UID: \"b72d1aa8-3933-4153-89ac-a4ffe0667268\") " pod="openstack/barbican-db-sync-k6l4d" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.032261 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/561a2fa1-4349-438a-9bfd-8c8c6e3e3a73-config-data\") pod \"horizon-547f6dfd9c-wt8l4\" (UID: \"561a2fa1-4349-438a-9bfd-8c8c6e3e3a73\") " pod="openstack/horizon-547f6dfd9c-wt8l4" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.037951 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b72d1aa8-3933-4153-89ac-a4ffe0667268-combined-ca-bundle\") pod \"barbican-db-sync-k6l4d\" (UID: \"b72d1aa8-3933-4153-89ac-a4ffe0667268\") " pod="openstack/barbican-db-sync-k6l4d" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.049195 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/561a2fa1-4349-438a-9bfd-8c8c6e3e3a73-logs\") pod \"horizon-547f6dfd9c-wt8l4\" (UID: \"561a2fa1-4349-438a-9bfd-8c8c6e3e3a73\") " pod="openstack/horizon-547f6dfd9c-wt8l4" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.067919 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/76209519-a745-4eab-9d5d-c330ffb29191-config\") pod \"neutron-db-sync-pjlzn\" (UID: \"76209519-a745-4eab-9d5d-c330ffb29191\") " pod="openstack/neutron-db-sync-pjlzn" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.076668 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76209519-a745-4eab-9d5d-c330ffb29191-combined-ca-bundle\") pod \"neutron-db-sync-pjlzn\" (UID: \"76209519-a745-4eab-9d5d-c330ffb29191\") " pod="openstack/neutron-db-sync-pjlzn" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.086974 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.095774 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb2fs\" (UniqueName: \"kubernetes.io/projected/76209519-a745-4eab-9d5d-c330ffb29191-kube-api-access-zb2fs\") pod \"neutron-db-sync-pjlzn\" (UID: \"76209519-a745-4eab-9d5d-c330ffb29191\") " pod="openstack/neutron-db-sync-pjlzn" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.096633 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vgsp\" (UniqueName: \"kubernetes.io/projected/b72d1aa8-3933-4153-89ac-a4ffe0667268-kube-api-access-6vgsp\") pod \"barbican-db-sync-k6l4d\" (UID: \"b72d1aa8-3933-4153-89ac-a4ffe0667268\") " pod="openstack/barbican-db-sync-k6l4d" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.099611 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxvs5\" (UniqueName: \"kubernetes.io/projected/561a2fa1-4349-438a-9bfd-8c8c6e3e3a73-kube-api-access-bxvs5\") pod \"horizon-547f6dfd9c-wt8l4\" (UID: \"561a2fa1-4349-438a-9bfd-8c8c6e3e3a73\") " pod="openstack/horizon-547f6dfd9c-wt8l4" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.107940 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.113141 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.113527 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0a41473-f3de-440f-89be-9fddf77f6148-scripts\") pod \"cinder-db-sync-wh6jz\" (UID: \"f0a41473-f3de-440f-89be-9fddf77f6148\") " pod="openstack/cinder-db-sync-wh6jz" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.113702 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81a8e33f-1179-4192-9d21-b0f520c41656-scripts\") pod \"ceilometer-0\" (UID: \"81a8e33f-1179-4192-9d21-b0f520c41656\") " pod="openstack/ceilometer-0" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.113808 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crhbm\" (UniqueName: \"kubernetes.io/projected/81a8e33f-1179-4192-9d21-b0f520c41656-kube-api-access-crhbm\") pod \"ceilometer-0\" (UID: \"81a8e33f-1179-4192-9d21-b0f520c41656\") " pod="openstack/ceilometer-0" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.114014 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rztw\" (UniqueName: \"kubernetes.io/projected/f0a41473-f3de-440f-89be-9fddf77f6148-kube-api-access-9rztw\") pod \"cinder-db-sync-wh6jz\" (UID: \"f0a41473-f3de-440f-89be-9fddf77f6148\") " pod="openstack/cinder-db-sync-wh6jz" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.114138 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81a8e33f-1179-4192-9d21-b0f520c41656-log-httpd\") pod \"ceilometer-0\" (UID: \"81a8e33f-1179-4192-9d21-b0f520c41656\") " pod="openstack/ceilometer-0" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.114227 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81a8e33f-1179-4192-9d21-b0f520c41656-run-httpd\") pod \"ceilometer-0\" (UID: \"81a8e33f-1179-4192-9d21-b0f520c41656\") " pod="openstack/ceilometer-0" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.114348 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0a41473-f3de-440f-89be-9fddf77f6148-combined-ca-bundle\") pod \"cinder-db-sync-wh6jz\" (UID: \"f0a41473-f3de-440f-89be-9fddf77f6148\") " pod="openstack/cinder-db-sync-wh6jz" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.114448 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f0a41473-f3de-440f-89be-9fddf77f6148-db-sync-config-data\") pod \"cinder-db-sync-wh6jz\" (UID: \"f0a41473-f3de-440f-89be-9fddf77f6148\") " pod="openstack/cinder-db-sync-wh6jz" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.114568 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f0a41473-f3de-440f-89be-9fddf77f6148-etc-machine-id\") pod \"cinder-db-sync-wh6jz\" (UID: \"f0a41473-f3de-440f-89be-9fddf77f6148\") " pod="openstack/cinder-db-sync-wh6jz" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.114673 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0a41473-f3de-440f-89be-9fddf77f6148-config-data\") pod \"cinder-db-sync-wh6jz\" (UID: \"f0a41473-f3de-440f-89be-9fddf77f6148\") " pod="openstack/cinder-db-sync-wh6jz" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.114774 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81a8e33f-1179-4192-9d21-b0f520c41656-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"81a8e33f-1179-4192-9d21-b0f520c41656\") " pod="openstack/ceilometer-0" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.114941 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81a8e33f-1179-4192-9d21-b0f520c41656-config-data\") pod \"ceilometer-0\" (UID: \"81a8e33f-1179-4192-9d21-b0f520c41656\") " pod="openstack/ceilometer-0" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.121633 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0a41473-f3de-440f-89be-9fddf77f6148-scripts\") pod \"cinder-db-sync-wh6jz\" (UID: \"f0a41473-f3de-440f-89be-9fddf77f6148\") " pod="openstack/cinder-db-sync-wh6jz" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.123033 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f0a41473-f3de-440f-89be-9fddf77f6148-etc-machine-id\") pod \"cinder-db-sync-wh6jz\" (UID: \"f0a41473-f3de-440f-89be-9fddf77f6148\") " pod="openstack/cinder-db-sync-wh6jz" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.123914 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/81a8e33f-1179-4192-9d21-b0f520c41656-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"81a8e33f-1179-4192-9d21-b0f520c41656\") " pod="openstack/ceilometer-0" Dec 05 20:32:22 
crc kubenswrapper[4904]: I1205 20:32:22.191304 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f0a41473-f3de-440f-89be-9fddf77f6148-db-sync-config-data\") pod \"cinder-db-sync-wh6jz\" (UID: \"f0a41473-f3de-440f-89be-9fddf77f6148\") " pod="openstack/cinder-db-sync-wh6jz" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.191463 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.191820 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-547f6dfd9c-wt8l4" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.192632 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.219397 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0a41473-f3de-440f-89be-9fddf77f6148-config-data\") pod \"cinder-db-sync-wh6jz\" (UID: \"f0a41473-f3de-440f-89be-9fddf77f6148\") " pod="openstack/cinder-db-sync-wh6jz" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.219891 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0a41473-f3de-440f-89be-9fddf77f6148-combined-ca-bundle\") pod \"cinder-db-sync-wh6jz\" (UID: \"f0a41473-f3de-440f-89be-9fddf77f6148\") " pod="openstack/cinder-db-sync-wh6jz" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.223397 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-wh6jz"] Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.234391 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81a8e33f-1179-4192-9d21-b0f520c41656-run-httpd\") pod \"ceilometer-0\" (UID: \"81a8e33f-1179-4192-9d21-b0f520c41656\") " pod="openstack/ceilometer-0" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.234545 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa95b130-3627-4fa6-9bf8-1760d3b24843-logs\") pod \"watcher-api-0\" (UID: \"fa95b130-3627-4fa6-9bf8-1760d3b24843\") " pod="openstack/watcher-api-0" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.234586 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa95b130-3627-4fa6-9bf8-1760d3b24843-config-data\") pod \"watcher-api-0\" (UID: \"fa95b130-3627-4fa6-9bf8-1760d3b24843\") " pod="openstack/watcher-api-0" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.234658 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81a8e33f-1179-4192-9d21-b0f520c41656-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"81a8e33f-1179-4192-9d21-b0f520c41656\") " pod="openstack/ceilometer-0" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.234789 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81a8e33f-1179-4192-9d21-b0f520c41656-config-data\") pod \"ceilometer-0\" (UID: \"81a8e33f-1179-4192-9d21-b0f520c41656\") " pod="openstack/ceilometer-0" Dec 
05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.236847 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-pjlzn" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.253775 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/fa95b130-3627-4fa6-9bf8-1760d3b24843-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"fa95b130-3627-4fa6-9bf8-1760d3b24843\") " pod="openstack/watcher-api-0" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.253828 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/81a8e33f-1179-4192-9d21-b0f520c41656-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"81a8e33f-1179-4192-9d21-b0f520c41656\") " pod="openstack/ceilometer-0" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.253903 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81a8e33f-1179-4192-9d21-b0f520c41656-scripts\") pod \"ceilometer-0\" (UID: \"81a8e33f-1179-4192-9d21-b0f520c41656\") " pod="openstack/ceilometer-0" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.253937 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crhbm\" (UniqueName: \"kubernetes.io/projected/81a8e33f-1179-4192-9d21-b0f520c41656-kube-api-access-crhbm\") pod \"ceilometer-0\" (UID: \"81a8e33f-1179-4192-9d21-b0f520c41656\") " pod="openstack/ceilometer-0" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.254033 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa95b130-3627-4fa6-9bf8-1760d3b24843-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"fa95b130-3627-4fa6-9bf8-1760d3b24843\") " pod="openstack/watcher-api-0" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.254240 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81a8e33f-1179-4192-9d21-b0f520c41656-log-httpd\") pod \"ceilometer-0\" (UID: \"81a8e33f-1179-4192-9d21-b0f520c41656\") " pod="openstack/ceilometer-0" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.254292 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpj4d\" (UniqueName: \"kubernetes.io/projected/fa95b130-3627-4fa6-9bf8-1760d3b24843-kube-api-access-bpj4d\") pod \"watcher-api-0\" (UID: \"fa95b130-3627-4fa6-9bf8-1760d3b24843\") " pod="openstack/watcher-api-0" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.254599 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81a8e33f-1179-4192-9d21-b0f520c41656-run-httpd\") pod \"ceilometer-0\" (UID: \"81a8e33f-1179-4192-9d21-b0f520c41656\") " pod="openstack/ceilometer-0" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.254731 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-k6l4d" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.258406 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rztw\" (UniqueName: \"kubernetes.io/projected/f0a41473-f3de-440f-89be-9fddf77f6148-kube-api-access-9rztw\") pod \"cinder-db-sync-wh6jz\" (UID: \"f0a41473-f3de-440f-89be-9fddf77f6148\") " pod="openstack/cinder-db-sync-wh6jz" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.258806 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81a8e33f-1179-4192-9d21-b0f520c41656-log-httpd\") pod \"ceilometer-0\" (UID: \"81a8e33f-1179-4192-9d21-b0f520c41656\") " pod="openstack/ceilometer-0" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.261637 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81a8e33f-1179-4192-9d21-b0f520c41656-scripts\") pod \"ceilometer-0\" (UID: \"81a8e33f-1179-4192-9d21-b0f520c41656\") " pod="openstack/ceilometer-0" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.264146 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81a8e33f-1179-4192-9d21-b0f520c41656-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"81a8e33f-1179-4192-9d21-b0f520c41656\") " pod="openstack/ceilometer-0" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.264433 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81a8e33f-1179-4192-9d21-b0f520c41656-config-data\") pod \"ceilometer-0\" (UID: \"81a8e33f-1179-4192-9d21-b0f520c41656\") " pod="openstack/ceilometer-0" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.265451 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/81a8e33f-1179-4192-9d21-b0f520c41656-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"81a8e33f-1179-4192-9d21-b0f520c41656\") " pod="openstack/ceilometer-0" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.273341 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-wh6jz" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.280217 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crhbm\" (UniqueName: \"kubernetes.io/projected/81a8e33f-1179-4192-9d21-b0f520c41656-kube-api-access-crhbm\") pod \"ceilometer-0\" (UID: \"81a8e33f-1179-4192-9d21-b0f520c41656\") " pod="openstack/ceilometer-0" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.315423 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.339579 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.345455 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-79945c76df-zwn2h"] Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.347105 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-79945c76df-zwn2h" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.356378 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/fa95b130-3627-4fa6-9bf8-1760d3b24843-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"fa95b130-3627-4fa6-9bf8-1760d3b24843\") " pod="openstack/watcher-api-0" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.356463 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa95b130-3627-4fa6-9bf8-1760d3b24843-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"fa95b130-3627-4fa6-9bf8-1760d3b24843\") " pod="openstack/watcher-api-0" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.356488 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpj4d\" (UniqueName: \"kubernetes.io/projected/fa95b130-3627-4fa6-9bf8-1760d3b24843-kube-api-access-bpj4d\") pod \"watcher-api-0\" (UID: \"fa95b130-3627-4fa6-9bf8-1760d3b24843\") " pod="openstack/watcher-api-0" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.356524 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa95b130-3627-4fa6-9bf8-1760d3b24843-logs\") pod \"watcher-api-0\" (UID: \"fa95b130-3627-4fa6-9bf8-1760d3b24843\") " pod="openstack/watcher-api-0" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.356543 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa95b130-3627-4fa6-9bf8-1760d3b24843-config-data\") pod \"watcher-api-0\" (UID: \"fa95b130-3627-4fa6-9bf8-1760d3b24843\") " pod="openstack/watcher-api-0" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.358692 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa95b130-3627-4fa6-9bf8-1760d3b24843-logs\") pod \"watcher-api-0\" (UID: \"fa95b130-3627-4fa6-9bf8-1760d3b24843\") " pod="openstack/watcher-api-0" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.362046 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa95b130-3627-4fa6-9bf8-1760d3b24843-config-data\") pod \"watcher-api-0\" (UID: \"fa95b130-3627-4fa6-9bf8-1760d3b24843\") " pod="openstack/watcher-api-0" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.363846 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/fa95b130-3627-4fa6-9bf8-1760d3b24843-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"fa95b130-3627-4fa6-9bf8-1760d3b24843\") " pod="openstack/watcher-api-0" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.370001 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa95b130-3627-4fa6-9bf8-1760d3b24843-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"fa95b130-3627-4fa6-9bf8-1760d3b24843\") " pod="openstack/watcher-api-0" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.371250 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-4zhc6"] Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.372429 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-4zhc6" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.377000 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.377134 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-sjsm8" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.377619 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpj4d\" (UniqueName: \"kubernetes.io/projected/fa95b130-3627-4fa6-9bf8-1760d3b24843-kube-api-access-bpj4d\") pod \"watcher-api-0\" (UID: \"fa95b130-3627-4fa6-9bf8-1760d3b24843\") " pod="openstack/watcher-api-0" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.382026 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.402196 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.447838 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-4zhc6"] Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.458612 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e1c6c734-6fb7-4cb5-a292-aba1c0d2adad-config-data\") pod \"horizon-79945c76df-zwn2h\" (UID: \"e1c6c734-6fb7-4cb5-a292-aba1c0d2adad\") " pod="openstack/horizon-79945c76df-zwn2h" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.458663 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e1c6c734-6fb7-4cb5-a292-aba1c0d2adad-horizon-secret-key\") pod \"horizon-79945c76df-zwn2h\" (UID: \"e1c6c734-6fb7-4cb5-a292-aba1c0d2adad\") " pod="openstack/horizon-79945c76df-zwn2h" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.458710 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e1c6c734-6fb7-4cb5-a292-aba1c0d2adad-scripts\") pod \"horizon-79945c76df-zwn2h\" (UID: \"e1c6c734-6fb7-4cb5-a292-aba1c0d2adad\") " pod="openstack/horizon-79945c76df-zwn2h" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.458748 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kb25\" (UniqueName: \"kubernetes.io/projected/2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8-kube-api-access-9kb25\") pod \"placement-db-sync-4zhc6\" (UID: \"2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8\") " pod="openstack/placement-db-sync-4zhc6" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.458792 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8-logs\") pod \"placement-db-sync-4zhc6\" (UID: \"2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8\") " pod="openstack/placement-db-sync-4zhc6" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.458816 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8-combined-ca-bundle\") pod \"placement-db-sync-4zhc6\" 
(UID: \"2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8\") " pod="openstack/placement-db-sync-4zhc6" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.458851 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1c6c734-6fb7-4cb5-a292-aba1c0d2adad-logs\") pod \"horizon-79945c76df-zwn2h\" (UID: \"e1c6c734-6fb7-4cb5-a292-aba1c0d2adad\") " pod="openstack/horizon-79945c76df-zwn2h" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.458871 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8-config-data\") pod \"placement-db-sync-4zhc6\" (UID: \"2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8\") " pod="openstack/placement-db-sync-4zhc6" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.458916 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxb5n\" (UniqueName: \"kubernetes.io/projected/e1c6c734-6fb7-4cb5-a292-aba1c0d2adad-kube-api-access-rxb5n\") pod \"horizon-79945c76df-zwn2h\" (UID: \"e1c6c734-6fb7-4cb5-a292-aba1c0d2adad\") " pod="openstack/horizon-79945c76df-zwn2h" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.458991 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8-scripts\") pod \"placement-db-sync-4zhc6\" (UID: \"2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8\") " pod="openstack/placement-db-sync-4zhc6" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.485901 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cc95f9cc5-pvn9n"] Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.500898 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-79945c76df-zwn2h"] Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.521022 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8b8b46f59-njvnc"] Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.522701 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b8b46f59-njvnc" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.554503 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.559318 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b8b46f59-njvnc"] Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.561793 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kb25\" (UniqueName: \"kubernetes.io/projected/2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8-kube-api-access-9kb25\") pod \"placement-db-sync-4zhc6\" (UID: \"2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8\") " pod="openstack/placement-db-sync-4zhc6" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.561859 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/659969bf-c700-4cb7-b8c0-1d30873b3d0a-dns-swift-storage-0\") pod \"dnsmasq-dns-8b8b46f59-njvnc\" (UID: \"659969bf-c700-4cb7-b8c0-1d30873b3d0a\") " pod="openstack/dnsmasq-dns-8b8b46f59-njvnc" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.561894 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8-logs\") pod \"placement-db-sync-4zhc6\" (UID: \"2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8\") " pod="openstack/placement-db-sync-4zhc6" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.561923 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8-combined-ca-bundle\") pod \"placement-db-sync-4zhc6\" (UID: \"2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8\") " pod="openstack/placement-db-sync-4zhc6" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.561953 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1c6c734-6fb7-4cb5-a292-aba1c0d2adad-logs\") pod \"horizon-79945c76df-zwn2h\" (UID: \"e1c6c734-6fb7-4cb5-a292-aba1c0d2adad\") " pod="openstack/horizon-79945c76df-zwn2h" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.561972 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8-config-data\") pod \"placement-db-sync-4zhc6\" (UID: \"2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8\") " pod="openstack/placement-db-sync-4zhc6" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.561988 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/659969bf-c700-4cb7-b8c0-1d30873b3d0a-ovsdbserver-sb\") pod \"dnsmasq-dns-8b8b46f59-njvnc\" (UID: \"659969bf-c700-4cb7-b8c0-1d30873b3d0a\") " pod="openstack/dnsmasq-dns-8b8b46f59-njvnc" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.562025 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxb5n\" (UniqueName: \"kubernetes.io/projected/e1c6c734-6fb7-4cb5-a292-aba1c0d2adad-kube-api-access-rxb5n\") pod \"horizon-79945c76df-zwn2h\" (UID: \"e1c6c734-6fb7-4cb5-a292-aba1c0d2adad\") " pod="openstack/horizon-79945c76df-zwn2h" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.562043 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/659969bf-c700-4cb7-b8c0-1d30873b3d0a-ovsdbserver-nb\") pod \"dnsmasq-dns-8b8b46f59-njvnc\" (UID: \"659969bf-c700-4cb7-b8c0-1d30873b3d0a\") " pod="openstack/dnsmasq-dns-8b8b46f59-njvnc" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.562080 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzkf8\" (UniqueName: \"kubernetes.io/projected/659969bf-c700-4cb7-b8c0-1d30873b3d0a-kube-api-access-rzkf8\") pod \"dnsmasq-dns-8b8b46f59-njvnc\" (UID: \"659969bf-c700-4cb7-b8c0-1d30873b3d0a\") " pod="openstack/dnsmasq-dns-8b8b46f59-njvnc" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.562142 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8-scripts\") pod \"placement-db-sync-4zhc6\" (UID: \"2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8\") " pod="openstack/placement-db-sync-4zhc6" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.562184 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e1c6c734-6fb7-4cb5-a292-aba1c0d2adad-config-data\") pod \"horizon-79945c76df-zwn2h\" (UID: \"e1c6c734-6fb7-4cb5-a292-aba1c0d2adad\") " pod="openstack/horizon-79945c76df-zwn2h" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.562212 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/659969bf-c700-4cb7-b8c0-1d30873b3d0a-dns-svc\") pod \"dnsmasq-dns-8b8b46f59-njvnc\" (UID: \"659969bf-c700-4cb7-b8c0-1d30873b3d0a\") " pod="openstack/dnsmasq-dns-8b8b46f59-njvnc" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.562244 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e1c6c734-6fb7-4cb5-a292-aba1c0d2adad-horizon-secret-key\") pod \"horizon-79945c76df-zwn2h\" (UID: \"e1c6c734-6fb7-4cb5-a292-aba1c0d2adad\") " pod="openstack/horizon-79945c76df-zwn2h" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.562303 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/659969bf-c700-4cb7-b8c0-1d30873b3d0a-config\") pod \"dnsmasq-dns-8b8b46f59-njvnc\" (UID: \"659969bf-c700-4cb7-b8c0-1d30873b3d0a\") " pod="openstack/dnsmasq-dns-8b8b46f59-njvnc" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.562328 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e1c6c734-6fb7-4cb5-a292-aba1c0d2adad-scripts\") pod \"horizon-79945c76df-zwn2h\" (UID: \"e1c6c734-6fb7-4cb5-a292-aba1c0d2adad\") " pod="openstack/horizon-79945c76df-zwn2h" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.563193 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8-logs\") pod \"placement-db-sync-4zhc6\" (UID: \"2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8\") " pod="openstack/placement-db-sync-4zhc6" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.563499 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e1c6c734-6fb7-4cb5-a292-aba1c0d2adad-scripts\") pod \"horizon-79945c76df-zwn2h\" (UID: 
\"e1c6c734-6fb7-4cb5-a292-aba1c0d2adad\") " pod="openstack/horizon-79945c76df-zwn2h" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.563741 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1c6c734-6fb7-4cb5-a292-aba1c0d2adad-logs\") pod \"horizon-79945c76df-zwn2h\" (UID: \"e1c6c734-6fb7-4cb5-a292-aba1c0d2adad\") " pod="openstack/horizon-79945c76df-zwn2h" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.564837 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e1c6c734-6fb7-4cb5-a292-aba1c0d2adad-config-data\") pod \"horizon-79945c76df-zwn2h\" (UID: \"e1c6c734-6fb7-4cb5-a292-aba1c0d2adad\") " pod="openstack/horizon-79945c76df-zwn2h" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.570162 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8-combined-ca-bundle\") pod \"placement-db-sync-4zhc6\" (UID: \"2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8\") " pod="openstack/placement-db-sync-4zhc6" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.574029 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8-scripts\") pod \"placement-db-sync-4zhc6\" (UID: \"2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8\") " pod="openstack/placement-db-sync-4zhc6" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.574676 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8-config-data\") pod \"placement-db-sync-4zhc6\" (UID: \"2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8\") " pod="openstack/placement-db-sync-4zhc6" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.586633 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kb25\" (UniqueName: \"kubernetes.io/projected/2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8-kube-api-access-9kb25\") pod \"placement-db-sync-4zhc6\" (UID: \"2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8\") " pod="openstack/placement-db-sync-4zhc6" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.588870 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxb5n\" (UniqueName: \"kubernetes.io/projected/e1c6c734-6fb7-4cb5-a292-aba1c0d2adad-kube-api-access-rxb5n\") pod \"horizon-79945c76df-zwn2h\" (UID: \"e1c6c734-6fb7-4cb5-a292-aba1c0d2adad\") " pod="openstack/horizon-79945c76df-zwn2h" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.601372 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e1c6c734-6fb7-4cb5-a292-aba1c0d2adad-horizon-secret-key\") pod \"horizon-79945c76df-zwn2h\" (UID: \"e1c6c734-6fb7-4cb5-a292-aba1c0d2adad\") " pod="openstack/horizon-79945c76df-zwn2h" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.667243 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/659969bf-c700-4cb7-b8c0-1d30873b3d0a-dns-svc\") pod \"dnsmasq-dns-8b8b46f59-njvnc\" (UID: \"659969bf-c700-4cb7-b8c0-1d30873b3d0a\") " pod="openstack/dnsmasq-dns-8b8b46f59-njvnc" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.667288 4904 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/659969bf-c700-4cb7-b8c0-1d30873b3d0a-config\") pod \"dnsmasq-dns-8b8b46f59-njvnc\" (UID: \"659969bf-c700-4cb7-b8c0-1d30873b3d0a\") " pod="openstack/dnsmasq-dns-8b8b46f59-njvnc" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.667334 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/659969bf-c700-4cb7-b8c0-1d30873b3d0a-dns-swift-storage-0\") pod \"dnsmasq-dns-8b8b46f59-njvnc\" (UID: \"659969bf-c700-4cb7-b8c0-1d30873b3d0a\") " pod="openstack/dnsmasq-dns-8b8b46f59-njvnc" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.667374 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/659969bf-c700-4cb7-b8c0-1d30873b3d0a-ovsdbserver-sb\") pod \"dnsmasq-dns-8b8b46f59-njvnc\" (UID: \"659969bf-c700-4cb7-b8c0-1d30873b3d0a\") " pod="openstack/dnsmasq-dns-8b8b46f59-njvnc" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.667412 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/659969bf-c700-4cb7-b8c0-1d30873b3d0a-ovsdbserver-nb\") pod \"dnsmasq-dns-8b8b46f59-njvnc\" (UID: \"659969bf-c700-4cb7-b8c0-1d30873b3d0a\") " pod="openstack/dnsmasq-dns-8b8b46f59-njvnc" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.667429 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzkf8\" (UniqueName: \"kubernetes.io/projected/659969bf-c700-4cb7-b8c0-1d30873b3d0a-kube-api-access-rzkf8\") pod \"dnsmasq-dns-8b8b46f59-njvnc\" (UID: \"659969bf-c700-4cb7-b8c0-1d30873b3d0a\") " pod="openstack/dnsmasq-dns-8b8b46f59-njvnc" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.668395 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/659969bf-c700-4cb7-b8c0-1d30873b3d0a-ovsdbserver-nb\") pod \"dnsmasq-dns-8b8b46f59-njvnc\" (UID: \"659969bf-c700-4cb7-b8c0-1d30873b3d0a\") " pod="openstack/dnsmasq-dns-8b8b46f59-njvnc" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.668593 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/659969bf-c700-4cb7-b8c0-1d30873b3d0a-ovsdbserver-sb\") pod \"dnsmasq-dns-8b8b46f59-njvnc\" (UID: \"659969bf-c700-4cb7-b8c0-1d30873b3d0a\") " pod="openstack/dnsmasq-dns-8b8b46f59-njvnc" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.668688 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/659969bf-c700-4cb7-b8c0-1d30873b3d0a-config\") pod \"dnsmasq-dns-8b8b46f59-njvnc\" (UID: \"659969bf-c700-4cb7-b8c0-1d30873b3d0a\") " pod="openstack/dnsmasq-dns-8b8b46f59-njvnc" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.669004 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/659969bf-c700-4cb7-b8c0-1d30873b3d0a-dns-swift-storage-0\") pod \"dnsmasq-dns-8b8b46f59-njvnc\" (UID: \"659969bf-c700-4cb7-b8c0-1d30873b3d0a\") " pod="openstack/dnsmasq-dns-8b8b46f59-njvnc" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.683967 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/659969bf-c700-4cb7-b8c0-1d30873b3d0a-dns-svc\") 
pod \"dnsmasq-dns-8b8b46f59-njvnc\" (UID: \"659969bf-c700-4cb7-b8c0-1d30873b3d0a\") " pod="openstack/dnsmasq-dns-8b8b46f59-njvnc" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.701270 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzkf8\" (UniqueName: \"kubernetes.io/projected/659969bf-c700-4cb7-b8c0-1d30873b3d0a-kube-api-access-rzkf8\") pod \"dnsmasq-dns-8b8b46f59-njvnc\" (UID: \"659969bf-c700-4cb7-b8c0-1d30873b3d0a\") " pod="openstack/dnsmasq-dns-8b8b46f59-njvnc" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.701648 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cc95f9cc5-pvn9n"] Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.715359 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-79945c76df-zwn2h" Dec 05 20:32:22 crc kubenswrapper[4904]: W1205 20:32:22.728920 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod927c6571_a9d1_45af_b5de_44dea68ee507.slice/crio-8a04022751f7c16ef1846139d5c02facc6e821f5d4aa02eb9590e5fdf3585f15 WatchSource:0}: Error finding container 8a04022751f7c16ef1846139d5c02facc6e821f5d4aa02eb9590e5fdf3585f15: Status 404 returned error can't find the container with id 8a04022751f7c16ef1846139d5c02facc6e821f5d4aa02eb9590e5fdf3585f15 Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.737052 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-4zhc6" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.805439 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b8b46f59-njvnc" Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.867222 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.906534 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-gjvc9"] Dec 05 20:32:22 crc kubenswrapper[4904]: I1205 20:32:22.938650 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Dec 05 20:32:23 crc kubenswrapper[4904]: I1205 20:32:23.147266 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cc95f9cc5-pvn9n" event={"ID":"927c6571-a9d1-45af-b5de-44dea68ee507","Type":"ContainerStarted","Data":"8a04022751f7c16ef1846139d5c02facc6e821f5d4aa02eb9590e5fdf3585f15"} Dec 05 20:32:23 crc kubenswrapper[4904]: I1205 20:32:23.157238 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gjvc9" event={"ID":"f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd","Type":"ContainerStarted","Data":"de1968e25f28bc81ffeb7741c6c5354ab79b4f03eca8b6cd7041c613d1e95913"} Dec 05 20:32:23 crc kubenswrapper[4904]: I1205 20:32:23.168831 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"8acb38c5-f0b5-46e9-8cb7-085d0b12b16d","Type":"ContainerStarted","Data":"9f98c081d54208eccea9bcd76c5e464e485f61a0fe1bf4d3d431028bd6ef1e5f"} Dec 05 20:32:23 crc kubenswrapper[4904]: I1205 20:32:23.171675 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"8698908a-a9e4-46c2-9855-2a1db92b1d75","Type":"ContainerStarted","Data":"5407cf6346261137ed512d6574446e10d30922554bdeb542fd03dfc9d3a200ce"} Dec 05 20:32:23 crc kubenswrapper[4904]: I1205 20:32:23.209242 4904 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-547f6dfd9c-wt8l4"] Dec 05 20:32:23 crc kubenswrapper[4904]: W1205 20:32:23.275304 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod561a2fa1_4349_438a_9bfd_8c8c6e3e3a73.slice/crio-9486b448461577e0070f33bcb3c9e8c9507f8a8aa37aefdf2671dee5427ae881 WatchSource:0}: Error finding container 9486b448461577e0070f33bcb3c9e8c9507f8a8aa37aefdf2671dee5427ae881: Status 404 returned error can't find the container with id 9486b448461577e0070f33bcb3c9e8c9507f8a8aa37aefdf2671dee5427ae881 Dec 05 20:32:23 crc kubenswrapper[4904]: I1205 20:32:23.723470 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-k6l4d"] Dec 05 20:32:23 crc kubenswrapper[4904]: I1205 20:32:23.723759 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:32:23 crc kubenswrapper[4904]: I1205 20:32:23.725837 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-pjlzn"] Dec 05 20:32:23 crc kubenswrapper[4904]: W1205 20:32:23.752940 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb72d1aa8_3933_4153_89ac_a4ffe0667268.slice/crio-0323224a151a438a0b3d8cb2253948ab6572fefe2052619b9271fedc646898a4 WatchSource:0}: Error finding container 0323224a151a438a0b3d8cb2253948ab6572fefe2052619b9271fedc646898a4: Status 404 returned error can't find the container with id 0323224a151a438a0b3d8cb2253948ab6572fefe2052619b9271fedc646898a4 Dec 05 20:32:23 crc kubenswrapper[4904]: I1205 20:32:23.776280 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-wh6jz"] Dec 05 20:32:23 crc kubenswrapper[4904]: I1205 20:32:23.807180 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-4zhc6"] Dec 05 20:32:23 crc kubenswrapper[4904]: I1205 20:32:23.919904 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Dec 05 20:32:23 crc kubenswrapper[4904]: I1205 20:32:23.937154 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-79945c76df-zwn2h"] Dec 05 20:32:23 crc kubenswrapper[4904]: I1205 20:32:23.954192 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b8b46f59-njvnc"] Dec 05 20:32:23 crc kubenswrapper[4904]: I1205 20:32:23.959929 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Dec 05 20:32:24 crc kubenswrapper[4904]: I1205 20:32:24.014619 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-547f6dfd9c-wt8l4"] Dec 05 20:32:24 crc kubenswrapper[4904]: I1205 20:32:24.070594 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-cf6d474f7-dfz4j"] Dec 05 20:32:24 crc kubenswrapper[4904]: I1205 20:32:24.090026 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-cf6d474f7-dfz4j" Dec 05 20:32:24 crc kubenswrapper[4904]: I1205 20:32:24.108901 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-cf6d474f7-dfz4j"] Dec 05 20:32:24 crc kubenswrapper[4904]: I1205 20:32:24.141187 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:32:24 crc kubenswrapper[4904]: I1205 20:32:24.183631 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-wh6jz" event={"ID":"f0a41473-f3de-440f-89be-9fddf77f6148","Type":"ContainerStarted","Data":"66192db94a93fb0ef363e7e41e36cb6dd48c7d9fd5ee92e62353ca4b8acb5f61"} Dec 05 20:32:24 crc kubenswrapper[4904]: I1205 20:32:24.188267 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79945c76df-zwn2h" event={"ID":"e1c6c734-6fb7-4cb5-a292-aba1c0d2adad","Type":"ContainerStarted","Data":"dbcd5c01e575ceecc3c3a14a6175d0c52aef33433084da6ac78ac6f85e33274c"} Dec 05 20:32:24 crc kubenswrapper[4904]: I1205 20:32:24.189131 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-pjlzn" event={"ID":"76209519-a745-4eab-9d5d-c330ffb29191","Type":"ContainerStarted","Data":"5ef354e23bb8e738c9786ec28118bb8e970cd525422c2c94137866c097dfadfa"} Dec 05 20:32:24 crc kubenswrapper[4904]: I1205 20:32:24.189827 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-k6l4d" event={"ID":"b72d1aa8-3933-4153-89ac-a4ffe0667268","Type":"ContainerStarted","Data":"0323224a151a438a0b3d8cb2253948ab6572fefe2052619b9271fedc646898a4"} Dec 05 20:32:24 crc kubenswrapper[4904]: I1205 20:32:24.200284 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b8b46f59-njvnc" event={"ID":"659969bf-c700-4cb7-b8c0-1d30873b3d0a","Type":"ContainerStarted","Data":"8e73b9edb0438fc9074d15932e1ee6c8c33162cc689ea50a15cd18c5de6b6606"} Dec 05 20:32:24 crc kubenswrapper[4904]: I1205 20:32:24.204573 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81a8e33f-1179-4192-9d21-b0f520c41656","Type":"ContainerStarted","Data":"f3c09fff7467111eb288365f5fa840b91d63c88891f5a9b9aebe8f70f37bf9a9"} Dec 05 20:32:24 crc kubenswrapper[4904]: I1205 20:32:24.230954 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"fa95b130-3627-4fa6-9bf8-1760d3b24843","Type":"ContainerStarted","Data":"852cad448e2f432f65fdb4baeb48cc80136eeea5ba57394a7e20ab4d6a9580fc"} Dec 05 20:32:24 crc kubenswrapper[4904]: I1205 20:32:24.234404 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gjvc9" event={"ID":"f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd","Type":"ContainerStarted","Data":"f2627d2e6a793ad96a31df3f93d55432e268348872a9265f0c6eb7c6070f7bfa"} Dec 05 20:32:24 crc kubenswrapper[4904]: I1205 20:32:24.238345 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70255ecd-6213-48cd-bd22-e6e14bbc497d-config-data\") pod \"horizon-cf6d474f7-dfz4j\" (UID: \"70255ecd-6213-48cd-bd22-e6e14bbc497d\") " pod="openstack/horizon-cf6d474f7-dfz4j" Dec 05 20:32:24 crc kubenswrapper[4904]: I1205 20:32:24.238452 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/70255ecd-6213-48cd-bd22-e6e14bbc497d-horizon-secret-key\") pod \"horizon-cf6d474f7-dfz4j\" (UID: 
\"70255ecd-6213-48cd-bd22-e6e14bbc497d\") " pod="openstack/horizon-cf6d474f7-dfz4j" Dec 05 20:32:24 crc kubenswrapper[4904]: I1205 20:32:24.238565 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70255ecd-6213-48cd-bd22-e6e14bbc497d-scripts\") pod \"horizon-cf6d474f7-dfz4j\" (UID: \"70255ecd-6213-48cd-bd22-e6e14bbc497d\") " pod="openstack/horizon-cf6d474f7-dfz4j" Dec 05 20:32:24 crc kubenswrapper[4904]: I1205 20:32:24.238687 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gwww\" (UniqueName: \"kubernetes.io/projected/70255ecd-6213-48cd-bd22-e6e14bbc497d-kube-api-access-4gwww\") pod \"horizon-cf6d474f7-dfz4j\" (UID: \"70255ecd-6213-48cd-bd22-e6e14bbc497d\") " pod="openstack/horizon-cf6d474f7-dfz4j" Dec 05 20:32:24 crc kubenswrapper[4904]: I1205 20:32:24.238746 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70255ecd-6213-48cd-bd22-e6e14bbc497d-logs\") pod \"horizon-cf6d474f7-dfz4j\" (UID: \"70255ecd-6213-48cd-bd22-e6e14bbc497d\") " pod="openstack/horizon-cf6d474f7-dfz4j" Dec 05 20:32:24 crc kubenswrapper[4904]: I1205 20:32:24.247436 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-547f6dfd9c-wt8l4" event={"ID":"561a2fa1-4349-438a-9bfd-8c8c6e3e3a73","Type":"ContainerStarted","Data":"9486b448461577e0070f33bcb3c9e8c9507f8a8aa37aefdf2671dee5427ae881"} Dec 05 20:32:24 crc kubenswrapper[4904]: I1205 20:32:24.248555 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-4zhc6" event={"ID":"2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8","Type":"ContainerStarted","Data":"ac7058d6f6ed2cd1df84046a58a1e467405a1effc8e5b44b13a995d1f92afb6f"} Dec 05 20:32:24 crc kubenswrapper[4904]: I1205 20:32:24.250093 4904 generic.go:334] "Generic (PLEG): container finished" podID="927c6571-a9d1-45af-b5de-44dea68ee507" containerID="274bef6b0587a6fa0b8e8de7ee6c6392e3de22a5efed0ffd78bdc201ad0affa4" exitCode=0 Dec 05 20:32:24 crc kubenswrapper[4904]: I1205 20:32:24.250133 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cc95f9cc5-pvn9n" event={"ID":"927c6571-a9d1-45af-b5de-44dea68ee507","Type":"ContainerDied","Data":"274bef6b0587a6fa0b8e8de7ee6c6392e3de22a5efed0ffd78bdc201ad0affa4"} Dec 05 20:32:24 crc kubenswrapper[4904]: I1205 20:32:24.265199 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-gjvc9" podStartSLOduration=3.26517886 podStartE2EDuration="3.26517886s" podCreationTimestamp="2025-12-05 20:32:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:32:24.257298552 +0000 UTC m=+1243.068514661" watchObservedRunningTime="2025-12-05 20:32:24.26517886 +0000 UTC m=+1243.076394969" Dec 05 20:32:24 crc kubenswrapper[4904]: I1205 20:32:24.340788 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70255ecd-6213-48cd-bd22-e6e14bbc497d-scripts\") pod \"horizon-cf6d474f7-dfz4j\" (UID: \"70255ecd-6213-48cd-bd22-e6e14bbc497d\") " pod="openstack/horizon-cf6d474f7-dfz4j" Dec 05 20:32:24 crc kubenswrapper[4904]: I1205 20:32:24.340919 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gwww\" 
(UniqueName: \"kubernetes.io/projected/70255ecd-6213-48cd-bd22-e6e14bbc497d-kube-api-access-4gwww\") pod \"horizon-cf6d474f7-dfz4j\" (UID: \"70255ecd-6213-48cd-bd22-e6e14bbc497d\") " pod="openstack/horizon-cf6d474f7-dfz4j" Dec 05 20:32:24 crc kubenswrapper[4904]: I1205 20:32:24.340960 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70255ecd-6213-48cd-bd22-e6e14bbc497d-logs\") pod \"horizon-cf6d474f7-dfz4j\" (UID: \"70255ecd-6213-48cd-bd22-e6e14bbc497d\") " pod="openstack/horizon-cf6d474f7-dfz4j" Dec 05 20:32:24 crc kubenswrapper[4904]: I1205 20:32:24.341049 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70255ecd-6213-48cd-bd22-e6e14bbc497d-config-data\") pod \"horizon-cf6d474f7-dfz4j\" (UID: \"70255ecd-6213-48cd-bd22-e6e14bbc497d\") " pod="openstack/horizon-cf6d474f7-dfz4j" Dec 05 20:32:24 crc kubenswrapper[4904]: I1205 20:32:24.341114 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/70255ecd-6213-48cd-bd22-e6e14bbc497d-horizon-secret-key\") pod \"horizon-cf6d474f7-dfz4j\" (UID: \"70255ecd-6213-48cd-bd22-e6e14bbc497d\") " pod="openstack/horizon-cf6d474f7-dfz4j" Dec 05 20:32:24 crc kubenswrapper[4904]: I1205 20:32:24.341898 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70255ecd-6213-48cd-bd22-e6e14bbc497d-scripts\") pod \"horizon-cf6d474f7-dfz4j\" (UID: \"70255ecd-6213-48cd-bd22-e6e14bbc497d\") " pod="openstack/horizon-cf6d474f7-dfz4j" Dec 05 20:32:24 crc kubenswrapper[4904]: I1205 20:32:24.343108 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70255ecd-6213-48cd-bd22-e6e14bbc497d-config-data\") pod \"horizon-cf6d474f7-dfz4j\" (UID: \"70255ecd-6213-48cd-bd22-e6e14bbc497d\") " pod="openstack/horizon-cf6d474f7-dfz4j" Dec 05 20:32:24 crc kubenswrapper[4904]: I1205 20:32:24.344519 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70255ecd-6213-48cd-bd22-e6e14bbc497d-logs\") pod \"horizon-cf6d474f7-dfz4j\" (UID: \"70255ecd-6213-48cd-bd22-e6e14bbc497d\") " pod="openstack/horizon-cf6d474f7-dfz4j" Dec 05 20:32:24 crc kubenswrapper[4904]: I1205 20:32:24.347746 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/70255ecd-6213-48cd-bd22-e6e14bbc497d-horizon-secret-key\") pod \"horizon-cf6d474f7-dfz4j\" (UID: \"70255ecd-6213-48cd-bd22-e6e14bbc497d\") " pod="openstack/horizon-cf6d474f7-dfz4j" Dec 05 20:32:24 crc kubenswrapper[4904]: I1205 20:32:24.396080 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gwww\" (UniqueName: \"kubernetes.io/projected/70255ecd-6213-48cd-bd22-e6e14bbc497d-kube-api-access-4gwww\") pod \"horizon-cf6d474f7-dfz4j\" (UID: \"70255ecd-6213-48cd-bd22-e6e14bbc497d\") " pod="openstack/horizon-cf6d474f7-dfz4j" Dec 05 20:32:24 crc kubenswrapper[4904]: I1205 20:32:24.468523 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-cf6d474f7-dfz4j" Dec 05 20:32:25 crc kubenswrapper[4904]: I1205 20:32:25.275123 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-pjlzn" event={"ID":"76209519-a745-4eab-9d5d-c330ffb29191","Type":"ContainerStarted","Data":"fc7821d1b2305b566cab8bfc36fd3d6f720db61bdbdb4f4b42ac6e8dc25c2159"} Dec 05 20:32:25 crc kubenswrapper[4904]: I1205 20:32:25.290626 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"fa95b130-3627-4fa6-9bf8-1760d3b24843","Type":"ContainerStarted","Data":"30e14b92445197a4e38e382b251c4ecbc6bcefaca264e118c6c56ba34e8733a5"} Dec 05 20:32:25 crc kubenswrapper[4904]: I1205 20:32:25.291538 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-pjlzn" podStartSLOduration=4.29151755 podStartE2EDuration="4.29151755s" podCreationTimestamp="2025-12-05 20:32:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:32:25.289514427 +0000 UTC m=+1244.100730546" watchObservedRunningTime="2025-12-05 20:32:25.29151755 +0000 UTC m=+1244.102733659" Dec 05 20:32:25 crc kubenswrapper[4904]: I1205 20:32:25.299459 4904 generic.go:334] "Generic (PLEG): container finished" podID="659969bf-c700-4cb7-b8c0-1d30873b3d0a" containerID="5d160d2fc4e3ad62f171aeb938e7c2b454212b691353f4e89466b0ed5237d53a" exitCode=0 Dec 05 20:32:25 crc kubenswrapper[4904]: I1205 20:32:25.299511 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b8b46f59-njvnc" event={"ID":"659969bf-c700-4cb7-b8c0-1d30873b3d0a","Type":"ContainerDied","Data":"5d160d2fc4e3ad62f171aeb938e7c2b454212b691353f4e89466b0ed5237d53a"} Dec 05 20:32:25 crc kubenswrapper[4904]: I1205 20:32:25.704176 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cc95f9cc5-pvn9n" Dec 05 20:32:25 crc kubenswrapper[4904]: I1205 20:32:25.767174 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/927c6571-a9d1-45af-b5de-44dea68ee507-ovsdbserver-sb\") pod \"927c6571-a9d1-45af-b5de-44dea68ee507\" (UID: \"927c6571-a9d1-45af-b5de-44dea68ee507\") " Dec 05 20:32:25 crc kubenswrapper[4904]: I1205 20:32:25.767299 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5q2wm\" (UniqueName: \"kubernetes.io/projected/927c6571-a9d1-45af-b5de-44dea68ee507-kube-api-access-5q2wm\") pod \"927c6571-a9d1-45af-b5de-44dea68ee507\" (UID: \"927c6571-a9d1-45af-b5de-44dea68ee507\") " Dec 05 20:32:25 crc kubenswrapper[4904]: I1205 20:32:25.767333 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/927c6571-a9d1-45af-b5de-44dea68ee507-config\") pod \"927c6571-a9d1-45af-b5de-44dea68ee507\" (UID: \"927c6571-a9d1-45af-b5de-44dea68ee507\") " Dec 05 20:32:25 crc kubenswrapper[4904]: I1205 20:32:25.767445 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/927c6571-a9d1-45af-b5de-44dea68ee507-dns-swift-storage-0\") pod \"927c6571-a9d1-45af-b5de-44dea68ee507\" (UID: \"927c6571-a9d1-45af-b5de-44dea68ee507\") " Dec 05 20:32:25 crc kubenswrapper[4904]: I1205 20:32:25.767476 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/927c6571-a9d1-45af-b5de-44dea68ee507-dns-svc\") pod \"927c6571-a9d1-45af-b5de-44dea68ee507\" (UID: \"927c6571-a9d1-45af-b5de-44dea68ee507\") " Dec 05 20:32:25 crc kubenswrapper[4904]: I1205 20:32:25.767509 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/927c6571-a9d1-45af-b5de-44dea68ee507-ovsdbserver-nb\") pod \"927c6571-a9d1-45af-b5de-44dea68ee507\" (UID: \"927c6571-a9d1-45af-b5de-44dea68ee507\") " Dec 05 20:32:25 crc kubenswrapper[4904]: I1205 20:32:25.792095 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/927c6571-a9d1-45af-b5de-44dea68ee507-kube-api-access-5q2wm" (OuterVolumeSpecName: "kube-api-access-5q2wm") pod "927c6571-a9d1-45af-b5de-44dea68ee507" (UID: "927c6571-a9d1-45af-b5de-44dea68ee507"). InnerVolumeSpecName "kube-api-access-5q2wm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:32:25 crc kubenswrapper[4904]: I1205 20:32:25.839754 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/927c6571-a9d1-45af-b5de-44dea68ee507-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "927c6571-a9d1-45af-b5de-44dea68ee507" (UID: "927c6571-a9d1-45af-b5de-44dea68ee507"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:32:25 crc kubenswrapper[4904]: I1205 20:32:25.877385 4904 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/927c6571-a9d1-45af-b5de-44dea68ee507-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:25 crc kubenswrapper[4904]: I1205 20:32:25.877439 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5q2wm\" (UniqueName: \"kubernetes.io/projected/927c6571-a9d1-45af-b5de-44dea68ee507-kube-api-access-5q2wm\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:25 crc kubenswrapper[4904]: I1205 20:32:25.882591 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/927c6571-a9d1-45af-b5de-44dea68ee507-config" (OuterVolumeSpecName: "config") pod "927c6571-a9d1-45af-b5de-44dea68ee507" (UID: "927c6571-a9d1-45af-b5de-44dea68ee507"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:32:25 crc kubenswrapper[4904]: I1205 20:32:25.889524 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/927c6571-a9d1-45af-b5de-44dea68ee507-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "927c6571-a9d1-45af-b5de-44dea68ee507" (UID: "927c6571-a9d1-45af-b5de-44dea68ee507"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:32:25 crc kubenswrapper[4904]: I1205 20:32:25.903945 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/927c6571-a9d1-45af-b5de-44dea68ee507-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "927c6571-a9d1-45af-b5de-44dea68ee507" (UID: "927c6571-a9d1-45af-b5de-44dea68ee507"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:32:25 crc kubenswrapper[4904]: I1205 20:32:25.911565 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/927c6571-a9d1-45af-b5de-44dea68ee507-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "927c6571-a9d1-45af-b5de-44dea68ee507" (UID: "927c6571-a9d1-45af-b5de-44dea68ee507"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:32:25 crc kubenswrapper[4904]: I1205 20:32:25.979367 4904 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/927c6571-a9d1-45af-b5de-44dea68ee507-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:25 crc kubenswrapper[4904]: I1205 20:32:25.979419 4904 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/927c6571-a9d1-45af-b5de-44dea68ee507-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:25 crc kubenswrapper[4904]: I1205 20:32:25.979432 4904 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/927c6571-a9d1-45af-b5de-44dea68ee507-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:25 crc kubenswrapper[4904]: I1205 20:32:25.979446 4904 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/927c6571-a9d1-45af-b5de-44dea68ee507-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:26 crc kubenswrapper[4904]: I1205 20:32:26.312859 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cc95f9cc5-pvn9n" event={"ID":"927c6571-a9d1-45af-b5de-44dea68ee507","Type":"ContainerDied","Data":"8a04022751f7c16ef1846139d5c02facc6e821f5d4aa02eb9590e5fdf3585f15"} Dec 05 20:32:26 crc kubenswrapper[4904]: I1205 20:32:26.312922 4904 scope.go:117] "RemoveContainer" containerID="274bef6b0587a6fa0b8e8de7ee6c6392e3de22a5efed0ffd78bdc201ad0affa4" Dec 05 20:32:26 crc kubenswrapper[4904]: I1205 20:32:26.312995 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cc95f9cc5-pvn9n" Dec 05 20:32:26 crc kubenswrapper[4904]: I1205 20:32:26.393292 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cc95f9cc5-pvn9n"] Dec 05 20:32:26 crc kubenswrapper[4904]: I1205 20:32:26.438184 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5cc95f9cc5-pvn9n"] Dec 05 20:32:27 crc kubenswrapper[4904]: I1205 20:32:27.698895 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="927c6571-a9d1-45af-b5de-44dea68ee507" path="/var/lib/kubelet/pods/927c6571-a9d1-45af-b5de-44dea68ee507/volumes" Dec 05 20:32:28 crc kubenswrapper[4904]: I1205 20:32:28.336029 4904 generic.go:334] "Generic (PLEG): container finished" podID="f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd" containerID="f2627d2e6a793ad96a31df3f93d55432e268348872a9265f0c6eb7c6070f7bfa" exitCode=0 Dec 05 20:32:28 crc kubenswrapper[4904]: I1205 20:32:28.336101 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gjvc9" event={"ID":"f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd","Type":"ContainerDied","Data":"f2627d2e6a793ad96a31df3f93d55432e268348872a9265f0c6eb7c6070f7bfa"} Dec 05 20:32:29 crc kubenswrapper[4904]: I1205 20:32:29.956072 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:32:29 crc kubenswrapper[4904]: I1205 20:32:29.956657 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.371279 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-79945c76df-zwn2h"] Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.392829 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6867ddbddb-4lg6w"] Dec 05 20:32:30 crc kubenswrapper[4904]: E1205 20:32:30.393252 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="927c6571-a9d1-45af-b5de-44dea68ee507" containerName="init" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.393270 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="927c6571-a9d1-45af-b5de-44dea68ee507" containerName="init" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.393648 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="927c6571-a9d1-45af-b5de-44dea68ee507" containerName="init" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.394657 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6867ddbddb-4lg6w" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.397787 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.425100 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6867ddbddb-4lg6w"] Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.483172 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-cf6d474f7-dfz4j"] Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.504210 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/499a09a1-c3aa-4fa2-95a8-0d2896a1d978-horizon-tls-certs\") pod \"horizon-6867ddbddb-4lg6w\" (UID: \"499a09a1-c3aa-4fa2-95a8-0d2896a1d978\") " pod="openstack/horizon-6867ddbddb-4lg6w" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.504270 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/499a09a1-c3aa-4fa2-95a8-0d2896a1d978-logs\") pod \"horizon-6867ddbddb-4lg6w\" (UID: \"499a09a1-c3aa-4fa2-95a8-0d2896a1d978\") " pod="openstack/horizon-6867ddbddb-4lg6w" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.504297 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqjr7\" (UniqueName: \"kubernetes.io/projected/499a09a1-c3aa-4fa2-95a8-0d2896a1d978-kube-api-access-zqjr7\") pod \"horizon-6867ddbddb-4lg6w\" (UID: \"499a09a1-c3aa-4fa2-95a8-0d2896a1d978\") " pod="openstack/horizon-6867ddbddb-4lg6w" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.504329 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/499a09a1-c3aa-4fa2-95a8-0d2896a1d978-horizon-secret-key\") pod \"horizon-6867ddbddb-4lg6w\" (UID: \"499a09a1-c3aa-4fa2-95a8-0d2896a1d978\") " pod="openstack/horizon-6867ddbddb-4lg6w" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.504388 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/499a09a1-c3aa-4fa2-95a8-0d2896a1d978-config-data\") pod \"horizon-6867ddbddb-4lg6w\" (UID: \"499a09a1-c3aa-4fa2-95a8-0d2896a1d978\") " pod="openstack/horizon-6867ddbddb-4lg6w" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.504408 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/499a09a1-c3aa-4fa2-95a8-0d2896a1d978-scripts\") pod \"horizon-6867ddbddb-4lg6w\" (UID: \"499a09a1-c3aa-4fa2-95a8-0d2896a1d978\") " pod="openstack/horizon-6867ddbddb-4lg6w" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.504429 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/499a09a1-c3aa-4fa2-95a8-0d2896a1d978-combined-ca-bundle\") pod \"horizon-6867ddbddb-4lg6w\" (UID: \"499a09a1-c3aa-4fa2-95a8-0d2896a1d978\") " pod="openstack/horizon-6867ddbddb-4lg6w" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.548835 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-d867d46cb-9mdx2"] Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.550344 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-d867d46cb-9mdx2" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.573951 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-d867d46cb-9mdx2"] Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.606117 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/499a09a1-c3aa-4fa2-95a8-0d2896a1d978-config-data\") pod \"horizon-6867ddbddb-4lg6w\" (UID: \"499a09a1-c3aa-4fa2-95a8-0d2896a1d978\") " pod="openstack/horizon-6867ddbddb-4lg6w" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.606163 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/499a09a1-c3aa-4fa2-95a8-0d2896a1d978-scripts\") pod \"horizon-6867ddbddb-4lg6w\" (UID: \"499a09a1-c3aa-4fa2-95a8-0d2896a1d978\") " pod="openstack/horizon-6867ddbddb-4lg6w" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.606191 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/499a09a1-c3aa-4fa2-95a8-0d2896a1d978-combined-ca-bundle\") pod \"horizon-6867ddbddb-4lg6w\" (UID: \"499a09a1-c3aa-4fa2-95a8-0d2896a1d978\") " pod="openstack/horizon-6867ddbddb-4lg6w" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.606238 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/499a09a1-c3aa-4fa2-95a8-0d2896a1d978-horizon-tls-certs\") pod \"horizon-6867ddbddb-4lg6w\" (UID: \"499a09a1-c3aa-4fa2-95a8-0d2896a1d978\") " pod="openstack/horizon-6867ddbddb-4lg6w" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.606272 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/499a09a1-c3aa-4fa2-95a8-0d2896a1d978-logs\") pod \"horizon-6867ddbddb-4lg6w\" (UID: \"499a09a1-c3aa-4fa2-95a8-0d2896a1d978\") " pod="openstack/horizon-6867ddbddb-4lg6w" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.606296 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqjr7\" (UniqueName: 
\"kubernetes.io/projected/499a09a1-c3aa-4fa2-95a8-0d2896a1d978-kube-api-access-zqjr7\") pod \"horizon-6867ddbddb-4lg6w\" (UID: \"499a09a1-c3aa-4fa2-95a8-0d2896a1d978\") " pod="openstack/horizon-6867ddbddb-4lg6w" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.606333 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/499a09a1-c3aa-4fa2-95a8-0d2896a1d978-horizon-secret-key\") pod \"horizon-6867ddbddb-4lg6w\" (UID: \"499a09a1-c3aa-4fa2-95a8-0d2896a1d978\") " pod="openstack/horizon-6867ddbddb-4lg6w" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.625684 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/499a09a1-c3aa-4fa2-95a8-0d2896a1d978-scripts\") pod \"horizon-6867ddbddb-4lg6w\" (UID: \"499a09a1-c3aa-4fa2-95a8-0d2896a1d978\") " pod="openstack/horizon-6867ddbddb-4lg6w" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.648234 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/499a09a1-c3aa-4fa2-95a8-0d2896a1d978-combined-ca-bundle\") pod \"horizon-6867ddbddb-4lg6w\" (UID: \"499a09a1-c3aa-4fa2-95a8-0d2896a1d978\") " pod="openstack/horizon-6867ddbddb-4lg6w" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.652076 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/499a09a1-c3aa-4fa2-95a8-0d2896a1d978-config-data\") pod \"horizon-6867ddbddb-4lg6w\" (UID: \"499a09a1-c3aa-4fa2-95a8-0d2896a1d978\") " pod="openstack/horizon-6867ddbddb-4lg6w" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.653415 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/499a09a1-c3aa-4fa2-95a8-0d2896a1d978-horizon-tls-certs\") pod \"horizon-6867ddbddb-4lg6w\" (UID: \"499a09a1-c3aa-4fa2-95a8-0d2896a1d978\") " pod="openstack/horizon-6867ddbddb-4lg6w" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.653500 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/499a09a1-c3aa-4fa2-95a8-0d2896a1d978-horizon-secret-key\") pod \"horizon-6867ddbddb-4lg6w\" (UID: \"499a09a1-c3aa-4fa2-95a8-0d2896a1d978\") " pod="openstack/horizon-6867ddbddb-4lg6w" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.653542 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/499a09a1-c3aa-4fa2-95a8-0d2896a1d978-logs\") pod \"horizon-6867ddbddb-4lg6w\" (UID: \"499a09a1-c3aa-4fa2-95a8-0d2896a1d978\") " pod="openstack/horizon-6867ddbddb-4lg6w" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.662919 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqjr7\" (UniqueName: \"kubernetes.io/projected/499a09a1-c3aa-4fa2-95a8-0d2896a1d978-kube-api-access-zqjr7\") pod \"horizon-6867ddbddb-4lg6w\" (UID: \"499a09a1-c3aa-4fa2-95a8-0d2896a1d978\") " pod="openstack/horizon-6867ddbddb-4lg6w" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.711269 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4b4j\" (UniqueName: \"kubernetes.io/projected/ffe3f1f4-8d49-4bf8-a088-e3a930ddc614-kube-api-access-n4b4j\") pod \"horizon-d867d46cb-9mdx2\" (UID: 
\"ffe3f1f4-8d49-4bf8-a088-e3a930ddc614\") " pod="openstack/horizon-d867d46cb-9mdx2" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.711359 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ffe3f1f4-8d49-4bf8-a088-e3a930ddc614-config-data\") pod \"horizon-d867d46cb-9mdx2\" (UID: \"ffe3f1f4-8d49-4bf8-a088-e3a930ddc614\") " pod="openstack/horizon-d867d46cb-9mdx2" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.711388 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffe3f1f4-8d49-4bf8-a088-e3a930ddc614-horizon-tls-certs\") pod \"horizon-d867d46cb-9mdx2\" (UID: \"ffe3f1f4-8d49-4bf8-a088-e3a930ddc614\") " pod="openstack/horizon-d867d46cb-9mdx2" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.711432 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ffe3f1f4-8d49-4bf8-a088-e3a930ddc614-logs\") pod \"horizon-d867d46cb-9mdx2\" (UID: \"ffe3f1f4-8d49-4bf8-a088-e3a930ddc614\") " pod="openstack/horizon-d867d46cb-9mdx2" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.711469 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ffe3f1f4-8d49-4bf8-a088-e3a930ddc614-horizon-secret-key\") pod \"horizon-d867d46cb-9mdx2\" (UID: \"ffe3f1f4-8d49-4bf8-a088-e3a930ddc614\") " pod="openstack/horizon-d867d46cb-9mdx2" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.711533 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffe3f1f4-8d49-4bf8-a088-e3a930ddc614-combined-ca-bundle\") pod \"horizon-d867d46cb-9mdx2\" (UID: \"ffe3f1f4-8d49-4bf8-a088-e3a930ddc614\") " pod="openstack/horizon-d867d46cb-9mdx2" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.711577 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ffe3f1f4-8d49-4bf8-a088-e3a930ddc614-scripts\") pod \"horizon-d867d46cb-9mdx2\" (UID: \"ffe3f1f4-8d49-4bf8-a088-e3a930ddc614\") " pod="openstack/horizon-d867d46cb-9mdx2" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.723832 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6867ddbddb-4lg6w" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.813097 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ffe3f1f4-8d49-4bf8-a088-e3a930ddc614-scripts\") pod \"horizon-d867d46cb-9mdx2\" (UID: \"ffe3f1f4-8d49-4bf8-a088-e3a930ddc614\") " pod="openstack/horizon-d867d46cb-9mdx2" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.813778 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ffe3f1f4-8d49-4bf8-a088-e3a930ddc614-scripts\") pod \"horizon-d867d46cb-9mdx2\" (UID: \"ffe3f1f4-8d49-4bf8-a088-e3a930ddc614\") " pod="openstack/horizon-d867d46cb-9mdx2" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.813212 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4b4j\" (UniqueName: \"kubernetes.io/projected/ffe3f1f4-8d49-4bf8-a088-e3a930ddc614-kube-api-access-n4b4j\") pod \"horizon-d867d46cb-9mdx2\" (UID: \"ffe3f1f4-8d49-4bf8-a088-e3a930ddc614\") " pod="openstack/horizon-d867d46cb-9mdx2" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.813871 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ffe3f1f4-8d49-4bf8-a088-e3a930ddc614-config-data\") pod \"horizon-d867d46cb-9mdx2\" (UID: \"ffe3f1f4-8d49-4bf8-a088-e3a930ddc614\") " pod="openstack/horizon-d867d46cb-9mdx2" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.813935 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffe3f1f4-8d49-4bf8-a088-e3a930ddc614-horizon-tls-certs\") pod \"horizon-d867d46cb-9mdx2\" (UID: \"ffe3f1f4-8d49-4bf8-a088-e3a930ddc614\") " pod="openstack/horizon-d867d46cb-9mdx2" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.814079 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ffe3f1f4-8d49-4bf8-a088-e3a930ddc614-logs\") pod \"horizon-d867d46cb-9mdx2\" (UID: \"ffe3f1f4-8d49-4bf8-a088-e3a930ddc614\") " pod="openstack/horizon-d867d46cb-9mdx2" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.814169 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ffe3f1f4-8d49-4bf8-a088-e3a930ddc614-horizon-secret-key\") pod \"horizon-d867d46cb-9mdx2\" (UID: \"ffe3f1f4-8d49-4bf8-a088-e3a930ddc614\") " pod="openstack/horizon-d867d46cb-9mdx2" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.814255 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffe3f1f4-8d49-4bf8-a088-e3a930ddc614-combined-ca-bundle\") pod \"horizon-d867d46cb-9mdx2\" (UID: \"ffe3f1f4-8d49-4bf8-a088-e3a930ddc614\") " pod="openstack/horizon-d867d46cb-9mdx2" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.814612 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ffe3f1f4-8d49-4bf8-a088-e3a930ddc614-logs\") pod \"horizon-d867d46cb-9mdx2\" (UID: \"ffe3f1f4-8d49-4bf8-a088-e3a930ddc614\") " pod="openstack/horizon-d867d46cb-9mdx2" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.814806 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/ffe3f1f4-8d49-4bf8-a088-e3a930ddc614-config-data\") pod \"horizon-d867d46cb-9mdx2\" (UID: \"ffe3f1f4-8d49-4bf8-a088-e3a930ddc614\") " pod="openstack/horizon-d867d46cb-9mdx2" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.817323 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffe3f1f4-8d49-4bf8-a088-e3a930ddc614-horizon-tls-certs\") pod \"horizon-d867d46cb-9mdx2\" (UID: \"ffe3f1f4-8d49-4bf8-a088-e3a930ddc614\") " pod="openstack/horizon-d867d46cb-9mdx2" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.818570 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffe3f1f4-8d49-4bf8-a088-e3a930ddc614-combined-ca-bundle\") pod \"horizon-d867d46cb-9mdx2\" (UID: \"ffe3f1f4-8d49-4bf8-a088-e3a930ddc614\") " pod="openstack/horizon-d867d46cb-9mdx2" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.834573 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ffe3f1f4-8d49-4bf8-a088-e3a930ddc614-horizon-secret-key\") pod \"horizon-d867d46cb-9mdx2\" (UID: \"ffe3f1f4-8d49-4bf8-a088-e3a930ddc614\") " pod="openstack/horizon-d867d46cb-9mdx2" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.834795 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4b4j\" (UniqueName: \"kubernetes.io/projected/ffe3f1f4-8d49-4bf8-a088-e3a930ddc614-kube-api-access-n4b4j\") pod \"horizon-d867d46cb-9mdx2\" (UID: \"ffe3f1f4-8d49-4bf8-a088-e3a930ddc614\") " pod="openstack/horizon-d867d46cb-9mdx2" Dec 05 20:32:30 crc kubenswrapper[4904]: I1205 20:32:30.917010 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-d867d46cb-9mdx2" Dec 05 20:32:43 crc kubenswrapper[4904]: E1205 20:32:43.032333 4904 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.9:5001/podified-master-centos10/openstack-horizon:watcher_latest" Dec 05 20:32:43 crc kubenswrapper[4904]: E1205 20:32:43.032893 4904 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.9:5001/podified-master-centos10/openstack-horizon:watcher_latest" Dec 05 20:32:43 crc kubenswrapper[4904]: E1205 20:32:43.033036 4904 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:38.102.83.9:5001/podified-master-centos10/openstack-horizon:watcher_latest,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n684h5d5h54ch564h65h599h99h57fh5dh656h67chbchd8h586h54bh96h59dh94h5f6h57h54ch645h7bh5dfh655h75h574h7h9ch56h679h6dq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rxb5n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-79945c76df-zwn2h_openstack(e1c6c734-6fb7-4cb5-a292-aba1c0d2adad): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 20:32:43 crc kubenswrapper[4904]: E1205 20:32:43.036078 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.9:5001/podified-master-centos10/openstack-horizon:watcher_latest\\\"\"]" pod="openstack/horizon-79945c76df-zwn2h" podUID="e1c6c734-6fb7-4cb5-a292-aba1c0d2adad" Dec 05 20:32:55 crc kubenswrapper[4904]: E1205 20:32:55.915217 4904 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="38.102.83.9:5001/podified-master-centos10/openstack-horizon:watcher_latest" Dec 05 20:32:55 crc kubenswrapper[4904]: E1205 20:32:55.915727 4904 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.9:5001/podified-master-centos10/openstack-horizon:watcher_latest" Dec 05 20:32:55 crc kubenswrapper[4904]: E1205 20:32:55.915829 4904 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:38.102.83.9:5001/podified-master-centos10/openstack-horizon:watcher_latest,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n685h9h8ch678h54bh564h5d6h586h59dh74h68h55dh584hcfh6dh679h595h576h6fh64bh674h5fch5c6hcch57ch559hb8h68dh8h588h8bh6cq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bxvs5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-547f6dfd9c-wt8l4_openstack(561a2fa1-4349-438a-9bfd-8c8c6e3e3a73): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 20:32:55 crc kubenswrapper[4904]: E1205 20:32:55.918002 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.9:5001/podified-master-centos10/openstack-horizon:watcher_latest\\\"\"]" pod="openstack/horizon-547f6dfd9c-wt8l4" podUID="561a2fa1-4349-438a-9bfd-8c8c6e3e3a73" Dec 05 20:32:55 crc kubenswrapper[4904]: E1205 20:32:55.920956 4904 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.9:5001/podified-master-centos10/openstack-cinder-api:watcher_latest" Dec 05 20:32:55 crc kubenswrapper[4904]: E1205 20:32:55.920996 4904 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying 
config: context canceled" image="38.102.83.9:5001/podified-master-centos10/openstack-cinder-api:watcher_latest" Dec 05 20:32:55 crc kubenswrapper[4904]: E1205 20:32:55.921124 4904 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:38.102.83.9:5001/podified-master-centos10/openstack-cinder-api:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9rztw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-wh6jz_openstack(f0a41473-f3de-440f-89be-9fddf77f6148): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 20:32:55 crc kubenswrapper[4904]: E1205 20:32:55.923149 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-wh6jz" podUID="f0a41473-f3de-440f-89be-9fddf77f6148" Dec 05 20:32:55 crc kubenswrapper[4904]: I1205 20:32:55.996866 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-79945c76df-zwn2h" Dec 05 20:32:56 crc kubenswrapper[4904]: I1205 20:32:56.004266 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-gjvc9" Dec 05 20:32:56 crc kubenswrapper[4904]: I1205 20:32:56.195524 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd-combined-ca-bundle\") pod \"f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd\" (UID: \"f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd\") " Dec 05 20:32:56 crc kubenswrapper[4904]: I1205 20:32:56.195585 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1c6c734-6fb7-4cb5-a292-aba1c0d2adad-logs\") pod \"e1c6c734-6fb7-4cb5-a292-aba1c0d2adad\" (UID: \"e1c6c734-6fb7-4cb5-a292-aba1c0d2adad\") " Dec 05 20:32:56 crc kubenswrapper[4904]: I1205 20:32:56.195653 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e1c6c734-6fb7-4cb5-a292-aba1c0d2adad-config-data\") pod \"e1c6c734-6fb7-4cb5-a292-aba1c0d2adad\" (UID: \"e1c6c734-6fb7-4cb5-a292-aba1c0d2adad\") " Dec 05 20:32:56 crc kubenswrapper[4904]: I1205 20:32:56.195680 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd-fernet-keys\") pod \"f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd\" (UID: \"f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd\") " Dec 05 20:32:56 crc kubenswrapper[4904]: I1205 20:32:56.195700 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd-scripts\") pod \"f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd\" (UID: \"f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd\") " Dec 05 20:32:56 crc kubenswrapper[4904]: I1205 20:32:56.195740 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5nls\" (UniqueName: \"kubernetes.io/projected/f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd-kube-api-access-s5nls\") pod \"f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd\" (UID: \"f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd\") " Dec 05 20:32:56 crc kubenswrapper[4904]: I1205 20:32:56.195800 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e1c6c734-6fb7-4cb5-a292-aba1c0d2adad-horizon-secret-key\") pod \"e1c6c734-6fb7-4cb5-a292-aba1c0d2adad\" (UID: \"e1c6c734-6fb7-4cb5-a292-aba1c0d2adad\") " Dec 05 20:32:56 crc kubenswrapper[4904]: I1205 20:32:56.195827 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxb5n\" (UniqueName: \"kubernetes.io/projected/e1c6c734-6fb7-4cb5-a292-aba1c0d2adad-kube-api-access-rxb5n\") pod \"e1c6c734-6fb7-4cb5-a292-aba1c0d2adad\" (UID: \"e1c6c734-6fb7-4cb5-a292-aba1c0d2adad\") " Dec 05 20:32:56 crc kubenswrapper[4904]: I1205 20:32:56.195858 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e1c6c734-6fb7-4cb5-a292-aba1c0d2adad-scripts\") pod \"e1c6c734-6fb7-4cb5-a292-aba1c0d2adad\" (UID: \"e1c6c734-6fb7-4cb5-a292-aba1c0d2adad\") " Dec 05 20:32:56 crc kubenswrapper[4904]: I1205 20:32:56.195937 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd-credential-keys\") pod \"f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd\" (UID: 
\"f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd\") " Dec 05 20:32:56 crc kubenswrapper[4904]: I1205 20:32:56.195964 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd-config-data\") pod \"f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd\" (UID: \"f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd\") " Dec 05 20:32:56 crc kubenswrapper[4904]: I1205 20:32:56.196739 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1c6c734-6fb7-4cb5-a292-aba1c0d2adad-logs" (OuterVolumeSpecName: "logs") pod "e1c6c734-6fb7-4cb5-a292-aba1c0d2adad" (UID: "e1c6c734-6fb7-4cb5-a292-aba1c0d2adad"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:32:56 crc kubenswrapper[4904]: I1205 20:32:56.197257 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1c6c734-6fb7-4cb5-a292-aba1c0d2adad-scripts" (OuterVolumeSpecName: "scripts") pod "e1c6c734-6fb7-4cb5-a292-aba1c0d2adad" (UID: "e1c6c734-6fb7-4cb5-a292-aba1c0d2adad"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:32:56 crc kubenswrapper[4904]: I1205 20:32:56.197385 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1c6c734-6fb7-4cb5-a292-aba1c0d2adad-config-data" (OuterVolumeSpecName: "config-data") pod "e1c6c734-6fb7-4cb5-a292-aba1c0d2adad" (UID: "e1c6c734-6fb7-4cb5-a292-aba1c0d2adad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:32:56 crc kubenswrapper[4904]: I1205 20:32:56.201719 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd" (UID: "f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:32:56 crc kubenswrapper[4904]: I1205 20:32:56.202497 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd" (UID: "f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:32:56 crc kubenswrapper[4904]: I1205 20:32:56.202541 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1c6c734-6fb7-4cb5-a292-aba1c0d2adad-kube-api-access-rxb5n" (OuterVolumeSpecName: "kube-api-access-rxb5n") pod "e1c6c734-6fb7-4cb5-a292-aba1c0d2adad" (UID: "e1c6c734-6fb7-4cb5-a292-aba1c0d2adad"). InnerVolumeSpecName "kube-api-access-rxb5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:32:56 crc kubenswrapper[4904]: I1205 20:32:56.202583 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1c6c734-6fb7-4cb5-a292-aba1c0d2adad-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e1c6c734-6fb7-4cb5-a292-aba1c0d2adad" (UID: "e1c6c734-6fb7-4cb5-a292-aba1c0d2adad"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:32:56 crc kubenswrapper[4904]: I1205 20:32:56.205234 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd-scripts" (OuterVolumeSpecName: "scripts") pod "f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd" (UID: "f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:32:56 crc kubenswrapper[4904]: I1205 20:32:56.206683 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd-kube-api-access-s5nls" (OuterVolumeSpecName: "kube-api-access-s5nls") pod "f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd" (UID: "f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd"). InnerVolumeSpecName "kube-api-access-s5nls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:32:56 crc kubenswrapper[4904]: I1205 20:32:56.234967 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd" (UID: "f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:32:56 crc kubenswrapper[4904]: I1205 20:32:56.237774 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd-config-data" (OuterVolumeSpecName: "config-data") pod "f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd" (UID: "f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:32:56 crc kubenswrapper[4904]: I1205 20:32:56.300296 4904 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:56 crc kubenswrapper[4904]: I1205 20:32:56.300331 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:56 crc kubenswrapper[4904]: I1205 20:32:56.300344 4904 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:56 crc kubenswrapper[4904]: I1205 20:32:56.300358 4904 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1c6c734-6fb7-4cb5-a292-aba1c0d2adad-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:56 crc kubenswrapper[4904]: I1205 20:32:56.300369 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e1c6c734-6fb7-4cb5-a292-aba1c0d2adad-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:56 crc kubenswrapper[4904]: I1205 20:32:56.300379 4904 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:56 crc kubenswrapper[4904]: I1205 20:32:56.300392 4904 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:56 crc kubenswrapper[4904]: I1205 20:32:56.300406 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5nls\" (UniqueName: \"kubernetes.io/projected/f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd-kube-api-access-s5nls\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:56 crc kubenswrapper[4904]: I1205 20:32:56.300417 4904 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e1c6c734-6fb7-4cb5-a292-aba1c0d2adad-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:56 crc kubenswrapper[4904]: I1205 20:32:56.300428 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxb5n\" (UniqueName: \"kubernetes.io/projected/e1c6c734-6fb7-4cb5-a292-aba1c0d2adad-kube-api-access-rxb5n\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:56 crc kubenswrapper[4904]: I1205 20:32:56.300439 4904 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e1c6c734-6fb7-4cb5-a292-aba1c0d2adad-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:56 crc kubenswrapper[4904]: E1205 20:32:56.398808 4904 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.9:5001/podified-master-centos10/openstack-ceilometer-central:watcher_latest" Dec 05 20:32:56 crc kubenswrapper[4904]: E1205 20:32:56.399180 4904 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.9:5001/podified-master-centos10/openstack-ceilometer-central:watcher_latest" Dec 05 20:32:56 crc kubenswrapper[4904]: E1205 20:32:56.399330 4904 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:38.102.83.9:5001/podified-master-centos10/openstack-ceilometer-central:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n59bhcch596h5c5h5f9hdhd9hd8h5ffh599hdh5dch547h65dh549hbdh587h5f6h66bhf5h54fh6chbbh5cfh66dh8ch5b9h656h65h5b5h65h5cq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-crhbm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(81a8e33f-1179-4192-9d21-b0f520c41656): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 20:32:56 crc kubenswrapper[4904]: I1205 20:32:56.682344 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-gjvc9" Dec 05 20:32:56 crc kubenswrapper[4904]: I1205 20:32:56.682348 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gjvc9" event={"ID":"f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd","Type":"ContainerDied","Data":"de1968e25f28bc81ffeb7741c6c5354ab79b4f03eca8b6cd7041c613d1e95913"} Dec 05 20:32:56 crc kubenswrapper[4904]: I1205 20:32:56.682405 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de1968e25f28bc81ffeb7741c6c5354ab79b4f03eca8b6cd7041c613d1e95913" Dec 05 20:32:56 crc kubenswrapper[4904]: I1205 20:32:56.684344 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-79945c76df-zwn2h" Dec 05 20:32:56 crc kubenswrapper[4904]: I1205 20:32:56.684389 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79945c76df-zwn2h" event={"ID":"e1c6c734-6fb7-4cb5-a292-aba1c0d2adad","Type":"ContainerDied","Data":"dbcd5c01e575ceecc3c3a14a6175d0c52aef33433084da6ac78ac6f85e33274c"} Dec 05 20:32:56 crc kubenswrapper[4904]: E1205 20:32:56.686277 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.9:5001/podified-master-centos10/openstack-cinder-api:watcher_latest\\\"\"" pod="openstack/cinder-db-sync-wh6jz" podUID="f0a41473-f3de-440f-89be-9fddf77f6148" Dec 05 20:32:56 crc kubenswrapper[4904]: I1205 20:32:56.786656 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-79945c76df-zwn2h"] Dec 05 20:32:56 crc kubenswrapper[4904]: I1205 20:32:56.798120 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-79945c76df-zwn2h"] Dec 05 20:32:57 crc kubenswrapper[4904]: I1205 20:32:57.162873 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-gjvc9"] Dec 05 20:32:57 crc kubenswrapper[4904]: I1205 20:32:57.170517 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-gjvc9"] Dec 05 20:32:57 crc kubenswrapper[4904]: I1205 20:32:57.256447 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-ft95f"] Dec 05 20:32:57 crc kubenswrapper[4904]: E1205 20:32:57.256821 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd" containerName="keystone-bootstrap" Dec 05 20:32:57 crc kubenswrapper[4904]: I1205 20:32:57.256841 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd" containerName="keystone-bootstrap" Dec 05 20:32:57 crc kubenswrapper[4904]: I1205 20:32:57.257084 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd" containerName="keystone-bootstrap" Dec 05 20:32:57 crc kubenswrapper[4904]: I1205 20:32:57.257900 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-ft95f" Dec 05 20:32:57 crc kubenswrapper[4904]: I1205 20:32:57.261563 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 05 20:32:57 crc kubenswrapper[4904]: I1205 20:32:57.261788 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 05 20:32:57 crc kubenswrapper[4904]: I1205 20:32:57.261955 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-ddf7h" Dec 05 20:32:57 crc kubenswrapper[4904]: I1205 20:32:57.262180 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 05 20:32:57 crc kubenswrapper[4904]: I1205 20:32:57.266864 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ft95f"] Dec 05 20:32:57 crc kubenswrapper[4904]: I1205 20:32:57.270097 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 05 20:32:57 crc kubenswrapper[4904]: I1205 20:32:57.429843 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jkl8\" (UniqueName: \"kubernetes.io/projected/e046612d-8016-4321-8a7a-a14b14f68e91-kube-api-access-8jkl8\") pod \"keystone-bootstrap-ft95f\" (UID: \"e046612d-8016-4321-8a7a-a14b14f68e91\") " pod="openstack/keystone-bootstrap-ft95f" Dec 05 20:32:57 crc kubenswrapper[4904]: I1205 20:32:57.429923 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e046612d-8016-4321-8a7a-a14b14f68e91-credential-keys\") pod \"keystone-bootstrap-ft95f\" (UID: \"e046612d-8016-4321-8a7a-a14b14f68e91\") " pod="openstack/keystone-bootstrap-ft95f" Dec 05 20:32:57 crc kubenswrapper[4904]: I1205 20:32:57.430042 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e046612d-8016-4321-8a7a-a14b14f68e91-config-data\") pod \"keystone-bootstrap-ft95f\" (UID: \"e046612d-8016-4321-8a7a-a14b14f68e91\") " pod="openstack/keystone-bootstrap-ft95f" Dec 05 20:32:57 crc kubenswrapper[4904]: I1205 20:32:57.430307 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e046612d-8016-4321-8a7a-a14b14f68e91-fernet-keys\") pod \"keystone-bootstrap-ft95f\" (UID: \"e046612d-8016-4321-8a7a-a14b14f68e91\") " pod="openstack/keystone-bootstrap-ft95f" Dec 05 20:32:57 crc kubenswrapper[4904]: I1205 20:32:57.430441 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e046612d-8016-4321-8a7a-a14b14f68e91-scripts\") pod \"keystone-bootstrap-ft95f\" (UID: \"e046612d-8016-4321-8a7a-a14b14f68e91\") " pod="openstack/keystone-bootstrap-ft95f" Dec 05 20:32:57 crc kubenswrapper[4904]: I1205 20:32:57.430494 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e046612d-8016-4321-8a7a-a14b14f68e91-combined-ca-bundle\") pod \"keystone-bootstrap-ft95f\" (UID: \"e046612d-8016-4321-8a7a-a14b14f68e91\") " pod="openstack/keystone-bootstrap-ft95f" Dec 05 20:32:57 crc kubenswrapper[4904]: I1205 20:32:57.533175 4904 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-8jkl8\" (UniqueName: \"kubernetes.io/projected/e046612d-8016-4321-8a7a-a14b14f68e91-kube-api-access-8jkl8\") pod \"keystone-bootstrap-ft95f\" (UID: \"e046612d-8016-4321-8a7a-a14b14f68e91\") " pod="openstack/keystone-bootstrap-ft95f" Dec 05 20:32:57 crc kubenswrapper[4904]: I1205 20:32:57.533240 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e046612d-8016-4321-8a7a-a14b14f68e91-credential-keys\") pod \"keystone-bootstrap-ft95f\" (UID: \"e046612d-8016-4321-8a7a-a14b14f68e91\") " pod="openstack/keystone-bootstrap-ft95f" Dec 05 20:32:57 crc kubenswrapper[4904]: I1205 20:32:57.533288 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e046612d-8016-4321-8a7a-a14b14f68e91-config-data\") pod \"keystone-bootstrap-ft95f\" (UID: \"e046612d-8016-4321-8a7a-a14b14f68e91\") " pod="openstack/keystone-bootstrap-ft95f" Dec 05 20:32:57 crc kubenswrapper[4904]: I1205 20:32:57.533372 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e046612d-8016-4321-8a7a-a14b14f68e91-fernet-keys\") pod \"keystone-bootstrap-ft95f\" (UID: \"e046612d-8016-4321-8a7a-a14b14f68e91\") " pod="openstack/keystone-bootstrap-ft95f" Dec 05 20:32:57 crc kubenswrapper[4904]: I1205 20:32:57.533434 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e046612d-8016-4321-8a7a-a14b14f68e91-scripts\") pod \"keystone-bootstrap-ft95f\" (UID: \"e046612d-8016-4321-8a7a-a14b14f68e91\") " pod="openstack/keystone-bootstrap-ft95f" Dec 05 20:32:57 crc kubenswrapper[4904]: I1205 20:32:57.533463 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e046612d-8016-4321-8a7a-a14b14f68e91-combined-ca-bundle\") pod \"keystone-bootstrap-ft95f\" (UID: \"e046612d-8016-4321-8a7a-a14b14f68e91\") " pod="openstack/keystone-bootstrap-ft95f" Dec 05 20:32:57 crc kubenswrapper[4904]: I1205 20:32:57.540212 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e046612d-8016-4321-8a7a-a14b14f68e91-fernet-keys\") pod \"keystone-bootstrap-ft95f\" (UID: \"e046612d-8016-4321-8a7a-a14b14f68e91\") " pod="openstack/keystone-bootstrap-ft95f" Dec 05 20:32:57 crc kubenswrapper[4904]: I1205 20:32:57.540657 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e046612d-8016-4321-8a7a-a14b14f68e91-credential-keys\") pod \"keystone-bootstrap-ft95f\" (UID: \"e046612d-8016-4321-8a7a-a14b14f68e91\") " pod="openstack/keystone-bootstrap-ft95f" Dec 05 20:32:57 crc kubenswrapper[4904]: I1205 20:32:57.541026 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e046612d-8016-4321-8a7a-a14b14f68e91-config-data\") pod \"keystone-bootstrap-ft95f\" (UID: \"e046612d-8016-4321-8a7a-a14b14f68e91\") " pod="openstack/keystone-bootstrap-ft95f" Dec 05 20:32:57 crc kubenswrapper[4904]: I1205 20:32:57.547301 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e046612d-8016-4321-8a7a-a14b14f68e91-scripts\") pod \"keystone-bootstrap-ft95f\" (UID: \"e046612d-8016-4321-8a7a-a14b14f68e91\") " 
pod="openstack/keystone-bootstrap-ft95f" Dec 05 20:32:57 crc kubenswrapper[4904]: I1205 20:32:57.555679 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e046612d-8016-4321-8a7a-a14b14f68e91-combined-ca-bundle\") pod \"keystone-bootstrap-ft95f\" (UID: \"e046612d-8016-4321-8a7a-a14b14f68e91\") " pod="openstack/keystone-bootstrap-ft95f" Dec 05 20:32:57 crc kubenswrapper[4904]: I1205 20:32:57.555931 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jkl8\" (UniqueName: \"kubernetes.io/projected/e046612d-8016-4321-8a7a-a14b14f68e91-kube-api-access-8jkl8\") pod \"keystone-bootstrap-ft95f\" (UID: \"e046612d-8016-4321-8a7a-a14b14f68e91\") " pod="openstack/keystone-bootstrap-ft95f" Dec 05 20:32:57 crc kubenswrapper[4904]: I1205 20:32:57.582657 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ft95f" Dec 05 20:32:57 crc kubenswrapper[4904]: I1205 20:32:57.693535 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1c6c734-6fb7-4cb5-a292-aba1c0d2adad" path="/var/lib/kubelet/pods/e1c6c734-6fb7-4cb5-a292-aba1c0d2adad/volumes" Dec 05 20:32:57 crc kubenswrapper[4904]: I1205 20:32:57.694034 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd" path="/var/lib/kubelet/pods/f6d2e5f6-85ec-4e70-aeb5-e2af2e970edd/volumes" Dec 05 20:32:57 crc kubenswrapper[4904]: I1205 20:32:57.697478 4904 generic.go:334] "Generic (PLEG): container finished" podID="76209519-a745-4eab-9d5d-c330ffb29191" containerID="fc7821d1b2305b566cab8bfc36fd3d6f720db61bdbdb4f4b42ac6e8dc25c2159" exitCode=0 Dec 05 20:32:57 crc kubenswrapper[4904]: I1205 20:32:57.697513 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-pjlzn" event={"ID":"76209519-a745-4eab-9d5d-c330ffb29191","Type":"ContainerDied","Data":"fc7821d1b2305b566cab8bfc36fd3d6f720db61bdbdb4f4b42ac6e8dc25c2159"} Dec 05 20:32:59 crc kubenswrapper[4904]: I1205 20:32:59.958265 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:32:59 crc kubenswrapper[4904]: I1205 20:32:59.958910 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:32:59 crc kubenswrapper[4904]: I1205 20:32:59.958963 4904 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" Dec 05 20:32:59 crc kubenswrapper[4904]: I1205 20:32:59.959877 4904 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"528bac05d1388e0a638d4e21540a1e20bd15f6ec7b5238ce57fc2bc9737af1ba"} pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 20:32:59 crc kubenswrapper[4904]: I1205 20:32:59.959972 4904 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" containerID="cri-o://528bac05d1388e0a638d4e21540a1e20bd15f6ec7b5238ce57fc2bc9737af1ba" gracePeriod=600 Dec 05 20:33:00 crc kubenswrapper[4904]: I1205 20:33:00.723711 4904 generic.go:334] "Generic (PLEG): container finished" podID="1cc24b64-e25f-4b55-9123-295388685e7a" containerID="528bac05d1388e0a638d4e21540a1e20bd15f6ec7b5238ce57fc2bc9737af1ba" exitCode=0 Dec 05 20:33:00 crc kubenswrapper[4904]: I1205 20:33:00.723759 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" event={"ID":"1cc24b64-e25f-4b55-9123-295388685e7a","Type":"ContainerDied","Data":"528bac05d1388e0a638d4e21540a1e20bd15f6ec7b5238ce57fc2bc9737af1ba"} Dec 05 20:33:00 crc kubenswrapper[4904]: I1205 20:33:00.723798 4904 scope.go:117] "RemoveContainer" containerID="3a6fee22a5899dc7491b94f9ec1bfdefcfd49d1d911077fc82ba25da5a725a66" Dec 05 20:33:05 crc kubenswrapper[4904]: E1205 20:33:05.832650 4904 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.9:5001/podified-master-centos10/openstack-glance-api:watcher_latest" Dec 05 20:33:05 crc kubenswrapper[4904]: E1205 20:33:05.833079 4904 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.9:5001/podified-master-centos10/openstack-glance-api:watcher_latest" Dec 05 20:33:05 crc kubenswrapper[4904]: E1205 20:33:05.833230 4904 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:38.102.83.9:5001/podified-master-centos10/openstack-glance-api:watcher_latest,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hdvjm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-gtfpr_openstack(5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 20:33:05 crc kubenswrapper[4904]: E1205 20:33:05.835212 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-gtfpr" podUID="5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b" Dec 05 20:33:06 crc kubenswrapper[4904]: E1205 20:33:06.296281 4904 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.9:5001/podified-master-centos10/openstack-barbican-api:watcher_latest" Dec 05 20:33:06 crc kubenswrapper[4904]: E1205 20:33:06.296336 4904 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.9:5001/podified-master-centos10/openstack-barbican-api:watcher_latest" Dec 05 20:33:06 crc kubenswrapper[4904]: E1205 20:33:06.296439 4904 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:38.102.83.9:5001/podified-master-centos10/openstack-barbican-api:watcher_latest,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6vgsp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-k6l4d_openstack(b72d1aa8-3933-4153-89ac-a4ffe0667268): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 20:33:06 crc kubenswrapper[4904]: E1205 20:33:06.297739 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-k6l4d" podUID="b72d1aa8-3933-4153-89ac-a4ffe0667268" Dec 05 20:33:06 crc kubenswrapper[4904]: I1205 20:33:06.515299 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-547f6dfd9c-wt8l4" Dec 05 20:33:06 crc kubenswrapper[4904]: I1205 20:33:06.531777 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-pjlzn" Dec 05 20:33:06 crc kubenswrapper[4904]: I1205 20:33:06.669294 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/561a2fa1-4349-438a-9bfd-8c8c6e3e3a73-config-data\") pod \"561a2fa1-4349-438a-9bfd-8c8c6e3e3a73\" (UID: \"561a2fa1-4349-438a-9bfd-8c8c6e3e3a73\") " Dec 05 20:33:06 crc kubenswrapper[4904]: I1205 20:33:06.669365 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/76209519-a745-4eab-9d5d-c330ffb29191-config\") pod \"76209519-a745-4eab-9d5d-c330ffb29191\" (UID: \"76209519-a745-4eab-9d5d-c330ffb29191\") " Dec 05 20:33:06 crc kubenswrapper[4904]: I1205 20:33:06.669393 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76209519-a745-4eab-9d5d-c330ffb29191-combined-ca-bundle\") pod \"76209519-a745-4eab-9d5d-c330ffb29191\" (UID: \"76209519-a745-4eab-9d5d-c330ffb29191\") " Dec 05 20:33:06 crc kubenswrapper[4904]: I1205 20:33:06.669416 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxvs5\" (UniqueName: \"kubernetes.io/projected/561a2fa1-4349-438a-9bfd-8c8c6e3e3a73-kube-api-access-bxvs5\") pod \"561a2fa1-4349-438a-9bfd-8c8c6e3e3a73\" (UID: \"561a2fa1-4349-438a-9bfd-8c8c6e3e3a73\") " Dec 05 20:33:06 crc kubenswrapper[4904]: I1205 20:33:06.669434 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/561a2fa1-4349-438a-9bfd-8c8c6e3e3a73-logs\") pod \"561a2fa1-4349-438a-9bfd-8c8c6e3e3a73\" (UID: \"561a2fa1-4349-438a-9bfd-8c8c6e3e3a73\") " Dec 05 20:33:06 crc kubenswrapper[4904]: I1205 20:33:06.669495 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/561a2fa1-4349-438a-9bfd-8c8c6e3e3a73-scripts\") pod \"561a2fa1-4349-438a-9bfd-8c8c6e3e3a73\" (UID: \"561a2fa1-4349-438a-9bfd-8c8c6e3e3a73\") " Dec 05 20:33:06 crc kubenswrapper[4904]: I1205 20:33:06.669528 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/561a2fa1-4349-438a-9bfd-8c8c6e3e3a73-horizon-secret-key\") pod \"561a2fa1-4349-438a-9bfd-8c8c6e3e3a73\" (UID: \"561a2fa1-4349-438a-9bfd-8c8c6e3e3a73\") " Dec 05 20:33:06 crc kubenswrapper[4904]: I1205 20:33:06.669579 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zb2fs\" (UniqueName: \"kubernetes.io/projected/76209519-a745-4eab-9d5d-c330ffb29191-kube-api-access-zb2fs\") pod \"76209519-a745-4eab-9d5d-c330ffb29191\" (UID: \"76209519-a745-4eab-9d5d-c330ffb29191\") " Dec 05 20:33:06 crc kubenswrapper[4904]: I1205 20:33:06.670020 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/561a2fa1-4349-438a-9bfd-8c8c6e3e3a73-config-data" (OuterVolumeSpecName: "config-data") pod "561a2fa1-4349-438a-9bfd-8c8c6e3e3a73" (UID: "561a2fa1-4349-438a-9bfd-8c8c6e3e3a73"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:33:06 crc kubenswrapper[4904]: I1205 20:33:06.670211 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/561a2fa1-4349-438a-9bfd-8c8c6e3e3a73-logs" (OuterVolumeSpecName: "logs") pod "561a2fa1-4349-438a-9bfd-8c8c6e3e3a73" (UID: "561a2fa1-4349-438a-9bfd-8c8c6e3e3a73"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:33:06 crc kubenswrapper[4904]: I1205 20:33:06.670344 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/561a2fa1-4349-438a-9bfd-8c8c6e3e3a73-scripts" (OuterVolumeSpecName: "scripts") pod "561a2fa1-4349-438a-9bfd-8c8c6e3e3a73" (UID: "561a2fa1-4349-438a-9bfd-8c8c6e3e3a73"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:33:06 crc kubenswrapper[4904]: I1205 20:33:06.674741 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76209519-a745-4eab-9d5d-c330ffb29191-kube-api-access-zb2fs" (OuterVolumeSpecName: "kube-api-access-zb2fs") pod "76209519-a745-4eab-9d5d-c330ffb29191" (UID: "76209519-a745-4eab-9d5d-c330ffb29191"). InnerVolumeSpecName "kube-api-access-zb2fs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:33:06 crc kubenswrapper[4904]: I1205 20:33:06.676932 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/561a2fa1-4349-438a-9bfd-8c8c6e3e3a73-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "561a2fa1-4349-438a-9bfd-8c8c6e3e3a73" (UID: "561a2fa1-4349-438a-9bfd-8c8c6e3e3a73"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:06 crc kubenswrapper[4904]: I1205 20:33:06.678200 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/561a2fa1-4349-438a-9bfd-8c8c6e3e3a73-kube-api-access-bxvs5" (OuterVolumeSpecName: "kube-api-access-bxvs5") pod "561a2fa1-4349-438a-9bfd-8c8c6e3e3a73" (UID: "561a2fa1-4349-438a-9bfd-8c8c6e3e3a73"). InnerVolumeSpecName "kube-api-access-bxvs5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:33:06 crc kubenswrapper[4904]: I1205 20:33:06.711307 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76209519-a745-4eab-9d5d-c330ffb29191-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "76209519-a745-4eab-9d5d-c330ffb29191" (UID: "76209519-a745-4eab-9d5d-c330ffb29191"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:06 crc kubenswrapper[4904]: I1205 20:33:06.714051 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76209519-a745-4eab-9d5d-c330ffb29191-config" (OuterVolumeSpecName: "config") pod "76209519-a745-4eab-9d5d-c330ffb29191" (UID: "76209519-a745-4eab-9d5d-c330ffb29191"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:06 crc kubenswrapper[4904]: I1205 20:33:06.770786 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxvs5\" (UniqueName: \"kubernetes.io/projected/561a2fa1-4349-438a-9bfd-8c8c6e3e3a73-kube-api-access-bxvs5\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:06 crc kubenswrapper[4904]: I1205 20:33:06.770819 4904 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/561a2fa1-4349-438a-9bfd-8c8c6e3e3a73-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:06 crc kubenswrapper[4904]: I1205 20:33:06.770832 4904 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/561a2fa1-4349-438a-9bfd-8c8c6e3e3a73-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:06 crc kubenswrapper[4904]: I1205 20:33:06.770984 4904 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/561a2fa1-4349-438a-9bfd-8c8c6e3e3a73-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:06 crc kubenswrapper[4904]: I1205 20:33:06.771355 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zb2fs\" (UniqueName: \"kubernetes.io/projected/76209519-a745-4eab-9d5d-c330ffb29191-kube-api-access-zb2fs\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:06 crc kubenswrapper[4904]: I1205 20:33:06.771370 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/561a2fa1-4349-438a-9bfd-8c8c6e3e3a73-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:06 crc kubenswrapper[4904]: I1205 20:33:06.771381 4904 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/76209519-a745-4eab-9d5d-c330ffb29191-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:06 crc kubenswrapper[4904]: I1205 20:33:06.771413 4904 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76209519-a745-4eab-9d5d-c330ffb29191-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:06 crc kubenswrapper[4904]: I1205 20:33:06.777526 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-547f6dfd9c-wt8l4" event={"ID":"561a2fa1-4349-438a-9bfd-8c8c6e3e3a73","Type":"ContainerDied","Data":"9486b448461577e0070f33bcb3c9e8c9507f8a8aa37aefdf2671dee5427ae881"} Dec 05 20:33:06 crc kubenswrapper[4904]: I1205 20:33:06.777564 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-547f6dfd9c-wt8l4" Dec 05 20:33:06 crc kubenswrapper[4904]: I1205 20:33:06.791388 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-pjlzn" event={"ID":"76209519-a745-4eab-9d5d-c330ffb29191","Type":"ContainerDied","Data":"5ef354e23bb8e738c9786ec28118bb8e970cd525422c2c94137866c097dfadfa"} Dec 05 20:33:06 crc kubenswrapper[4904]: I1205 20:33:06.791437 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ef354e23bb8e738c9786ec28118bb8e970cd525422c2c94137866c097dfadfa" Dec 05 20:33:06 crc kubenswrapper[4904]: I1205 20:33:06.791492 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-pjlzn" Dec 05 20:33:06 crc kubenswrapper[4904]: I1205 20:33:06.800687 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="fa95b130-3627-4fa6-9bf8-1760d3b24843" containerName="watcher-api-log" containerID="cri-o://30e14b92445197a4e38e382b251c4ecbc6bcefaca264e118c6c56ba34e8733a5" gracePeriod=30 Dec 05 20:33:06 crc kubenswrapper[4904]: I1205 20:33:06.801495 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="fa95b130-3627-4fa6-9bf8-1760d3b24843" containerName="watcher-api" containerID="cri-o://cf07c1ff4f13d09142b75fdfc765208a644a4a1c894791c04780aa390503f481" gracePeriod=30 Dec 05 20:33:06 crc kubenswrapper[4904]: I1205 20:33:06.801873 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"fa95b130-3627-4fa6-9bf8-1760d3b24843","Type":"ContainerStarted","Data":"cf07c1ff4f13d09142b75fdfc765208a644a4a1c894791c04780aa390503f481"} Dec 05 20:33:06 crc kubenswrapper[4904]: I1205 20:33:06.801909 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Dec 05 20:33:06 crc kubenswrapper[4904]: I1205 20:33:06.812171 4904 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="fa95b130-3627-4fa6-9bf8-1760d3b24843" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.154:9322/\": dial tcp 10.217.0.154:9322: connect: connection refused" Dec 05 20:33:06 crc kubenswrapper[4904]: I1205 20:33:06.837896 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=45.837875337 podStartE2EDuration="45.837875337s" podCreationTimestamp="2025-12-05 20:32:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:33:06.827706489 +0000 UTC m=+1285.638922598" watchObservedRunningTime="2025-12-05 20:33:06.837875337 +0000 UTC m=+1285.649091446" Dec 05 20:33:06 crc kubenswrapper[4904]: E1205 20:33:06.866539 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.9:5001/podified-master-centos10/openstack-barbican-api:watcher_latest\\\"\"" pod="openstack/barbican-db-sync-k6l4d" podUID="b72d1aa8-3933-4153-89ac-a4ffe0667268" Dec 05 20:33:06 crc kubenswrapper[4904]: E1205 20:33:06.866539 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.9:5001/podified-master-centos10/openstack-glance-api:watcher_latest\\\"\"" pod="openstack/glance-db-sync-gtfpr" podUID="5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b" Dec 05 20:33:06 crc kubenswrapper[4904]: I1205 20:33:06.944013 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-547f6dfd9c-wt8l4"] Dec 05 20:33:06 crc kubenswrapper[4904]: I1205 20:33:06.949406 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-547f6dfd9c-wt8l4"] Dec 05 20:33:07 crc kubenswrapper[4904]: I1205 20:33:07.399014 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-cf6d474f7-dfz4j"] Dec 05 20:33:07 crc kubenswrapper[4904]: I1205 20:33:07.496086 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/horizon-d867d46cb-9mdx2"] Dec 05 20:33:07 crc kubenswrapper[4904]: I1205 20:33:07.555167 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Dec 05 20:33:07 crc kubenswrapper[4904]: I1205 20:33:07.621593 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6867ddbddb-4lg6w"] Dec 05 20:33:07 crc kubenswrapper[4904]: I1205 20:33:07.708240 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="561a2fa1-4349-438a-9bfd-8c8c6e3e3a73" path="/var/lib/kubelet/pods/561a2fa1-4349-438a-9bfd-8c8c6e3e3a73/volumes" Dec 05 20:33:07 crc kubenswrapper[4904]: I1205 20:33:07.708656 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ft95f"] Dec 05 20:33:07 crc kubenswrapper[4904]: I1205 20:33:07.841243 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"8698908a-a9e4-46c2-9855-2a1db92b1d75","Type":"ContainerStarted","Data":"accaae3332576ca7b7c4415c9ac96a9c946e1ca857e3b618ee95cd01e75033ca"} Dec 05 20:33:07 crc kubenswrapper[4904]: I1205 20:33:07.863850 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b8b46f59-njvnc" event={"ID":"659969bf-c700-4cb7-b8c0-1d30873b3d0a","Type":"ContainerStarted","Data":"0b16a59b78c0cb3995ab31d9f06c3ee2af2030d079dd00e41a7d7a0b863b72a9"} Dec 05 20:33:07 crc kubenswrapper[4904]: I1205 20:33:07.864705 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8b8b46f59-njvnc" Dec 05 20:33:07 crc kubenswrapper[4904]: I1205 20:33:07.867748 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" event={"ID":"1cc24b64-e25f-4b55-9123-295388685e7a","Type":"ContainerStarted","Data":"3a95f83a14a35574f3a4cccf8ae06571e374650e21ab79c52d42113e0e6de1c7"} Dec 05 20:33:07 crc kubenswrapper[4904]: I1205 20:33:07.893265 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b8b46f59-njvnc"] Dec 05 20:33:07 crc kubenswrapper[4904]: I1205 20:33:07.894266 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6867ddbddb-4lg6w" event={"ID":"499a09a1-c3aa-4fa2-95a8-0d2896a1d978","Type":"ContainerStarted","Data":"fda7bee0b7586a880bbf4c0b492e3a65af9c7f0568ea729b26f3fa2c9af5d5f4"} Dec 05 20:33:07 crc kubenswrapper[4904]: I1205 20:33:07.894832 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=14.035423404 podStartE2EDuration="46.894812873s" podCreationTimestamp="2025-12-05 20:32:21 +0000 UTC" firstStartedPulling="2025-12-05 20:32:23.005274942 +0000 UTC m=+1241.816491051" lastFinishedPulling="2025-12-05 20:32:55.864664411 +0000 UTC m=+1274.675880520" observedRunningTime="2025-12-05 20:33:07.890087928 +0000 UTC m=+1286.701304037" watchObservedRunningTime="2025-12-05 20:33:07.894812873 +0000 UTC m=+1286.706028982" Dec 05 20:33:07 crc kubenswrapper[4904]: I1205 20:33:07.895878 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81a8e33f-1179-4192-9d21-b0f520c41656","Type":"ContainerStarted","Data":"5eb179fbfe30f07744b88d2fb3bdfd23b7f5a02ed5ac324ec2c9cfb9333fbaa8"} Dec 05 20:33:07 crc kubenswrapper[4904]: I1205 20:33:07.910389 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ft95f" 
event={"ID":"e046612d-8016-4321-8a7a-a14b14f68e91","Type":"ContainerStarted","Data":"c41392b6b77c4337f8db83a7299d80bb99240dd817b6972b736f6bde36411047"} Dec 05 20:33:07 crc kubenswrapper[4904]: I1205 20:33:07.970741 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-cf6d474f7-dfz4j" event={"ID":"70255ecd-6213-48cd-bd22-e6e14bbc497d","Type":"ContainerStarted","Data":"f22feab61a5298c4ea9d05d764626891bdca6c9f729ed83ae55744fadc6122dc"} Dec 05 20:33:07 crc kubenswrapper[4904]: I1205 20:33:07.982407 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5c784f7484-wz74b"] Dec 05 20:33:07 crc kubenswrapper[4904]: E1205 20:33:07.983105 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76209519-a745-4eab-9d5d-c330ffb29191" containerName="neutron-db-sync" Dec 05 20:33:07 crc kubenswrapper[4904]: I1205 20:33:07.983119 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="76209519-a745-4eab-9d5d-c330ffb29191" containerName="neutron-db-sync" Dec 05 20:33:07 crc kubenswrapper[4904]: I1205 20:33:07.983503 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="76209519-a745-4eab-9d5d-c330ffb29191" containerName="neutron-db-sync" Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:07.997280 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5c784f7484-wz74b"] Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:07.997395 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5c784f7484-wz74b" Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.000516 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.000728 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.000862 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-bpllb" Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.000891 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.029431 4904 generic.go:334] "Generic (PLEG): container finished" podID="fa95b130-3627-4fa6-9bf8-1760d3b24843" containerID="30e14b92445197a4e38e382b251c4ecbc6bcefaca264e118c6c56ba34e8733a5" exitCode=143 Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.029528 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"fa95b130-3627-4fa6-9bf8-1760d3b24843","Type":"ContainerDied","Data":"30e14b92445197a4e38e382b251c4ecbc6bcefaca264e118c6c56ba34e8733a5"} Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.035959 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7569b6c979-qznqm"] Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.038606 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7569b6c979-qznqm" Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.040463 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"8acb38c5-f0b5-46e9-8cb7-085d0b12b16d","Type":"ContainerStarted","Data":"34108de7470cb550f2341e050a9134e537bba3cf0a18d7e71edad2f8834b50bc"} Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.112619 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-4zhc6" event={"ID":"2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8","Type":"ContainerStarted","Data":"588bb01d56b63e882f961e9f854e18ef40bec7cc7f939dfb4704961b67fcc6f5"} Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.130116 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7569b6c979-qznqm"] Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.170874 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8b8b46f59-njvnc" podStartSLOduration=46.170852989 podStartE2EDuration="46.170852989s" podCreationTimestamp="2025-12-05 20:32:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:33:07.996967372 +0000 UTC m=+1286.808183481" watchObservedRunningTime="2025-12-05 20:33:08.170852989 +0000 UTC m=+1286.982069098" Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.171569 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/310cb505-37fe-420a-8012-3840d7ede2f0-ovsdbserver-sb\") pod \"dnsmasq-dns-7569b6c979-qznqm\" (UID: \"310cb505-37fe-420a-8012-3840d7ede2f0\") " pod="openstack/dnsmasq-dns-7569b6c979-qznqm" Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.171636 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eabc2178-31e6-49e7-a78c-58dd769c751c-combined-ca-bundle\") pod \"neutron-5c784f7484-wz74b\" (UID: \"eabc2178-31e6-49e7-a78c-58dd769c751c\") " pod="openstack/neutron-5c784f7484-wz74b" Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.171698 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eabc2178-31e6-49e7-a78c-58dd769c751c-config\") pod \"neutron-5c784f7484-wz74b\" (UID: \"eabc2178-31e6-49e7-a78c-58dd769c751c\") " pod="openstack/neutron-5c784f7484-wz74b" Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.171765 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/310cb505-37fe-420a-8012-3840d7ede2f0-config\") pod \"dnsmasq-dns-7569b6c979-qznqm\" (UID: \"310cb505-37fe-420a-8012-3840d7ede2f0\") " pod="openstack/dnsmasq-dns-7569b6c979-qznqm" Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.171799 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/eabc2178-31e6-49e7-a78c-58dd769c751c-httpd-config\") pod \"neutron-5c784f7484-wz74b\" (UID: \"eabc2178-31e6-49e7-a78c-58dd769c751c\") " pod="openstack/neutron-5c784f7484-wz74b" Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.171853 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-xthv5\" (UniqueName: \"kubernetes.io/projected/310cb505-37fe-420a-8012-3840d7ede2f0-kube-api-access-xthv5\") pod \"dnsmasq-dns-7569b6c979-qznqm\" (UID: \"310cb505-37fe-420a-8012-3840d7ede2f0\") " pod="openstack/dnsmasq-dns-7569b6c979-qznqm" Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.171898 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/310cb505-37fe-420a-8012-3840d7ede2f0-dns-swift-storage-0\") pod \"dnsmasq-dns-7569b6c979-qznqm\" (UID: \"310cb505-37fe-420a-8012-3840d7ede2f0\") " pod="openstack/dnsmasq-dns-7569b6c979-qznqm" Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.171917 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/310cb505-37fe-420a-8012-3840d7ede2f0-dns-svc\") pod \"dnsmasq-dns-7569b6c979-qznqm\" (UID: \"310cb505-37fe-420a-8012-3840d7ede2f0\") " pod="openstack/dnsmasq-dns-7569b6c979-qznqm" Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.171991 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5f8jf\" (UniqueName: \"kubernetes.io/projected/eabc2178-31e6-49e7-a78c-58dd769c751c-kube-api-access-5f8jf\") pod \"neutron-5c784f7484-wz74b\" (UID: \"eabc2178-31e6-49e7-a78c-58dd769c751c\") " pod="openstack/neutron-5c784f7484-wz74b" Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.172023 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/310cb505-37fe-420a-8012-3840d7ede2f0-ovsdbserver-nb\") pod \"dnsmasq-dns-7569b6c979-qznqm\" (UID: \"310cb505-37fe-420a-8012-3840d7ede2f0\") " pod="openstack/dnsmasq-dns-7569b6c979-qznqm" Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.172050 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eabc2178-31e6-49e7-a78c-58dd769c751c-ovndb-tls-certs\") pod \"neutron-5c784f7484-wz74b\" (UID: \"eabc2178-31e6-49e7-a78c-58dd769c751c\") " pod="openstack/neutron-5c784f7484-wz74b" Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.207716 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-d867d46cb-9mdx2" event={"ID":"ffe3f1f4-8d49-4bf8-a088-e3a930ddc614","Type":"ContainerStarted","Data":"3b4d4eabdf3ae4ff5884e6c92cd1c256bb52fecb6ff7c9f1e999a46be594c71b"} Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.277177 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eabc2178-31e6-49e7-a78c-58dd769c751c-config\") pod \"neutron-5c784f7484-wz74b\" (UID: \"eabc2178-31e6-49e7-a78c-58dd769c751c\") " pod="openstack/neutron-5c784f7484-wz74b" Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.277258 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/310cb505-37fe-420a-8012-3840d7ede2f0-config\") pod \"dnsmasq-dns-7569b6c979-qznqm\" (UID: \"310cb505-37fe-420a-8012-3840d7ede2f0\") " pod="openstack/dnsmasq-dns-7569b6c979-qznqm" Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.277285 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/eabc2178-31e6-49e7-a78c-58dd769c751c-httpd-config\") pod \"neutron-5c784f7484-wz74b\" (UID: \"eabc2178-31e6-49e7-a78c-58dd769c751c\") " pod="openstack/neutron-5c784f7484-wz74b" Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.277336 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xthv5\" (UniqueName: \"kubernetes.io/projected/310cb505-37fe-420a-8012-3840d7ede2f0-kube-api-access-xthv5\") pod \"dnsmasq-dns-7569b6c979-qznqm\" (UID: \"310cb505-37fe-420a-8012-3840d7ede2f0\") " pod="openstack/dnsmasq-dns-7569b6c979-qznqm" Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.277392 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/310cb505-37fe-420a-8012-3840d7ede2f0-dns-swift-storage-0\") pod \"dnsmasq-dns-7569b6c979-qznqm\" (UID: \"310cb505-37fe-420a-8012-3840d7ede2f0\") " pod="openstack/dnsmasq-dns-7569b6c979-qznqm" Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.277407 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/310cb505-37fe-420a-8012-3840d7ede2f0-dns-svc\") pod \"dnsmasq-dns-7569b6c979-qznqm\" (UID: \"310cb505-37fe-420a-8012-3840d7ede2f0\") " pod="openstack/dnsmasq-dns-7569b6c979-qznqm" Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.277472 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5f8jf\" (UniqueName: \"kubernetes.io/projected/eabc2178-31e6-49e7-a78c-58dd769c751c-kube-api-access-5f8jf\") pod \"neutron-5c784f7484-wz74b\" (UID: \"eabc2178-31e6-49e7-a78c-58dd769c751c\") " pod="openstack/neutron-5c784f7484-wz74b" Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.277494 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/310cb505-37fe-420a-8012-3840d7ede2f0-ovsdbserver-nb\") pod \"dnsmasq-dns-7569b6c979-qznqm\" (UID: \"310cb505-37fe-420a-8012-3840d7ede2f0\") " pod="openstack/dnsmasq-dns-7569b6c979-qznqm" Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.277515 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eabc2178-31e6-49e7-a78c-58dd769c751c-ovndb-tls-certs\") pod \"neutron-5c784f7484-wz74b\" (UID: \"eabc2178-31e6-49e7-a78c-58dd769c751c\") " pod="openstack/neutron-5c784f7484-wz74b" Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.277559 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/310cb505-37fe-420a-8012-3840d7ede2f0-ovsdbserver-sb\") pod \"dnsmasq-dns-7569b6c979-qznqm\" (UID: \"310cb505-37fe-420a-8012-3840d7ede2f0\") " pod="openstack/dnsmasq-dns-7569b6c979-qznqm" Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.277588 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eabc2178-31e6-49e7-a78c-58dd769c751c-combined-ca-bundle\") pod \"neutron-5c784f7484-wz74b\" (UID: \"eabc2178-31e6-49e7-a78c-58dd769c751c\") " pod="openstack/neutron-5c784f7484-wz74b" Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.281930 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/310cb505-37fe-420a-8012-3840d7ede2f0-dns-swift-storage-0\") pod \"dnsmasq-dns-7569b6c979-qznqm\" (UID: \"310cb505-37fe-420a-8012-3840d7ede2f0\") " pod="openstack/dnsmasq-dns-7569b6c979-qznqm" Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.282949 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/310cb505-37fe-420a-8012-3840d7ede2f0-dns-svc\") pod \"dnsmasq-dns-7569b6c979-qznqm\" (UID: \"310cb505-37fe-420a-8012-3840d7ede2f0\") " pod="openstack/dnsmasq-dns-7569b6c979-qznqm" Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.303804 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/310cb505-37fe-420a-8012-3840d7ede2f0-ovsdbserver-nb\") pod \"dnsmasq-dns-7569b6c979-qznqm\" (UID: \"310cb505-37fe-420a-8012-3840d7ede2f0\") " pod="openstack/dnsmasq-dns-7569b6c979-qznqm" Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.304389 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/310cb505-37fe-420a-8012-3840d7ede2f0-ovsdbserver-sb\") pod \"dnsmasq-dns-7569b6c979-qznqm\" (UID: \"310cb505-37fe-420a-8012-3840d7ede2f0\") " pod="openstack/dnsmasq-dns-7569b6c979-qznqm" Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.307552 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=14.395414539 podStartE2EDuration="47.307536338s" podCreationTimestamp="2025-12-05 20:32:21 +0000 UTC" firstStartedPulling="2025-12-05 20:32:22.976133654 +0000 UTC m=+1241.787349763" lastFinishedPulling="2025-12-05 20:32:55.888255463 +0000 UTC m=+1274.699471562" observedRunningTime="2025-12-05 20:33:08.260674704 +0000 UTC m=+1287.071890813" watchObservedRunningTime="2025-12-05 20:33:08.307536338 +0000 UTC m=+1287.118752437" Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.323526 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-4zhc6" podStartSLOduration=4.749502167 podStartE2EDuration="47.323508709s" podCreationTimestamp="2025-12-05 20:32:21 +0000 UTC" firstStartedPulling="2025-12-05 20:32:23.757600107 +0000 UTC m=+1242.568816216" lastFinishedPulling="2025-12-05 20:33:06.331606649 +0000 UTC m=+1285.142822758" observedRunningTime="2025-12-05 20:33:08.300614275 +0000 UTC m=+1287.111830384" watchObservedRunningTime="2025-12-05 20:33:08.323508709 +0000 UTC m=+1287.134724808" Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.325588 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/310cb505-37fe-420a-8012-3840d7ede2f0-config\") pod \"dnsmasq-dns-7569b6c979-qznqm\" (UID: \"310cb505-37fe-420a-8012-3840d7ede2f0\") " pod="openstack/dnsmasq-dns-7569b6c979-qznqm" Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.346239 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5f8jf\" (UniqueName: \"kubernetes.io/projected/eabc2178-31e6-49e7-a78c-58dd769c751c-kube-api-access-5f8jf\") pod \"neutron-5c784f7484-wz74b\" (UID: \"eabc2178-31e6-49e7-a78c-58dd769c751c\") " pod="openstack/neutron-5c784f7484-wz74b" Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.349255 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/eabc2178-31e6-49e7-a78c-58dd769c751c-httpd-config\") pod \"neutron-5c784f7484-wz74b\" (UID: \"eabc2178-31e6-49e7-a78c-58dd769c751c\") " pod="openstack/neutron-5c784f7484-wz74b" Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.350094 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eabc2178-31e6-49e7-a78c-58dd769c751c-ovndb-tls-certs\") pod \"neutron-5c784f7484-wz74b\" (UID: \"eabc2178-31e6-49e7-a78c-58dd769c751c\") " pod="openstack/neutron-5c784f7484-wz74b" Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.352938 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eabc2178-31e6-49e7-a78c-58dd769c751c-combined-ca-bundle\") pod \"neutron-5c784f7484-wz74b\" (UID: \"eabc2178-31e6-49e7-a78c-58dd769c751c\") " pod="openstack/neutron-5c784f7484-wz74b" Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.352986 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/eabc2178-31e6-49e7-a78c-58dd769c751c-config\") pod \"neutron-5c784f7484-wz74b\" (UID: \"eabc2178-31e6-49e7-a78c-58dd769c751c\") " pod="openstack/neutron-5c784f7484-wz74b" Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.354037 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xthv5\" (UniqueName: \"kubernetes.io/projected/310cb505-37fe-420a-8012-3840d7ede2f0-kube-api-access-xthv5\") pod \"dnsmasq-dns-7569b6c979-qznqm\" (UID: \"310cb505-37fe-420a-8012-3840d7ede2f0\") " pod="openstack/dnsmasq-dns-7569b6c979-qznqm" Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.538277 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5c784f7484-wz74b" Dec 05 20:33:08 crc kubenswrapper[4904]: I1205 20:33:08.584634 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7569b6c979-qznqm" Dec 05 20:33:09 crc kubenswrapper[4904]: I1205 20:33:09.254110 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-cf6d474f7-dfz4j" event={"ID":"70255ecd-6213-48cd-bd22-e6e14bbc497d","Type":"ContainerStarted","Data":"416a767194406302c2c55b712023912e302ab5e79d5dfb6ebf25bba6819f4308"} Dec 05 20:33:09 crc kubenswrapper[4904]: I1205 20:33:09.254533 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-cf6d474f7-dfz4j" event={"ID":"70255ecd-6213-48cd-bd22-e6e14bbc497d","Type":"ContainerStarted","Data":"99d86d8381f7ca45382704af8af4b6ac85f3a92a9057ef6d1984ede8b3ab507f"} Dec 05 20:33:09 crc kubenswrapper[4904]: I1205 20:33:09.254675 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-cf6d474f7-dfz4j" podUID="70255ecd-6213-48cd-bd22-e6e14bbc497d" containerName="horizon-log" containerID="cri-o://99d86d8381f7ca45382704af8af4b6ac85f3a92a9057ef6d1984ede8b3ab507f" gracePeriod=30 Dec 05 20:33:09 crc kubenswrapper[4904]: I1205 20:33:09.255025 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-cf6d474f7-dfz4j" podUID="70255ecd-6213-48cd-bd22-e6e14bbc497d" containerName="horizon" containerID="cri-o://416a767194406302c2c55b712023912e302ab5e79d5dfb6ebf25bba6819f4308" gracePeriod=30 Dec 05 20:33:09 crc kubenswrapper[4904]: I1205 20:33:09.261255 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-d867d46cb-9mdx2" event={"ID":"ffe3f1f4-8d49-4bf8-a088-e3a930ddc614","Type":"ContainerStarted","Data":"1434762937f48cccc85ac2bc0f569307583a016219d42f8c529d614543df2061"} Dec 05 20:33:09 crc kubenswrapper[4904]: I1205 20:33:09.261297 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-d867d46cb-9mdx2" event={"ID":"ffe3f1f4-8d49-4bf8-a088-e3a930ddc614","Type":"ContainerStarted","Data":"6bda65469371483c3567754c20d26b8e511064ce9d8f8cf3500f059f634a1dbc"} Dec 05 20:33:09 crc kubenswrapper[4904]: I1205 20:33:09.263717 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6867ddbddb-4lg6w" event={"ID":"499a09a1-c3aa-4fa2-95a8-0d2896a1d978","Type":"ContainerStarted","Data":"0916c660d49c3a8297e43c89f40cbc411d8c61a91596fba1bdc6342cd2bb8323"} Dec 05 20:33:09 crc kubenswrapper[4904]: I1205 20:33:09.266259 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ft95f" event={"ID":"e046612d-8016-4321-8a7a-a14b14f68e91","Type":"ContainerStarted","Data":"dc978a68af7766b44df471d6f7e4d8f475abe5d11585e4c2d2e4b6ab1839a094"} Dec 05 20:33:09 crc kubenswrapper[4904]: I1205 20:33:09.279636 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-cf6d474f7-dfz4j" podStartSLOduration=46.198312979 podStartE2EDuration="46.279610529s" podCreationTimestamp="2025-12-05 20:32:23 +0000 UTC" firstStartedPulling="2025-12-05 20:33:07.388872543 +0000 UTC m=+1286.200088652" lastFinishedPulling="2025-12-05 20:33:07.470170093 +0000 UTC m=+1286.281386202" observedRunningTime="2025-12-05 20:33:09.275727327 +0000 UTC m=+1288.086943436" watchObservedRunningTime="2025-12-05 20:33:09.279610529 +0000 UTC m=+1288.090826638" Dec 05 20:33:09 crc kubenswrapper[4904]: I1205 20:33:09.323985 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-ft95f" podStartSLOduration=12.323959917 podStartE2EDuration="12.323959917s" podCreationTimestamp="2025-12-05 20:32:57 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:33:09.293078034 +0000 UTC m=+1288.104294143" watchObservedRunningTime="2025-12-05 20:33:09.323959917 +0000 UTC m=+1288.135176026" Dec 05 20:33:09 crc kubenswrapper[4904]: I1205 20:33:09.339046 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-d867d46cb-9mdx2" podStartSLOduration=39.338926431 podStartE2EDuration="39.338926431s" podCreationTimestamp="2025-12-05 20:32:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:33:09.322442297 +0000 UTC m=+1288.133658416" watchObservedRunningTime="2025-12-05 20:33:09.338926431 +0000 UTC m=+1288.150142540" Dec 05 20:33:09 crc kubenswrapper[4904]: I1205 20:33:09.650325 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7569b6c979-qznqm"] Dec 05 20:33:09 crc kubenswrapper[4904]: W1205 20:33:09.660412 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod310cb505_37fe_420a_8012_3840d7ede2f0.slice/crio-07828654c6248622c37291eeaa395a4a36b155f1ff8dc4766b0cea1724a4bb83 WatchSource:0}: Error finding container 07828654c6248622c37291eeaa395a4a36b155f1ff8dc4766b0cea1724a4bb83: Status 404 returned error can't find the container with id 07828654c6248622c37291eeaa395a4a36b155f1ff8dc4766b0cea1724a4bb83 Dec 05 20:33:09 crc kubenswrapper[4904]: I1205 20:33:09.890659 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5c784f7484-wz74b"] Dec 05 20:33:10 crc kubenswrapper[4904]: I1205 20:33:10.294044 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6867ddbddb-4lg6w" event={"ID":"499a09a1-c3aa-4fa2-95a8-0d2896a1d978","Type":"ContainerStarted","Data":"a9ac4d704d56fa9e65fdcd098de894d40f8d4c621808f3eac80c3507dbabdb36"} Dec 05 20:33:10 crc kubenswrapper[4904]: I1205 20:33:10.298646 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7569b6c979-qznqm" event={"ID":"310cb505-37fe-420a-8012-3840d7ede2f0","Type":"ContainerStarted","Data":"50febebccef027b0b1e35cf9c55da24f1d28e1776eb602871a84d06d6caaad6a"} Dec 05 20:33:10 crc kubenswrapper[4904]: I1205 20:33:10.298696 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7569b6c979-qznqm" event={"ID":"310cb505-37fe-420a-8012-3840d7ede2f0","Type":"ContainerStarted","Data":"07828654c6248622c37291eeaa395a4a36b155f1ff8dc4766b0cea1724a4bb83"} Dec 05 20:33:10 crc kubenswrapper[4904]: I1205 20:33:10.302149 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c784f7484-wz74b" event={"ID":"eabc2178-31e6-49e7-a78c-58dd769c751c","Type":"ContainerStarted","Data":"89264106edf16cb018b2c36cf1c825aba7bc4cbe6a25a2bdbb581a994d75b093"} Dec 05 20:33:10 crc kubenswrapper[4904]: I1205 20:33:10.302715 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8b8b46f59-njvnc" podUID="659969bf-c700-4cb7-b8c0-1d30873b3d0a" containerName="dnsmasq-dns" containerID="cri-o://0b16a59b78c0cb3995ab31d9f06c3ee2af2030d079dd00e41a7d7a0b863b72a9" gracePeriod=10 Dec 05 20:33:10 crc kubenswrapper[4904]: I1205 20:33:10.365990 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6867ddbddb-4lg6w" podStartSLOduration=40.36597197 
podStartE2EDuration="40.36597197s" podCreationTimestamp="2025-12-05 20:32:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:33:10.314846433 +0000 UTC m=+1289.126062542" watchObservedRunningTime="2025-12-05 20:33:10.36597197 +0000 UTC m=+1289.177188079" Dec 05 20:33:10 crc kubenswrapper[4904]: I1205 20:33:10.725370 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6867ddbddb-4lg6w" Dec 05 20:33:10 crc kubenswrapper[4904]: I1205 20:33:10.725415 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6867ddbddb-4lg6w" Dec 05 20:33:10 crc kubenswrapper[4904]: I1205 20:33:10.917741 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-d867d46cb-9mdx2" Dec 05 20:33:10 crc kubenswrapper[4904]: I1205 20:33:10.917793 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-d867d46cb-9mdx2" Dec 05 20:33:11 crc kubenswrapper[4904]: E1205 20:33:11.031484 4904 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod310cb505_37fe_420a_8012_3840d7ede2f0.slice/crio-conmon-50febebccef027b0b1e35cf9c55da24f1d28e1776eb602871a84d06d6caaad6a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod659969bf_c700_4cb7_b8c0_1d30873b3d0a.slice/crio-conmon-0b16a59b78c0cb3995ab31d9f06c3ee2af2030d079dd00e41a7d7a0b863b72a9.scope\": RecentStats: unable to find data in memory cache]" Dec 05 20:33:11 crc kubenswrapper[4904]: I1205 20:33:11.321310 4904 generic.go:334] "Generic (PLEG): container finished" podID="310cb505-37fe-420a-8012-3840d7ede2f0" containerID="50febebccef027b0b1e35cf9c55da24f1d28e1776eb602871a84d06d6caaad6a" exitCode=0 Dec 05 20:33:11 crc kubenswrapper[4904]: I1205 20:33:11.321592 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7569b6c979-qznqm" event={"ID":"310cb505-37fe-420a-8012-3840d7ede2f0","Type":"ContainerDied","Data":"50febebccef027b0b1e35cf9c55da24f1d28e1776eb602871a84d06d6caaad6a"} Dec 05 20:33:11 crc kubenswrapper[4904]: I1205 20:33:11.335531 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c784f7484-wz74b" event={"ID":"eabc2178-31e6-49e7-a78c-58dd769c751c","Type":"ContainerStarted","Data":"249c27d8712414588823e8bbf71d00a39ef7a2665c4aa1924a6d570149efd9ea"} Dec 05 20:33:11 crc kubenswrapper[4904]: I1205 20:33:11.369424 4904 generic.go:334] "Generic (PLEG): container finished" podID="659969bf-c700-4cb7-b8c0-1d30873b3d0a" containerID="0b16a59b78c0cb3995ab31d9f06c3ee2af2030d079dd00e41a7d7a0b863b72a9" exitCode=0 Dec 05 20:33:11 crc kubenswrapper[4904]: I1205 20:33:11.371541 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b8b46f59-njvnc" event={"ID":"659969bf-c700-4cb7-b8c0-1d30873b3d0a","Type":"ContainerDied","Data":"0b16a59b78c0cb3995ab31d9f06c3ee2af2030d079dd00e41a7d7a0b863b72a9"} Dec 05 20:33:11 crc kubenswrapper[4904]: I1205 20:33:11.572110 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-78cf6bb7c7-zmmjl"] Dec 05 20:33:11 crc kubenswrapper[4904]: I1205 20:33:11.573656 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-78cf6bb7c7-zmmjl" Dec 05 20:33:11 crc kubenswrapper[4904]: I1205 20:33:11.578612 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 05 20:33:11 crc kubenswrapper[4904]: I1205 20:33:11.578833 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 05 20:33:11 crc kubenswrapper[4904]: I1205 20:33:11.613387 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-78cf6bb7c7-zmmjl"] Dec 05 20:33:11 crc kubenswrapper[4904]: I1205 20:33:11.676557 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f1be3e8-cbd0-45cb-acb6-cf56217cec07-public-tls-certs\") pod \"neutron-78cf6bb7c7-zmmjl\" (UID: \"9f1be3e8-cbd0-45cb-acb6-cf56217cec07\") " pod="openstack/neutron-78cf6bb7c7-zmmjl" Dec 05 20:33:11 crc kubenswrapper[4904]: I1205 20:33:11.676819 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9f1be3e8-cbd0-45cb-acb6-cf56217cec07-httpd-config\") pod \"neutron-78cf6bb7c7-zmmjl\" (UID: \"9f1be3e8-cbd0-45cb-acb6-cf56217cec07\") " pod="openstack/neutron-78cf6bb7c7-zmmjl" Dec 05 20:33:11 crc kubenswrapper[4904]: I1205 20:33:11.676843 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f1be3e8-cbd0-45cb-acb6-cf56217cec07-ovndb-tls-certs\") pod \"neutron-78cf6bb7c7-zmmjl\" (UID: \"9f1be3e8-cbd0-45cb-acb6-cf56217cec07\") " pod="openstack/neutron-78cf6bb7c7-zmmjl" Dec 05 20:33:11 crc kubenswrapper[4904]: I1205 20:33:11.676895 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxxhc\" (UniqueName: \"kubernetes.io/projected/9f1be3e8-cbd0-45cb-acb6-cf56217cec07-kube-api-access-wxxhc\") pod \"neutron-78cf6bb7c7-zmmjl\" (UID: \"9f1be3e8-cbd0-45cb-acb6-cf56217cec07\") " pod="openstack/neutron-78cf6bb7c7-zmmjl" Dec 05 20:33:11 crc kubenswrapper[4904]: I1205 20:33:11.676928 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f1be3e8-cbd0-45cb-acb6-cf56217cec07-internal-tls-certs\") pod \"neutron-78cf6bb7c7-zmmjl\" (UID: \"9f1be3e8-cbd0-45cb-acb6-cf56217cec07\") " pod="openstack/neutron-78cf6bb7c7-zmmjl" Dec 05 20:33:11 crc kubenswrapper[4904]: I1205 20:33:11.676950 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9f1be3e8-cbd0-45cb-acb6-cf56217cec07-config\") pod \"neutron-78cf6bb7c7-zmmjl\" (UID: \"9f1be3e8-cbd0-45cb-acb6-cf56217cec07\") " pod="openstack/neutron-78cf6bb7c7-zmmjl" Dec 05 20:33:11 crc kubenswrapper[4904]: I1205 20:33:11.677016 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f1be3e8-cbd0-45cb-acb6-cf56217cec07-combined-ca-bundle\") pod \"neutron-78cf6bb7c7-zmmjl\" (UID: \"9f1be3e8-cbd0-45cb-acb6-cf56217cec07\") " pod="openstack/neutron-78cf6bb7c7-zmmjl" Dec 05 20:33:11 crc kubenswrapper[4904]: I1205 20:33:11.781031 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/9f1be3e8-cbd0-45cb-acb6-cf56217cec07-httpd-config\") pod \"neutron-78cf6bb7c7-zmmjl\" (UID: \"9f1be3e8-cbd0-45cb-acb6-cf56217cec07\") " pod="openstack/neutron-78cf6bb7c7-zmmjl" Dec 05 20:33:11 crc kubenswrapper[4904]: I1205 20:33:11.781089 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f1be3e8-cbd0-45cb-acb6-cf56217cec07-ovndb-tls-certs\") pod \"neutron-78cf6bb7c7-zmmjl\" (UID: \"9f1be3e8-cbd0-45cb-acb6-cf56217cec07\") " pod="openstack/neutron-78cf6bb7c7-zmmjl" Dec 05 20:33:11 crc kubenswrapper[4904]: I1205 20:33:11.781142 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxxhc\" (UniqueName: \"kubernetes.io/projected/9f1be3e8-cbd0-45cb-acb6-cf56217cec07-kube-api-access-wxxhc\") pod \"neutron-78cf6bb7c7-zmmjl\" (UID: \"9f1be3e8-cbd0-45cb-acb6-cf56217cec07\") " pod="openstack/neutron-78cf6bb7c7-zmmjl" Dec 05 20:33:11 crc kubenswrapper[4904]: I1205 20:33:11.781189 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f1be3e8-cbd0-45cb-acb6-cf56217cec07-internal-tls-certs\") pod \"neutron-78cf6bb7c7-zmmjl\" (UID: \"9f1be3e8-cbd0-45cb-acb6-cf56217cec07\") " pod="openstack/neutron-78cf6bb7c7-zmmjl" Dec 05 20:33:11 crc kubenswrapper[4904]: I1205 20:33:11.781210 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9f1be3e8-cbd0-45cb-acb6-cf56217cec07-config\") pod \"neutron-78cf6bb7c7-zmmjl\" (UID: \"9f1be3e8-cbd0-45cb-acb6-cf56217cec07\") " pod="openstack/neutron-78cf6bb7c7-zmmjl" Dec 05 20:33:11 crc kubenswrapper[4904]: I1205 20:33:11.781263 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f1be3e8-cbd0-45cb-acb6-cf56217cec07-combined-ca-bundle\") pod \"neutron-78cf6bb7c7-zmmjl\" (UID: \"9f1be3e8-cbd0-45cb-acb6-cf56217cec07\") " pod="openstack/neutron-78cf6bb7c7-zmmjl" Dec 05 20:33:11 crc kubenswrapper[4904]: I1205 20:33:11.781292 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f1be3e8-cbd0-45cb-acb6-cf56217cec07-public-tls-certs\") pod \"neutron-78cf6bb7c7-zmmjl\" (UID: \"9f1be3e8-cbd0-45cb-acb6-cf56217cec07\") " pod="openstack/neutron-78cf6bb7c7-zmmjl" Dec 05 20:33:11 crc kubenswrapper[4904]: I1205 20:33:11.798910 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f1be3e8-cbd0-45cb-acb6-cf56217cec07-internal-tls-certs\") pod \"neutron-78cf6bb7c7-zmmjl\" (UID: \"9f1be3e8-cbd0-45cb-acb6-cf56217cec07\") " pod="openstack/neutron-78cf6bb7c7-zmmjl" Dec 05 20:33:11 crc kubenswrapper[4904]: I1205 20:33:11.803892 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9f1be3e8-cbd0-45cb-acb6-cf56217cec07-config\") pod \"neutron-78cf6bb7c7-zmmjl\" (UID: \"9f1be3e8-cbd0-45cb-acb6-cf56217cec07\") " pod="openstack/neutron-78cf6bb7c7-zmmjl" Dec 05 20:33:11 crc kubenswrapper[4904]: I1205 20:33:11.804351 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f1be3e8-cbd0-45cb-acb6-cf56217cec07-public-tls-certs\") pod \"neutron-78cf6bb7c7-zmmjl\" (UID: \"9f1be3e8-cbd0-45cb-acb6-cf56217cec07\") 
" pod="openstack/neutron-78cf6bb7c7-zmmjl" Dec 05 20:33:11 crc kubenswrapper[4904]: I1205 20:33:11.810380 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9f1be3e8-cbd0-45cb-acb6-cf56217cec07-httpd-config\") pod \"neutron-78cf6bb7c7-zmmjl\" (UID: \"9f1be3e8-cbd0-45cb-acb6-cf56217cec07\") " pod="openstack/neutron-78cf6bb7c7-zmmjl" Dec 05 20:33:11 crc kubenswrapper[4904]: I1205 20:33:11.811824 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f1be3e8-cbd0-45cb-acb6-cf56217cec07-combined-ca-bundle\") pod \"neutron-78cf6bb7c7-zmmjl\" (UID: \"9f1be3e8-cbd0-45cb-acb6-cf56217cec07\") " pod="openstack/neutron-78cf6bb7c7-zmmjl" Dec 05 20:33:11 crc kubenswrapper[4904]: I1205 20:33:11.813663 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f1be3e8-cbd0-45cb-acb6-cf56217cec07-ovndb-tls-certs\") pod \"neutron-78cf6bb7c7-zmmjl\" (UID: \"9f1be3e8-cbd0-45cb-acb6-cf56217cec07\") " pod="openstack/neutron-78cf6bb7c7-zmmjl" Dec 05 20:33:11 crc kubenswrapper[4904]: I1205 20:33:11.829749 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxxhc\" (UniqueName: \"kubernetes.io/projected/9f1be3e8-cbd0-45cb-acb6-cf56217cec07-kube-api-access-wxxhc\") pod \"neutron-78cf6bb7c7-zmmjl\" (UID: \"9f1be3e8-cbd0-45cb-acb6-cf56217cec07\") " pod="openstack/neutron-78cf6bb7c7-zmmjl" Dec 05 20:33:11 crc kubenswrapper[4904]: I1205 20:33:11.912308 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b8b46f59-njvnc" Dec 05 20:33:11 crc kubenswrapper[4904]: I1205 20:33:11.975189 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-78cf6bb7c7-zmmjl" Dec 05 20:33:11 crc kubenswrapper[4904]: I1205 20:33:11.989152 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/659969bf-c700-4cb7-b8c0-1d30873b3d0a-ovsdbserver-nb\") pod \"659969bf-c700-4cb7-b8c0-1d30873b3d0a\" (UID: \"659969bf-c700-4cb7-b8c0-1d30873b3d0a\") " Dec 05 20:33:11 crc kubenswrapper[4904]: I1205 20:33:11.989236 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzkf8\" (UniqueName: \"kubernetes.io/projected/659969bf-c700-4cb7-b8c0-1d30873b3d0a-kube-api-access-rzkf8\") pod \"659969bf-c700-4cb7-b8c0-1d30873b3d0a\" (UID: \"659969bf-c700-4cb7-b8c0-1d30873b3d0a\") " Dec 05 20:33:11 crc kubenswrapper[4904]: I1205 20:33:11.989278 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/659969bf-c700-4cb7-b8c0-1d30873b3d0a-dns-svc\") pod \"659969bf-c700-4cb7-b8c0-1d30873b3d0a\" (UID: \"659969bf-c700-4cb7-b8c0-1d30873b3d0a\") " Dec 05 20:33:11 crc kubenswrapper[4904]: I1205 20:33:11.989310 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/659969bf-c700-4cb7-b8c0-1d30873b3d0a-config\") pod \"659969bf-c700-4cb7-b8c0-1d30873b3d0a\" (UID: \"659969bf-c700-4cb7-b8c0-1d30873b3d0a\") " Dec 05 20:33:11 crc kubenswrapper[4904]: I1205 20:33:11.989428 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/659969bf-c700-4cb7-b8c0-1d30873b3d0a-ovsdbserver-sb\") pod \"659969bf-c700-4cb7-b8c0-1d30873b3d0a\" (UID: \"659969bf-c700-4cb7-b8c0-1d30873b3d0a\") " Dec 05 20:33:11 crc kubenswrapper[4904]: I1205 20:33:11.989462 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/659969bf-c700-4cb7-b8c0-1d30873b3d0a-dns-swift-storage-0\") pod \"659969bf-c700-4cb7-b8c0-1d30873b3d0a\" (UID: \"659969bf-c700-4cb7-b8c0-1d30873b3d0a\") " Dec 05 20:33:12 crc kubenswrapper[4904]: I1205 20:33:12.019270 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/659969bf-c700-4cb7-b8c0-1d30873b3d0a-kube-api-access-rzkf8" (OuterVolumeSpecName: "kube-api-access-rzkf8") pod "659969bf-c700-4cb7-b8c0-1d30873b3d0a" (UID: "659969bf-c700-4cb7-b8c0-1d30873b3d0a"). InnerVolumeSpecName "kube-api-access-rzkf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:33:12 crc kubenswrapper[4904]: I1205 20:33:12.079880 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/659969bf-c700-4cb7-b8c0-1d30873b3d0a-config" (OuterVolumeSpecName: "config") pod "659969bf-c700-4cb7-b8c0-1d30873b3d0a" (UID: "659969bf-c700-4cb7-b8c0-1d30873b3d0a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:33:12 crc kubenswrapper[4904]: I1205 20:33:12.087692 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Dec 05 20:33:12 crc kubenswrapper[4904]: I1205 20:33:12.087732 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Dec 05 20:33:12 crc kubenswrapper[4904]: I1205 20:33:12.091192 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzkf8\" (UniqueName: \"kubernetes.io/projected/659969bf-c700-4cb7-b8c0-1d30873b3d0a-kube-api-access-rzkf8\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:12 crc kubenswrapper[4904]: I1205 20:33:12.097557 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/659969bf-c700-4cb7-b8c0-1d30873b3d0a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "659969bf-c700-4cb7-b8c0-1d30873b3d0a" (UID: "659969bf-c700-4cb7-b8c0-1d30873b3d0a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:33:12 crc kubenswrapper[4904]: I1205 20:33:12.098100 4904 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/659969bf-c700-4cb7-b8c0-1d30873b3d0a-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:12 crc kubenswrapper[4904]: I1205 20:33:12.105594 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/659969bf-c700-4cb7-b8c0-1d30873b3d0a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "659969bf-c700-4cb7-b8c0-1d30873b3d0a" (UID: "659969bf-c700-4cb7-b8c0-1d30873b3d0a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:33:12 crc kubenswrapper[4904]: I1205 20:33:12.140702 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" Dec 05 20:33:12 crc kubenswrapper[4904]: I1205 20:33:12.164211 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/659969bf-c700-4cb7-b8c0-1d30873b3d0a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "659969bf-c700-4cb7-b8c0-1d30873b3d0a" (UID: "659969bf-c700-4cb7-b8c0-1d30873b3d0a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:33:12 crc kubenswrapper[4904]: I1205 20:33:12.187295 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/659969bf-c700-4cb7-b8c0-1d30873b3d0a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "659969bf-c700-4cb7-b8c0-1d30873b3d0a" (UID: "659969bf-c700-4cb7-b8c0-1d30873b3d0a"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:33:12 crc kubenswrapper[4904]: I1205 20:33:12.197287 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Dec 05 20:33:12 crc kubenswrapper[4904]: I1205 20:33:12.240353 4904 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/659969bf-c700-4cb7-b8c0-1d30873b3d0a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:12 crc kubenswrapper[4904]: I1205 20:33:12.240394 4904 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/659969bf-c700-4cb7-b8c0-1d30873b3d0a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:12 crc kubenswrapper[4904]: I1205 20:33:12.240408 4904 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/659969bf-c700-4cb7-b8c0-1d30873b3d0a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:12 crc kubenswrapper[4904]: I1205 20:33:12.240425 4904 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/659969bf-c700-4cb7-b8c0-1d30873b3d0a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:12 crc kubenswrapper[4904]: I1205 20:33:12.262820 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Dec 05 20:33:12 crc kubenswrapper[4904]: I1205 20:33:12.412783 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b8b46f59-njvnc" event={"ID":"659969bf-c700-4cb7-b8c0-1d30873b3d0a","Type":"ContainerDied","Data":"8e73b9edb0438fc9074d15932e1ee6c8c33162cc689ea50a15cd18c5de6b6606"} Dec 05 20:33:12 crc kubenswrapper[4904]: I1205 20:33:12.412842 4904 scope.go:117] "RemoveContainer" containerID="0b16a59b78c0cb3995ab31d9f06c3ee2af2030d079dd00e41a7d7a0b863b72a9" Dec 05 20:33:12 crc kubenswrapper[4904]: I1205 20:33:12.413147 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b8b46f59-njvnc" Dec 05 20:33:12 crc kubenswrapper[4904]: I1205 20:33:12.429378 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-wh6jz" event={"ID":"f0a41473-f3de-440f-89be-9fddf77f6148","Type":"ContainerStarted","Data":"39156239472b92dabca22f1a6012db800be8051e59e0bc036717ecd11904ceab"} Dec 05 20:33:12 crc kubenswrapper[4904]: I1205 20:33:12.456578 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-wh6jz" podStartSLOduration=5.372847237 podStartE2EDuration="51.456561047s" podCreationTimestamp="2025-12-05 20:32:21 +0000 UTC" firstStartedPulling="2025-12-05 20:32:23.766947143 +0000 UTC m=+1242.578163252" lastFinishedPulling="2025-12-05 20:33:09.850660953 +0000 UTC m=+1288.661877062" observedRunningTime="2025-12-05 20:33:12.453810345 +0000 UTC m=+1291.265026454" watchObservedRunningTime="2025-12-05 20:33:12.456561047 +0000 UTC m=+1291.267777156" Dec 05 20:33:12 crc kubenswrapper[4904]: I1205 20:33:12.477337 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7569b6c979-qznqm" event={"ID":"310cb505-37fe-420a-8012-3840d7ede2f0","Type":"ContainerStarted","Data":"2659f9b6fae95519af45cb197d6d99eace636a1728e5add817cc8abfe59b6322"} Dec 05 20:33:12 crc kubenswrapper[4904]: I1205 20:33:12.477548 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7569b6c979-qznqm" Dec 05 20:33:12 crc kubenswrapper[4904]: I1205 20:33:12.489291 4904 scope.go:117] "RemoveContainer" containerID="5d160d2fc4e3ad62f171aeb938e7c2b454212b691353f4e89466b0ed5237d53a" Dec 05 20:33:12 crc kubenswrapper[4904]: I1205 20:33:12.497797 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c784f7484-wz74b" event={"ID":"eabc2178-31e6-49e7-a78c-58dd769c751c","Type":"ContainerStarted","Data":"b5be99277c88811d304bbae5405f89cf3c2044e5d84e95d8345e25557ca499b9"} Dec 05 20:33:12 crc kubenswrapper[4904]: I1205 20:33:12.497837 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b8b46f59-njvnc"] Dec 05 20:33:12 crc kubenswrapper[4904]: I1205 20:33:12.497855 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5c784f7484-wz74b" Dec 05 20:33:12 crc kubenswrapper[4904]: I1205 20:33:12.498707 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Dec 05 20:33:12 crc kubenswrapper[4904]: I1205 20:33:12.505612 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8b8b46f59-njvnc"] Dec 05 20:33:12 crc kubenswrapper[4904]: I1205 20:33:12.515763 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7569b6c979-qznqm" podStartSLOduration=5.515744095 podStartE2EDuration="5.515744095s" podCreationTimestamp="2025-12-05 20:33:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:33:12.507648162 +0000 UTC m=+1291.318864271" watchObservedRunningTime="2025-12-05 20:33:12.515744095 +0000 UTC m=+1291.326960204" Dec 05 20:33:12 crc kubenswrapper[4904]: I1205 20:33:12.577019 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Dec 05 20:33:12 crc kubenswrapper[4904]: I1205 20:33:12.599397 4904 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" 
podUID="fa95b130-3627-4fa6-9bf8-1760d3b24843" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.154:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 20:33:12 crc kubenswrapper[4904]: I1205 20:33:12.614496 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Dec 05 20:33:12 crc kubenswrapper[4904]: I1205 20:33:12.617938 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5c784f7484-wz74b" podStartSLOduration=5.617919185 podStartE2EDuration="5.617919185s" podCreationTimestamp="2025-12-05 20:33:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:33:12.533602866 +0000 UTC m=+1291.344818975" watchObservedRunningTime="2025-12-05 20:33:12.617919185 +0000 UTC m=+1291.429135294" Dec 05 20:33:12 crc kubenswrapper[4904]: I1205 20:33:12.653915 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 05 20:33:12 crc kubenswrapper[4904]: I1205 20:33:12.699217 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"] Dec 05 20:33:12 crc kubenswrapper[4904]: I1205 20:33:12.710714 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-78cf6bb7c7-zmmjl"] Dec 05 20:33:13 crc kubenswrapper[4904]: I1205 20:33:13.519170 4904 generic.go:334] "Generic (PLEG): container finished" podID="2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8" containerID="588bb01d56b63e882f961e9f854e18ef40bec7cc7f939dfb4704961b67fcc6f5" exitCode=0 Dec 05 20:33:13 crc kubenswrapper[4904]: I1205 20:33:13.519528 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-4zhc6" event={"ID":"2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8","Type":"ContainerDied","Data":"588bb01d56b63e882f961e9f854e18ef40bec7cc7f939dfb4704961b67fcc6f5"} Dec 05 20:33:13 crc kubenswrapper[4904]: I1205 20:33:13.540576 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78cf6bb7c7-zmmjl" event={"ID":"9f1be3e8-cbd0-45cb-acb6-cf56217cec07","Type":"ContainerStarted","Data":"dae15546117d82e32889e11a8dbe7d54fae22fce767edd6ecf87ab7d699b9430"} Dec 05 20:33:13 crc kubenswrapper[4904]: I1205 20:33:13.540609 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78cf6bb7c7-zmmjl" event={"ID":"9f1be3e8-cbd0-45cb-acb6-cf56217cec07","Type":"ContainerStarted","Data":"3b6499af7b72b0da82a5e5fa9dbd3a9ca0175e0bc65e3003d1ef2b52c3e265ee"} Dec 05 20:33:13 crc kubenswrapper[4904]: I1205 20:33:13.730722 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="659969bf-c700-4cb7-b8c0-1d30873b3d0a" path="/var/lib/kubelet/pods/659969bf-c700-4cb7-b8c0-1d30873b3d0a/volumes" Dec 05 20:33:14 crc kubenswrapper[4904]: I1205 20:33:14.469649 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-cf6d474f7-dfz4j" Dec 05 20:33:14 crc kubenswrapper[4904]: I1205 20:33:14.559190 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78cf6bb7c7-zmmjl" event={"ID":"9f1be3e8-cbd0-45cb-acb6-cf56217cec07","Type":"ContainerStarted","Data":"c4b663da88db523e4424f2a46cf13e2b09c06db732569501a5e9b19c15a1a380"} Dec 05 20:33:14 crc kubenswrapper[4904]: I1205 20:33:14.559441 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-applier-0" 
podUID="8acb38c5-f0b5-46e9-8cb7-085d0b12b16d" containerName="watcher-applier" containerID="cri-o://34108de7470cb550f2341e050a9134e537bba3cf0a18d7e71edad2f8834b50bc" gracePeriod=30 Dec 05 20:33:14 crc kubenswrapper[4904]: I1205 20:33:14.559517 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-78cf6bb7c7-zmmjl" Dec 05 20:33:14 crc kubenswrapper[4904]: I1205 20:33:14.559579 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-decision-engine-0" podUID="8698908a-a9e4-46c2-9855-2a1db92b1d75" containerName="watcher-decision-engine" containerID="cri-o://accaae3332576ca7b7c4415c9ac96a9c946e1ca857e3b618ee95cd01e75033ca" gracePeriod=30 Dec 05 20:33:14 crc kubenswrapper[4904]: I1205 20:33:14.580944 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-78cf6bb7c7-zmmjl" podStartSLOduration=3.580902203 podStartE2EDuration="3.580902203s" podCreationTimestamp="2025-12-05 20:33:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:33:14.579921757 +0000 UTC m=+1293.391137866" watchObservedRunningTime="2025-12-05 20:33:14.580902203 +0000 UTC m=+1293.392118312" Dec 05 20:33:15 crc kubenswrapper[4904]: I1205 20:33:15.965883 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Dec 05 20:33:16 crc kubenswrapper[4904]: I1205 20:33:16.594191 4904 generic.go:334] "Generic (PLEG): container finished" podID="e046612d-8016-4321-8a7a-a14b14f68e91" containerID="dc978a68af7766b44df471d6f7e4d8f475abe5d11585e4c2d2e4b6ab1839a094" exitCode=0 Dec 05 20:33:16 crc kubenswrapper[4904]: I1205 20:33:16.594280 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ft95f" event={"ID":"e046612d-8016-4321-8a7a-a14b14f68e91","Type":"ContainerDied","Data":"dc978a68af7766b44df471d6f7e4d8f475abe5d11585e4c2d2e4b6ab1839a094"} Dec 05 20:33:16 crc kubenswrapper[4904]: I1205 20:33:16.596388 4904 generic.go:334] "Generic (PLEG): container finished" podID="8acb38c5-f0b5-46e9-8cb7-085d0b12b16d" containerID="34108de7470cb550f2341e050a9134e537bba3cf0a18d7e71edad2f8834b50bc" exitCode=0 Dec 05 20:33:16 crc kubenswrapper[4904]: I1205 20:33:16.596451 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"8acb38c5-f0b5-46e9-8cb7-085d0b12b16d","Type":"ContainerDied","Data":"34108de7470cb550f2341e050a9134e537bba3cf0a18d7e71edad2f8834b50bc"} Dec 05 20:33:16 crc kubenswrapper[4904]: I1205 20:33:16.598272 4904 generic.go:334] "Generic (PLEG): container finished" podID="8698908a-a9e4-46c2-9855-2a1db92b1d75" containerID="accaae3332576ca7b7c4415c9ac96a9c946e1ca857e3b618ee95cd01e75033ca" exitCode=1 Dec 05 20:33:16 crc kubenswrapper[4904]: I1205 20:33:16.598355 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"8698908a-a9e4-46c2-9855-2a1db92b1d75","Type":"ContainerDied","Data":"accaae3332576ca7b7c4415c9ac96a9c946e1ca857e3b618ee95cd01e75033ca"} Dec 05 20:33:17 crc kubenswrapper[4904]: E1205 20:33:17.088827 4904 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 34108de7470cb550f2341e050a9134e537bba3cf0a18d7e71edad2f8834b50bc is running failed: container process not found" containerID="34108de7470cb550f2341e050a9134e537bba3cf0a18d7e71edad2f8834b50bc" 
cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 05 20:33:17 crc kubenswrapper[4904]: E1205 20:33:17.089760 4904 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 34108de7470cb550f2341e050a9134e537bba3cf0a18d7e71edad2f8834b50bc is running failed: container process not found" containerID="34108de7470cb550f2341e050a9134e537bba3cf0a18d7e71edad2f8834b50bc" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 05 20:33:17 crc kubenswrapper[4904]: E1205 20:33:17.089971 4904 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 34108de7470cb550f2341e050a9134e537bba3cf0a18d7e71edad2f8834b50bc is running failed: container process not found" containerID="34108de7470cb550f2341e050a9134e537bba3cf0a18d7e71edad2f8834b50bc" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 05 20:33:17 crc kubenswrapper[4904]: E1205 20:33:17.089991 4904 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 34108de7470cb550f2341e050a9134e537bba3cf0a18d7e71edad2f8834b50bc is running failed: container process not found" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="8acb38c5-f0b5-46e9-8cb7-085d0b12b16d" containerName="watcher-applier" Dec 05 20:33:18 crc kubenswrapper[4904]: I1205 20:33:18.586696 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7569b6c979-qznqm" Dec 05 20:33:18 crc kubenswrapper[4904]: I1205 20:33:18.689169 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54498d5797-97l4v"] Dec 05 20:33:18 crc kubenswrapper[4904]: I1205 20:33:18.690165 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54498d5797-97l4v" podUID="90af75ae-3389-4c14-a59d-eaaeee34e58f" containerName="dnsmasq-dns" containerID="cri-o://063c158c88ef755b139fba63fbb815a2f18013203e130151b85cfe74fa9a296c" gracePeriod=10 Dec 05 20:33:19 crc kubenswrapper[4904]: I1205 20:33:19.628665 4904 generic.go:334] "Generic (PLEG): container finished" podID="90af75ae-3389-4c14-a59d-eaaeee34e58f" containerID="063c158c88ef755b139fba63fbb815a2f18013203e130151b85cfe74fa9a296c" exitCode=0 Dec 05 20:33:19 crc kubenswrapper[4904]: I1205 20:33:19.628708 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54498d5797-97l4v" event={"ID":"90af75ae-3389-4c14-a59d-eaaeee34e58f","Type":"ContainerDied","Data":"063c158c88ef755b139fba63fbb815a2f18013203e130151b85cfe74fa9a296c"} Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.275245 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ft95f" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.289562 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.294844 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-4zhc6" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.337714 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e046612d-8016-4321-8a7a-a14b14f68e91-combined-ca-bundle\") pod \"e046612d-8016-4321-8a7a-a14b14f68e91\" (UID: \"e046612d-8016-4321-8a7a-a14b14f68e91\") " Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.337793 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kb25\" (UniqueName: \"kubernetes.io/projected/2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8-kube-api-access-9kb25\") pod \"2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8\" (UID: \"2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8\") " Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.337835 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8acb38c5-f0b5-46e9-8cb7-085d0b12b16d-combined-ca-bundle\") pod \"8acb38c5-f0b5-46e9-8cb7-085d0b12b16d\" (UID: \"8acb38c5-f0b5-46e9-8cb7-085d0b12b16d\") " Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.337859 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jkl8\" (UniqueName: \"kubernetes.io/projected/e046612d-8016-4321-8a7a-a14b14f68e91-kube-api-access-8jkl8\") pod \"e046612d-8016-4321-8a7a-a14b14f68e91\" (UID: \"e046612d-8016-4321-8a7a-a14b14f68e91\") " Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.337881 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e046612d-8016-4321-8a7a-a14b14f68e91-credential-keys\") pod \"e046612d-8016-4321-8a7a-a14b14f68e91\" (UID: \"e046612d-8016-4321-8a7a-a14b14f68e91\") " Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.337906 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8acb38c5-f0b5-46e9-8cb7-085d0b12b16d-logs\") pod \"8acb38c5-f0b5-46e9-8cb7-085d0b12b16d\" (UID: \"8acb38c5-f0b5-46e9-8cb7-085d0b12b16d\") " Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.337920 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8-config-data\") pod \"2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8\" (UID: \"2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8\") " Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.337952 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8-scripts\") pod \"2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8\" (UID: \"2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8\") " Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.337971 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e046612d-8016-4321-8a7a-a14b14f68e91-config-data\") pod \"e046612d-8016-4321-8a7a-a14b14f68e91\" (UID: \"e046612d-8016-4321-8a7a-a14b14f68e91\") " Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.338005 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbb45\" (UniqueName: \"kubernetes.io/projected/8acb38c5-f0b5-46e9-8cb7-085d0b12b16d-kube-api-access-mbb45\") pod \"8acb38c5-f0b5-46e9-8cb7-085d0b12b16d\" 
(UID: \"8acb38c5-f0b5-46e9-8cb7-085d0b12b16d\") " Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.338026 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8-logs\") pod \"2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8\" (UID: \"2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8\") " Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.338087 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8-combined-ca-bundle\") pod \"2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8\" (UID: \"2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8\") " Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.338179 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e046612d-8016-4321-8a7a-a14b14f68e91-scripts\") pod \"e046612d-8016-4321-8a7a-a14b14f68e91\" (UID: \"e046612d-8016-4321-8a7a-a14b14f68e91\") " Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.338210 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e046612d-8016-4321-8a7a-a14b14f68e91-fernet-keys\") pod \"e046612d-8016-4321-8a7a-a14b14f68e91\" (UID: \"e046612d-8016-4321-8a7a-a14b14f68e91\") " Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.338236 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8acb38c5-f0b5-46e9-8cb7-085d0b12b16d-config-data\") pod \"8acb38c5-f0b5-46e9-8cb7-085d0b12b16d\" (UID: \"8acb38c5-f0b5-46e9-8cb7-085d0b12b16d\") " Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.349164 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8-logs" (OuterVolumeSpecName: "logs") pod "2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8" (UID: "2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.368973 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8-scripts" (OuterVolumeSpecName: "scripts") pod "2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8" (UID: "2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.369072 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8acb38c5-f0b5-46e9-8cb7-085d0b12b16d-logs" (OuterVolumeSpecName: "logs") pod "8acb38c5-f0b5-46e9-8cb7-085d0b12b16d" (UID: "8acb38c5-f0b5-46e9-8cb7-085d0b12b16d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.369380 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8acb38c5-f0b5-46e9-8cb7-085d0b12b16d-kube-api-access-mbb45" (OuterVolumeSpecName: "kube-api-access-mbb45") pod "8acb38c5-f0b5-46e9-8cb7-085d0b12b16d" (UID: "8acb38c5-f0b5-46e9-8cb7-085d0b12b16d"). InnerVolumeSpecName "kube-api-access-mbb45". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.376010 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e046612d-8016-4321-8a7a-a14b14f68e91-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e046612d-8016-4321-8a7a-a14b14f68e91" (UID: "e046612d-8016-4321-8a7a-a14b14f68e91"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.379222 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e046612d-8016-4321-8a7a-a14b14f68e91-scripts" (OuterVolumeSpecName: "scripts") pod "e046612d-8016-4321-8a7a-a14b14f68e91" (UID: "e046612d-8016-4321-8a7a-a14b14f68e91"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.388628 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8-kube-api-access-9kb25" (OuterVolumeSpecName: "kube-api-access-9kb25") pod "2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8" (UID: "2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8"). InnerVolumeSpecName "kube-api-access-9kb25". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.388895 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54498d5797-97l4v" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.400608 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e046612d-8016-4321-8a7a-a14b14f68e91-kube-api-access-8jkl8" (OuterVolumeSpecName: "kube-api-access-8jkl8") pod "e046612d-8016-4321-8a7a-a14b14f68e91" (UID: "e046612d-8016-4321-8a7a-a14b14f68e91"). InnerVolumeSpecName "kube-api-access-8jkl8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.407468 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e046612d-8016-4321-8a7a-a14b14f68e91-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e046612d-8016-4321-8a7a-a14b14f68e91" (UID: "e046612d-8016-4321-8a7a-a14b14f68e91"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.439544 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90af75ae-3389-4c14-a59d-eaaeee34e58f-dns-svc\") pod \"90af75ae-3389-4c14-a59d-eaaeee34e58f\" (UID: \"90af75ae-3389-4c14-a59d-eaaeee34e58f\") " Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.439808 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90af75ae-3389-4c14-a59d-eaaeee34e58f-dns-swift-storage-0\") pod \"90af75ae-3389-4c14-a59d-eaaeee34e58f\" (UID: \"90af75ae-3389-4c14-a59d-eaaeee34e58f\") " Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.439982 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90af75ae-3389-4c14-a59d-eaaeee34e58f-ovsdbserver-sb\") pod \"90af75ae-3389-4c14-a59d-eaaeee34e58f\" (UID: \"90af75ae-3389-4c14-a59d-eaaeee34e58f\") " Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.440178 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90af75ae-3389-4c14-a59d-eaaeee34e58f-ovsdbserver-nb\") pod \"90af75ae-3389-4c14-a59d-eaaeee34e58f\" (UID: \"90af75ae-3389-4c14-a59d-eaaeee34e58f\") " Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.440317 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90af75ae-3389-4c14-a59d-eaaeee34e58f-config\") pod \"90af75ae-3389-4c14-a59d-eaaeee34e58f\" (UID: \"90af75ae-3389-4c14-a59d-eaaeee34e58f\") " Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.440515 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5kgf\" (UniqueName: \"kubernetes.io/projected/90af75ae-3389-4c14-a59d-eaaeee34e58f-kube-api-access-q5kgf\") pod \"90af75ae-3389-4c14-a59d-eaaeee34e58f\" (UID: \"90af75ae-3389-4c14-a59d-eaaeee34e58f\") " Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.441186 4904 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8acb38c5-f0b5-46e9-8cb7-085d0b12b16d-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.441290 4904 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.441366 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbb45\" (UniqueName: \"kubernetes.io/projected/8acb38c5-f0b5-46e9-8cb7-085d0b12b16d-kube-api-access-mbb45\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.441439 4904 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.441519 4904 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e046612d-8016-4321-8a7a-a14b14f68e91-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.441598 4904 reconciler_common.go:293] "Volume 
detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e046612d-8016-4321-8a7a-a14b14f68e91-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.441674 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kb25\" (UniqueName: \"kubernetes.io/projected/2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8-kube-api-access-9kb25\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.441747 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jkl8\" (UniqueName: \"kubernetes.io/projected/e046612d-8016-4321-8a7a-a14b14f68e91-kube-api-access-8jkl8\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.441819 4904 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e046612d-8016-4321-8a7a-a14b14f68e91-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.480402 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8acb38c5-f0b5-46e9-8cb7-085d0b12b16d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8acb38c5-f0b5-46e9-8cb7-085d0b12b16d" (UID: "8acb38c5-f0b5-46e9-8cb7-085d0b12b16d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.496571 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90af75ae-3389-4c14-a59d-eaaeee34e58f-kube-api-access-q5kgf" (OuterVolumeSpecName: "kube-api-access-q5kgf") pod "90af75ae-3389-4c14-a59d-eaaeee34e58f" (UID: "90af75ae-3389-4c14-a59d-eaaeee34e58f"). InnerVolumeSpecName "kube-api-access-q5kgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.497635 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.544408 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8698908a-a9e4-46c2-9855-2a1db92b1d75-custom-prometheus-ca\") pod \"8698908a-a9e4-46c2-9855-2a1db92b1d75\" (UID: \"8698908a-a9e4-46c2-9855-2a1db92b1d75\") " Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.544471 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8698908a-a9e4-46c2-9855-2a1db92b1d75-combined-ca-bundle\") pod \"8698908a-a9e4-46c2-9855-2a1db92b1d75\" (UID: \"8698908a-a9e4-46c2-9855-2a1db92b1d75\") " Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.544516 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-528kg\" (UniqueName: \"kubernetes.io/projected/8698908a-a9e4-46c2-9855-2a1db92b1d75-kube-api-access-528kg\") pod \"8698908a-a9e4-46c2-9855-2a1db92b1d75\" (UID: \"8698908a-a9e4-46c2-9855-2a1db92b1d75\") " Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.544594 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8698908a-a9e4-46c2-9855-2a1db92b1d75-config-data\") pod \"8698908a-a9e4-46c2-9855-2a1db92b1d75\" (UID: \"8698908a-a9e4-46c2-9855-2a1db92b1d75\") " Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.544822 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8698908a-a9e4-46c2-9855-2a1db92b1d75-logs\") pod \"8698908a-a9e4-46c2-9855-2a1db92b1d75\" (UID: \"8698908a-a9e4-46c2-9855-2a1db92b1d75\") " Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.545186 4904 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8acb38c5-f0b5-46e9-8cb7-085d0b12b16d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.545198 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5kgf\" (UniqueName: \"kubernetes.io/projected/90af75ae-3389-4c14-a59d-eaaeee34e58f-kube-api-access-q5kgf\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.545814 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8698908a-a9e4-46c2-9855-2a1db92b1d75-logs" (OuterVolumeSpecName: "logs") pod "8698908a-a9e4-46c2-9855-2a1db92b1d75" (UID: "8698908a-a9e4-46c2-9855-2a1db92b1d75"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.555930 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8698908a-a9e4-46c2-9855-2a1db92b1d75-kube-api-access-528kg" (OuterVolumeSpecName: "kube-api-access-528kg") pod "8698908a-a9e4-46c2-9855-2a1db92b1d75" (UID: "8698908a-a9e4-46c2-9855-2a1db92b1d75"). InnerVolumeSpecName "kube-api-access-528kg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.561326 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8acb38c5-f0b5-46e9-8cb7-085d0b12b16d-config-data" (OuterVolumeSpecName: "config-data") pod "8acb38c5-f0b5-46e9-8cb7-085d0b12b16d" (UID: "8acb38c5-f0b5-46e9-8cb7-085d0b12b16d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.647764 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-528kg\" (UniqueName: \"kubernetes.io/projected/8698908a-a9e4-46c2-9855-2a1db92b1d75-kube-api-access-528kg\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.647810 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8acb38c5-f0b5-46e9-8cb7-085d0b12b16d-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.647822 4904 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8698908a-a9e4-46c2-9855-2a1db92b1d75-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.670268 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-4zhc6" event={"ID":"2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8","Type":"ContainerDied","Data":"ac7058d6f6ed2cd1df84046a58a1e467405a1effc8e5b44b13a995d1f92afb6f"} Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.670312 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac7058d6f6ed2cd1df84046a58a1e467405a1effc8e5b44b13a995d1f92afb6f" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.670401 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-4zhc6" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.694539 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.694840 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"8698908a-a9e4-46c2-9855-2a1db92b1d75","Type":"ContainerDied","Data":"5407cf6346261137ed512d6574446e10d30922554bdeb542fd03dfc9d3a200ce"} Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.694891 4904 scope.go:117] "RemoveContainer" containerID="accaae3332576ca7b7c4415c9ac96a9c946e1ca857e3b618ee95cd01e75033ca" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.695298 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8" (UID: "2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.706621 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54498d5797-97l4v" event={"ID":"90af75ae-3389-4c14-a59d-eaaeee34e58f","Type":"ContainerDied","Data":"35b85f2fd588b4e9a2793141145513387c22ef9d1c186ed537d131aad8d2aae1"} Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.706885 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54498d5797-97l4v" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.712076 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e046612d-8016-4321-8a7a-a14b14f68e91-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e046612d-8016-4321-8a7a-a14b14f68e91" (UID: "e046612d-8016-4321-8a7a-a14b14f68e91"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.714751 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ft95f" event={"ID":"e046612d-8016-4321-8a7a-a14b14f68e91","Type":"ContainerDied","Data":"c41392b6b77c4337f8db83a7299d80bb99240dd817b6972b736f6bde36411047"} Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.714809 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c41392b6b77c4337f8db83a7299d80bb99240dd817b6972b736f6bde36411047" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.714943 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ft95f" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.731121 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6867ddbddb-4lg6w" podUID="499a09a1-c3aa-4fa2-95a8-0d2896a1d978" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.159:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.159:8443: connect: connection refused" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.736107 4904 scope.go:117] "RemoveContainer" containerID="063c158c88ef755b139fba63fbb815a2f18013203e130151b85cfe74fa9a296c" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.736382 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"8acb38c5-f0b5-46e9-8cb7-085d0b12b16d","Type":"ContainerDied","Data":"9f98c081d54208eccea9bcd76c5e464e485f61a0fe1bf4d3d431028bd6ef1e5f"} Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.736470 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.771849 4904 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.771881 4904 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e046612d-8016-4321-8a7a-a14b14f68e91-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.779398 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"] Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.794792 4904 scope.go:117] "RemoveContainer" containerID="eca5bb8b29bf1e606bc87c6928a3ad4ac7f410f8c08fe1946df761c22fd6da30" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.795005 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8-config-data" (OuterVolumeSpecName: "config-data") pod "2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8" (UID: "2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.801830 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e046612d-8016-4321-8a7a-a14b14f68e91-config-data" (OuterVolumeSpecName: "config-data") pod "e046612d-8016-4321-8a7a-a14b14f68e91" (UID: "e046612d-8016-4321-8a7a-a14b14f68e91"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.805757 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-applier-0"] Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.836306 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Dec 05 20:33:20 crc kubenswrapper[4904]: E1205 20:33:20.836716 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="659969bf-c700-4cb7-b8c0-1d30873b3d0a" containerName="dnsmasq-dns" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.836729 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="659969bf-c700-4cb7-b8c0-1d30873b3d0a" containerName="dnsmasq-dns" Dec 05 20:33:20 crc kubenswrapper[4904]: E1205 20:33:20.836741 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e046612d-8016-4321-8a7a-a14b14f68e91" containerName="keystone-bootstrap" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.836747 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="e046612d-8016-4321-8a7a-a14b14f68e91" containerName="keystone-bootstrap" Dec 05 20:33:20 crc kubenswrapper[4904]: E1205 20:33:20.836760 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90af75ae-3389-4c14-a59d-eaaeee34e58f" containerName="init" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.836766 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="90af75ae-3389-4c14-a59d-eaaeee34e58f" containerName="init" Dec 05 20:33:20 crc kubenswrapper[4904]: E1205 20:33:20.836782 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90af75ae-3389-4c14-a59d-eaaeee34e58f" containerName="dnsmasq-dns" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.836787 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="90af75ae-3389-4c14-a59d-eaaeee34e58f" containerName="dnsmasq-dns" Dec 05 20:33:20 crc kubenswrapper[4904]: E1205 20:33:20.836825 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="659969bf-c700-4cb7-b8c0-1d30873b3d0a" containerName="init" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.836831 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="659969bf-c700-4cb7-b8c0-1d30873b3d0a" containerName="init" Dec 05 20:33:20 crc kubenswrapper[4904]: E1205 20:33:20.836838 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8acb38c5-f0b5-46e9-8cb7-085d0b12b16d" containerName="watcher-applier" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.836845 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="8acb38c5-f0b5-46e9-8cb7-085d0b12b16d" containerName="watcher-applier" Dec 05 20:33:20 crc kubenswrapper[4904]: E1205 20:33:20.836859 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8698908a-a9e4-46c2-9855-2a1db92b1d75" containerName="watcher-decision-engine" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.836864 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="8698908a-a9e4-46c2-9855-2a1db92b1d75" containerName="watcher-decision-engine" Dec 05 20:33:20 crc 
kubenswrapper[4904]: E1205 20:33:20.836876 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8" containerName="placement-db-sync" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.836881 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8" containerName="placement-db-sync" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.837079 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="8698908a-a9e4-46c2-9855-2a1db92b1d75" containerName="watcher-decision-engine" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.837100 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="90af75ae-3389-4c14-a59d-eaaeee34e58f" containerName="dnsmasq-dns" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.837113 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8" containerName="placement-db-sync" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.837122 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="e046612d-8016-4321-8a7a-a14b14f68e91" containerName="keystone-bootstrap" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.837129 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="8acb38c5-f0b5-46e9-8cb7-085d0b12b16d" containerName="watcher-applier" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.837136 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="659969bf-c700-4cb7-b8c0-1d30873b3d0a" containerName="dnsmasq-dns" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.837753 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.839725 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.841990 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8698908a-a9e4-46c2-9855-2a1db92b1d75-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8698908a-a9e4-46c2-9855-2a1db92b1d75" (UID: "8698908a-a9e4-46c2-9855-2a1db92b1d75"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.846780 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.871357 4904 scope.go:117] "RemoveContainer" containerID="34108de7470cb550f2341e050a9134e537bba3cf0a18d7e71edad2f8834b50bc" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.873135 4904 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8698908a-a9e4-46c2-9855-2a1db92b1d75-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.873257 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.873348 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e046612d-8016-4321-8a7a-a14b14f68e91-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:20 crc kubenswrapper[4904]: I1205 20:33:20.918904 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-d867d46cb-9mdx2" podUID="ffe3f1f4-8d49-4bf8-a088-e3a930ddc614" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.160:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.160:8443: connect: connection refused" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:20.984433 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxzq9\" (UniqueName: \"kubernetes.io/projected/48f600c5-7f88-4c27-a152-d79354212532-kube-api-access-kxzq9\") pod \"watcher-applier-0\" (UID: \"48f600c5-7f88-4c27-a152-d79354212532\") " pod="openstack/watcher-applier-0" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:20.984565 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48f600c5-7f88-4c27-a152-d79354212532-config-data\") pod \"watcher-applier-0\" (UID: \"48f600c5-7f88-4c27-a152-d79354212532\") " pod="openstack/watcher-applier-0" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:20.984616 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48f600c5-7f88-4c27-a152-d79354212532-logs\") pod \"watcher-applier-0\" (UID: \"48f600c5-7f88-4c27-a152-d79354212532\") " pod="openstack/watcher-applier-0" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:20.984652 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48f600c5-7f88-4c27-a152-d79354212532-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"48f600c5-7f88-4c27-a152-d79354212532\") " pod="openstack/watcher-applier-0" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.024923 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8698908a-a9e4-46c2-9855-2a1db92b1d75-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "8698908a-a9e4-46c2-9855-2a1db92b1d75" (UID: "8698908a-a9e4-46c2-9855-2a1db92b1d75"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.029353 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90af75ae-3389-4c14-a59d-eaaeee34e58f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "90af75ae-3389-4c14-a59d-eaaeee34e58f" (UID: "90af75ae-3389-4c14-a59d-eaaeee34e58f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.038166 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8698908a-a9e4-46c2-9855-2a1db92b1d75-config-data" (OuterVolumeSpecName: "config-data") pod "8698908a-a9e4-46c2-9855-2a1db92b1d75" (UID: "8698908a-a9e4-46c2-9855-2a1db92b1d75"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.047625 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90af75ae-3389-4c14-a59d-eaaeee34e58f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "90af75ae-3389-4c14-a59d-eaaeee34e58f" (UID: "90af75ae-3389-4c14-a59d-eaaeee34e58f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.064498 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90af75ae-3389-4c14-a59d-eaaeee34e58f-config" (OuterVolumeSpecName: "config") pod "90af75ae-3389-4c14-a59d-eaaeee34e58f" (UID: "90af75ae-3389-4c14-a59d-eaaeee34e58f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.066758 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90af75ae-3389-4c14-a59d-eaaeee34e58f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "90af75ae-3389-4c14-a59d-eaaeee34e58f" (UID: "90af75ae-3389-4c14-a59d-eaaeee34e58f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.072911 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90af75ae-3389-4c14-a59d-eaaeee34e58f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "90af75ae-3389-4c14-a59d-eaaeee34e58f" (UID: "90af75ae-3389-4c14-a59d-eaaeee34e58f"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.086257 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48f600c5-7f88-4c27-a152-d79354212532-config-data\") pod \"watcher-applier-0\" (UID: \"48f600c5-7f88-4c27-a152-d79354212532\") " pod="openstack/watcher-applier-0" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.086359 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48f600c5-7f88-4c27-a152-d79354212532-logs\") pod \"watcher-applier-0\" (UID: \"48f600c5-7f88-4c27-a152-d79354212532\") " pod="openstack/watcher-applier-0" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.086475 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48f600c5-7f88-4c27-a152-d79354212532-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"48f600c5-7f88-4c27-a152-d79354212532\") " pod="openstack/watcher-applier-0" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.086546 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxzq9\" (UniqueName: \"kubernetes.io/projected/48f600c5-7f88-4c27-a152-d79354212532-kube-api-access-kxzq9\") pod \"watcher-applier-0\" (UID: \"48f600c5-7f88-4c27-a152-d79354212532\") " pod="openstack/watcher-applier-0" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.086698 4904 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90af75ae-3389-4c14-a59d-eaaeee34e58f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.086713 4904 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90af75ae-3389-4c14-a59d-eaaeee34e58f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.086725 4904 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90af75ae-3389-4c14-a59d-eaaeee34e58f-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.086761 4904 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8698908a-a9e4-46c2-9855-2a1db92b1d75-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.086775 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8698908a-a9e4-46c2-9855-2a1db92b1d75-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.086785 4904 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90af75ae-3389-4c14-a59d-eaaeee34e58f-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.086796 4904 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90af75ae-3389-4c14-a59d-eaaeee34e58f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.089672 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/48f600c5-7f88-4c27-a152-d79354212532-logs\") pod \"watcher-applier-0\" (UID: \"48f600c5-7f88-4c27-a152-d79354212532\") " pod="openstack/watcher-applier-0" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.092379 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48f600c5-7f88-4c27-a152-d79354212532-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"48f600c5-7f88-4c27-a152-d79354212532\") " pod="openstack/watcher-applier-0" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.096693 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48f600c5-7f88-4c27-a152-d79354212532-config-data\") pod \"watcher-applier-0\" (UID: \"48f600c5-7f88-4c27-a152-d79354212532\") " pod="openstack/watcher-applier-0" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.105927 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxzq9\" (UniqueName: \"kubernetes.io/projected/48f600c5-7f88-4c27-a152-d79354212532-kube-api-access-kxzq9\") pod \"watcher-applier-0\" (UID: \"48f600c5-7f88-4c27-a152-d79354212532\") " pod="openstack/watcher-applier-0" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.155111 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.403155 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54498d5797-97l4v"] Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.420239 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54498d5797-97l4v"] Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.426138 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.439334 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.449091 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-64fdbdb744-gdc8l"] Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.450744 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-64fdbdb744-gdc8l" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.453585 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.456712 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.456941 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-ddf7h" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.457073 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.457178 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.457275 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.457753 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-64fdbdb744-gdc8l"] Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.467034 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.468475 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.477151 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.482416 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.492600 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a-fernet-keys\") pod \"keystone-64fdbdb744-gdc8l\" (UID: \"2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a\") " pod="openstack/keystone-64fdbdb744-gdc8l" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.492641 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28e509d6-d15b-44e6-9afa-05a347c2a7a5-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"28e509d6-d15b-44e6-9afa-05a347c2a7a5\") " pod="openstack/watcher-decision-engine-0" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.492670 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a-internal-tls-certs\") pod \"keystone-64fdbdb744-gdc8l\" (UID: \"2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a\") " pod="openstack/keystone-64fdbdb744-gdc8l" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.492687 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tzsn\" (UniqueName: \"kubernetes.io/projected/2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a-kube-api-access-7tzsn\") pod \"keystone-64fdbdb744-gdc8l\" (UID: \"2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a\") " pod="openstack/keystone-64fdbdb744-gdc8l" Dec 05 20:33:21 crc 
kubenswrapper[4904]: I1205 20:33:21.492703 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/28e509d6-d15b-44e6-9afa-05a347c2a7a5-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"28e509d6-d15b-44e6-9afa-05a347c2a7a5\") " pod="openstack/watcher-decision-engine-0" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.492720 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28e509d6-d15b-44e6-9afa-05a347c2a7a5-config-data\") pod \"watcher-decision-engine-0\" (UID: \"28e509d6-d15b-44e6-9afa-05a347c2a7a5\") " pod="openstack/watcher-decision-engine-0" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.492752 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a-config-data\") pod \"keystone-64fdbdb744-gdc8l\" (UID: \"2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a\") " pod="openstack/keystone-64fdbdb744-gdc8l" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.492766 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a-credential-keys\") pod \"keystone-64fdbdb744-gdc8l\" (UID: \"2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a\") " pod="openstack/keystone-64fdbdb744-gdc8l" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.492810 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a-public-tls-certs\") pod \"keystone-64fdbdb744-gdc8l\" (UID: \"2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a\") " pod="openstack/keystone-64fdbdb744-gdc8l" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.492841 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a-scripts\") pod \"keystone-64fdbdb744-gdc8l\" (UID: \"2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a\") " pod="openstack/keystone-64fdbdb744-gdc8l" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.492865 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8f54\" (UniqueName: \"kubernetes.io/projected/28e509d6-d15b-44e6-9afa-05a347c2a7a5-kube-api-access-m8f54\") pod \"watcher-decision-engine-0\" (UID: \"28e509d6-d15b-44e6-9afa-05a347c2a7a5\") " pod="openstack/watcher-decision-engine-0" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.492890 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a-combined-ca-bundle\") pod \"keystone-64fdbdb744-gdc8l\" (UID: \"2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a\") " pod="openstack/keystone-64fdbdb744-gdc8l" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.492924 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28e509d6-d15b-44e6-9afa-05a347c2a7a5-logs\") pod \"watcher-decision-engine-0\" (UID: \"28e509d6-d15b-44e6-9afa-05a347c2a7a5\") " 
pod="openstack/watcher-decision-engine-0" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.513182 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-c6d7bc49b-2f688"] Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.514879 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c6d7bc49b-2f688" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.522338 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.522560 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.522697 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.522768 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-sjsm8" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.522811 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.534298 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c6d7bc49b-2f688"] Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.594717 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmwgc\" (UniqueName: \"kubernetes.io/projected/4879cccb-c711-458b-8ae5-895ec70f6536-kube-api-access-wmwgc\") pod \"placement-c6d7bc49b-2f688\" (UID: \"4879cccb-c711-458b-8ae5-895ec70f6536\") " pod="openstack/placement-c6d7bc49b-2f688" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.594776 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a-config-data\") pod \"keystone-64fdbdb744-gdc8l\" (UID: \"2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a\") " pod="openstack/keystone-64fdbdb744-gdc8l" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.594794 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a-credential-keys\") pod \"keystone-64fdbdb744-gdc8l\" (UID: \"2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a\") " pod="openstack/keystone-64fdbdb744-gdc8l" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.594813 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a-public-tls-certs\") pod \"keystone-64fdbdb744-gdc8l\" (UID: \"2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a\") " pod="openstack/keystone-64fdbdb744-gdc8l" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.594841 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a-scripts\") pod \"keystone-64fdbdb744-gdc8l\" (UID: \"2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a\") " pod="openstack/keystone-64fdbdb744-gdc8l" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.594868 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4879cccb-c711-458b-8ae5-895ec70f6536-public-tls-certs\") pod \"placement-c6d7bc49b-2f688\" (UID: \"4879cccb-c711-458b-8ae5-895ec70f6536\") " pod="openstack/placement-c6d7bc49b-2f688" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.594887 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8f54\" (UniqueName: \"kubernetes.io/projected/28e509d6-d15b-44e6-9afa-05a347c2a7a5-kube-api-access-m8f54\") pod \"watcher-decision-engine-0\" (UID: \"28e509d6-d15b-44e6-9afa-05a347c2a7a5\") " pod="openstack/watcher-decision-engine-0" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.594917 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a-combined-ca-bundle\") pod \"keystone-64fdbdb744-gdc8l\" (UID: \"2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a\") " pod="openstack/keystone-64fdbdb744-gdc8l" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.594933 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4879cccb-c711-458b-8ae5-895ec70f6536-internal-tls-certs\") pod \"placement-c6d7bc49b-2f688\" (UID: \"4879cccb-c711-458b-8ae5-895ec70f6536\") " pod="openstack/placement-c6d7bc49b-2f688" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.594966 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4879cccb-c711-458b-8ae5-895ec70f6536-config-data\") pod \"placement-c6d7bc49b-2f688\" (UID: \"4879cccb-c711-458b-8ae5-895ec70f6536\") " pod="openstack/placement-c6d7bc49b-2f688" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.594985 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28e509d6-d15b-44e6-9afa-05a347c2a7a5-logs\") pod \"watcher-decision-engine-0\" (UID: \"28e509d6-d15b-44e6-9afa-05a347c2a7a5\") " pod="openstack/watcher-decision-engine-0" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.595024 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4879cccb-c711-458b-8ae5-895ec70f6536-scripts\") pod \"placement-c6d7bc49b-2f688\" (UID: \"4879cccb-c711-458b-8ae5-895ec70f6536\") " pod="openstack/placement-c6d7bc49b-2f688" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.595048 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a-fernet-keys\") pod \"keystone-64fdbdb744-gdc8l\" (UID: \"2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a\") " pod="openstack/keystone-64fdbdb744-gdc8l" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.595082 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28e509d6-d15b-44e6-9afa-05a347c2a7a5-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"28e509d6-d15b-44e6-9afa-05a347c2a7a5\") " pod="openstack/watcher-decision-engine-0" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.595100 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4879cccb-c711-458b-8ae5-895ec70f6536-combined-ca-bundle\") pod \"placement-c6d7bc49b-2f688\" (UID: \"4879cccb-c711-458b-8ae5-895ec70f6536\") " pod="openstack/placement-c6d7bc49b-2f688" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.595118 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4879cccb-c711-458b-8ae5-895ec70f6536-logs\") pod \"placement-c6d7bc49b-2f688\" (UID: \"4879cccb-c711-458b-8ae5-895ec70f6536\") " pod="openstack/placement-c6d7bc49b-2f688" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.595137 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a-internal-tls-certs\") pod \"keystone-64fdbdb744-gdc8l\" (UID: \"2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a\") " pod="openstack/keystone-64fdbdb744-gdc8l" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.595151 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tzsn\" (UniqueName: \"kubernetes.io/projected/2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a-kube-api-access-7tzsn\") pod \"keystone-64fdbdb744-gdc8l\" (UID: \"2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a\") " pod="openstack/keystone-64fdbdb744-gdc8l" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.595166 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/28e509d6-d15b-44e6-9afa-05a347c2a7a5-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"28e509d6-d15b-44e6-9afa-05a347c2a7a5\") " pod="openstack/watcher-decision-engine-0" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.595197 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28e509d6-d15b-44e6-9afa-05a347c2a7a5-config-data\") pod \"watcher-decision-engine-0\" (UID: \"28e509d6-d15b-44e6-9afa-05a347c2a7a5\") " pod="openstack/watcher-decision-engine-0" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.600635 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a-scripts\") pod \"keystone-64fdbdb744-gdc8l\" (UID: \"2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a\") " pod="openstack/keystone-64fdbdb744-gdc8l" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.601528 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28e509d6-d15b-44e6-9afa-05a347c2a7a5-logs\") pod \"watcher-decision-engine-0\" (UID: \"28e509d6-d15b-44e6-9afa-05a347c2a7a5\") " pod="openstack/watcher-decision-engine-0" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.602482 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a-combined-ca-bundle\") pod \"keystone-64fdbdb744-gdc8l\" (UID: \"2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a\") " pod="openstack/keystone-64fdbdb744-gdc8l" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.608519 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a-internal-tls-certs\") pod \"keystone-64fdbdb744-gdc8l\" (UID: 
\"2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a\") " pod="openstack/keystone-64fdbdb744-gdc8l" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.611735 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a-config-data\") pod \"keystone-64fdbdb744-gdc8l\" (UID: \"2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a\") " pod="openstack/keystone-64fdbdb744-gdc8l" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.614479 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a-fernet-keys\") pod \"keystone-64fdbdb744-gdc8l\" (UID: \"2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a\") " pod="openstack/keystone-64fdbdb744-gdc8l" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.614976 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28e509d6-d15b-44e6-9afa-05a347c2a7a5-config-data\") pod \"watcher-decision-engine-0\" (UID: \"28e509d6-d15b-44e6-9afa-05a347c2a7a5\") " pod="openstack/watcher-decision-engine-0" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.615544 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28e509d6-d15b-44e6-9afa-05a347c2a7a5-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"28e509d6-d15b-44e6-9afa-05a347c2a7a5\") " pod="openstack/watcher-decision-engine-0" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.615777 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/28e509d6-d15b-44e6-9afa-05a347c2a7a5-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"28e509d6-d15b-44e6-9afa-05a347c2a7a5\") " pod="openstack/watcher-decision-engine-0" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.619485 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a-credential-keys\") pod \"keystone-64fdbdb744-gdc8l\" (UID: \"2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a\") " pod="openstack/keystone-64fdbdb744-gdc8l" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.624540 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a-public-tls-certs\") pod \"keystone-64fdbdb744-gdc8l\" (UID: \"2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a\") " pod="openstack/keystone-64fdbdb744-gdc8l" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.627235 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tzsn\" (UniqueName: \"kubernetes.io/projected/2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a-kube-api-access-7tzsn\") pod \"keystone-64fdbdb744-gdc8l\" (UID: \"2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a\") " pod="openstack/keystone-64fdbdb744-gdc8l" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.634603 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8f54\" (UniqueName: \"kubernetes.io/projected/28e509d6-d15b-44e6-9afa-05a347c2a7a5-kube-api-access-m8f54\") pod \"watcher-decision-engine-0\" (UID: \"28e509d6-d15b-44e6-9afa-05a347c2a7a5\") " pod="openstack/watcher-decision-engine-0" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.657631 4904 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.718414 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4879cccb-c711-458b-8ae5-895ec70f6536-public-tls-certs\") pod \"placement-c6d7bc49b-2f688\" (UID: \"4879cccb-c711-458b-8ae5-895ec70f6536\") " pod="openstack/placement-c6d7bc49b-2f688" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.718581 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4879cccb-c711-458b-8ae5-895ec70f6536-internal-tls-certs\") pod \"placement-c6d7bc49b-2f688\" (UID: \"4879cccb-c711-458b-8ae5-895ec70f6536\") " pod="openstack/placement-c6d7bc49b-2f688" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.737510 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4879cccb-c711-458b-8ae5-895ec70f6536-config-data\") pod \"placement-c6d7bc49b-2f688\" (UID: \"4879cccb-c711-458b-8ae5-895ec70f6536\") " pod="openstack/placement-c6d7bc49b-2f688" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.737677 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4879cccb-c711-458b-8ae5-895ec70f6536-scripts\") pod \"placement-c6d7bc49b-2f688\" (UID: \"4879cccb-c711-458b-8ae5-895ec70f6536\") " pod="openstack/placement-c6d7bc49b-2f688" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.737790 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4879cccb-c711-458b-8ae5-895ec70f6536-combined-ca-bundle\") pod \"placement-c6d7bc49b-2f688\" (UID: \"4879cccb-c711-458b-8ae5-895ec70f6536\") " pod="openstack/placement-c6d7bc49b-2f688" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.737838 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4879cccb-c711-458b-8ae5-895ec70f6536-logs\") pod \"placement-c6d7bc49b-2f688\" (UID: \"4879cccb-c711-458b-8ae5-895ec70f6536\") " pod="openstack/placement-c6d7bc49b-2f688" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.737936 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmwgc\" (UniqueName: \"kubernetes.io/projected/4879cccb-c711-458b-8ae5-895ec70f6536-kube-api-access-wmwgc\") pod \"placement-c6d7bc49b-2f688\" (UID: \"4879cccb-c711-458b-8ae5-895ec70f6536\") " pod="openstack/placement-c6d7bc49b-2f688" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.744388 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4879cccb-c711-458b-8ae5-895ec70f6536-logs\") pod \"placement-c6d7bc49b-2f688\" (UID: \"4879cccb-c711-458b-8ae5-895ec70f6536\") " pod="openstack/placement-c6d7bc49b-2f688" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.747480 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8698908a-a9e4-46c2-9855-2a1db92b1d75" path="/var/lib/kubelet/pods/8698908a-a9e4-46c2-9855-2a1db92b1d75/volumes" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.750843 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4879cccb-c711-458b-8ae5-895ec70f6536-internal-tls-certs\") pod \"placement-c6d7bc49b-2f688\" (UID: \"4879cccb-c711-458b-8ae5-895ec70f6536\") " pod="openstack/placement-c6d7bc49b-2f688" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.752931 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4879cccb-c711-458b-8ae5-895ec70f6536-public-tls-certs\") pod \"placement-c6d7bc49b-2f688\" (UID: \"4879cccb-c711-458b-8ae5-895ec70f6536\") " pod="openstack/placement-c6d7bc49b-2f688" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.754316 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4879cccb-c711-458b-8ae5-895ec70f6536-scripts\") pod \"placement-c6d7bc49b-2f688\" (UID: \"4879cccb-c711-458b-8ae5-895ec70f6536\") " pod="openstack/placement-c6d7bc49b-2f688" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.756690 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4879cccb-c711-458b-8ae5-895ec70f6536-combined-ca-bundle\") pod \"placement-c6d7bc49b-2f688\" (UID: \"4879cccb-c711-458b-8ae5-895ec70f6536\") " pod="openstack/placement-c6d7bc49b-2f688" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.757702 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8acb38c5-f0b5-46e9-8cb7-085d0b12b16d" path="/var/lib/kubelet/pods/8acb38c5-f0b5-46e9-8cb7-085d0b12b16d/volumes" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.758850 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4879cccb-c711-458b-8ae5-895ec70f6536-config-data\") pod \"placement-c6d7bc49b-2f688\" (UID: \"4879cccb-c711-458b-8ae5-895ec70f6536\") " pod="openstack/placement-c6d7bc49b-2f688" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.759944 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90af75ae-3389-4c14-a59d-eaaeee34e58f" path="/var/lib/kubelet/pods/90af75ae-3389-4c14-a59d-eaaeee34e58f/volumes" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.762179 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmwgc\" (UniqueName: \"kubernetes.io/projected/4879cccb-c711-458b-8ae5-895ec70f6536-kube-api-access-wmwgc\") pod \"placement-c6d7bc49b-2f688\" (UID: \"4879cccb-c711-458b-8ae5-895ec70f6536\") " pod="openstack/placement-c6d7bc49b-2f688" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.764451 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-gtfpr" event={"ID":"5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b","Type":"ContainerStarted","Data":"ba7b4df22e89a35b4c6557fd502623f1c46ac646b4218b8e0d233117e64ff38e"} Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.773662 4904 generic.go:334] "Generic (PLEG): container finished" podID="f0a41473-f3de-440f-89be-9fddf77f6148" containerID="39156239472b92dabca22f1a6012db800be8051e59e0bc036717ecd11904ceab" exitCode=0 Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.773775 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-wh6jz" event={"ID":"f0a41473-f3de-440f-89be-9fddf77f6148","Type":"ContainerDied","Data":"39156239472b92dabca22f1a6012db800be8051e59e0bc036717ecd11904ceab"} Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.789566 4904 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ceilometer-0" event={"ID":"81a8e33f-1179-4192-9d21-b0f520c41656","Type":"ContainerStarted","Data":"c80d41f49bbce44078ae017a703ab69465e7e3cab4f393288fb4a4b5e4ae8490"} Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.791073 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-64fdbdb744-gdc8l" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.792147 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-gtfpr" podStartSLOduration=2.928562436 podStartE2EDuration="1m2.792130559s" podCreationTimestamp="2025-12-05 20:32:19 +0000 UTC" firstStartedPulling="2025-12-05 20:32:20.221582357 +0000 UTC m=+1239.032798466" lastFinishedPulling="2025-12-05 20:33:20.08515048 +0000 UTC m=+1298.896366589" observedRunningTime="2025-12-05 20:33:21.789946922 +0000 UTC m=+1300.601163031" watchObservedRunningTime="2025-12-05 20:33:21.792130559 +0000 UTC m=+1300.603346668" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.793591 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"48f600c5-7f88-4c27-a152-d79354212532","Type":"ContainerStarted","Data":"23c40718fdf63e48037b7d58ae6111a30fc97e837cbe611f6b5a339b2cc46cc0"} Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.807846 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Dec 05 20:33:21 crc kubenswrapper[4904]: I1205 20:33:21.849243 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c6d7bc49b-2f688" Dec 05 20:33:22 crc kubenswrapper[4904]: I1205 20:33:22.418508 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-64fdbdb744-gdc8l"] Dec 05 20:33:22 crc kubenswrapper[4904]: I1205 20:33:22.646204 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 05 20:33:22 crc kubenswrapper[4904]: I1205 20:33:22.846714 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-64fdbdb744-gdc8l" event={"ID":"2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a","Type":"ContainerStarted","Data":"74852d9e0cc70a17cdbd41aa85fa80e2452d6ef79fbf959390e169656cdc9efe"} Dec 05 20:33:22 crc kubenswrapper[4904]: I1205 20:33:22.846769 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-64fdbdb744-gdc8l" event={"ID":"2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a","Type":"ContainerStarted","Data":"a124499eefa46905dc164fc0c93288f5130b3384fc92910684a3b3226fee2245"} Dec 05 20:33:22 crc kubenswrapper[4904]: I1205 20:33:22.852398 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-64fdbdb744-gdc8l" Dec 05 20:33:22 crc kubenswrapper[4904]: W1205 20:33:22.865299 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4879cccb_c711_458b_8ae5_895ec70f6536.slice/crio-0121398bab52f79038c62e0c6e4218c2e6cf58945e5da7dfc66d2151b41d96f5 WatchSource:0}: Error finding container 0121398bab52f79038c62e0c6e4218c2e6cf58945e5da7dfc66d2151b41d96f5: Status 404 returned error can't find the container with id 0121398bab52f79038c62e0c6e4218c2e6cf58945e5da7dfc66d2151b41d96f5 Dec 05 20:33:22 crc kubenswrapper[4904]: I1205 20:33:22.868868 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c6d7bc49b-2f688"] Dec 05 20:33:22 crc kubenswrapper[4904]: I1205 20:33:22.874016 4904 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"48f600c5-7f88-4c27-a152-d79354212532","Type":"ContainerStarted","Data":"97e7d8a25526156409eb88d0a5add326205aa24d64613da19cb4f43efe235129"} Dec 05 20:33:22 crc kubenswrapper[4904]: I1205 20:33:22.892576 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-64fdbdb744-gdc8l" podStartSLOduration=1.892550109 podStartE2EDuration="1.892550109s" podCreationTimestamp="2025-12-05 20:33:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:33:22.887084505 +0000 UTC m=+1301.698300624" watchObservedRunningTime="2025-12-05 20:33:22.892550109 +0000 UTC m=+1301.703766218" Dec 05 20:33:22 crc kubenswrapper[4904]: I1205 20:33:22.897398 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"28e509d6-d15b-44e6-9afa-05a347c2a7a5","Type":"ContainerStarted","Data":"9cf97b92f3f9b85502353db4d106cfa060774d753bf85e79a8891b9bc516b61f"} Dec 05 20:33:22 crc kubenswrapper[4904]: I1205 20:33:22.912827 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=2.912808703 podStartE2EDuration="2.912808703s" podCreationTimestamp="2025-12-05 20:33:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:33:22.909373282 +0000 UTC m=+1301.720589391" watchObservedRunningTime="2025-12-05 20:33:22.912808703 +0000 UTC m=+1301.724024812" Dec 05 20:33:23 crc kubenswrapper[4904]: I1205 20:33:23.396235 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-wh6jz" Dec 05 20:33:23 crc kubenswrapper[4904]: I1205 20:33:23.515751 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0a41473-f3de-440f-89be-9fddf77f6148-scripts\") pod \"f0a41473-f3de-440f-89be-9fddf77f6148\" (UID: \"f0a41473-f3de-440f-89be-9fddf77f6148\") " Dec 05 20:33:23 crc kubenswrapper[4904]: I1205 20:33:23.515834 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0a41473-f3de-440f-89be-9fddf77f6148-config-data\") pod \"f0a41473-f3de-440f-89be-9fddf77f6148\" (UID: \"f0a41473-f3de-440f-89be-9fddf77f6148\") " Dec 05 20:33:23 crc kubenswrapper[4904]: I1205 20:33:23.515973 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f0a41473-f3de-440f-89be-9fddf77f6148-db-sync-config-data\") pod \"f0a41473-f3de-440f-89be-9fddf77f6148\" (UID: \"f0a41473-f3de-440f-89be-9fddf77f6148\") " Dec 05 20:33:23 crc kubenswrapper[4904]: I1205 20:33:23.516004 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0a41473-f3de-440f-89be-9fddf77f6148-combined-ca-bundle\") pod \"f0a41473-f3de-440f-89be-9fddf77f6148\" (UID: \"f0a41473-f3de-440f-89be-9fddf77f6148\") " Dec 05 20:33:23 crc kubenswrapper[4904]: I1205 20:33:23.516073 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rztw\" (UniqueName: \"kubernetes.io/projected/f0a41473-f3de-440f-89be-9fddf77f6148-kube-api-access-9rztw\") pod \"f0a41473-f3de-440f-89be-9fddf77f6148\" (UID: \"f0a41473-f3de-440f-89be-9fddf77f6148\") " Dec 05 20:33:23 crc kubenswrapper[4904]: I1205 20:33:23.516130 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f0a41473-f3de-440f-89be-9fddf77f6148-etc-machine-id\") pod \"f0a41473-f3de-440f-89be-9fddf77f6148\" (UID: \"f0a41473-f3de-440f-89be-9fddf77f6148\") " Dec 05 20:33:23 crc kubenswrapper[4904]: I1205 20:33:23.516615 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f0a41473-f3de-440f-89be-9fddf77f6148-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f0a41473-f3de-440f-89be-9fddf77f6148" (UID: "f0a41473-f3de-440f-89be-9fddf77f6148"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:33:23 crc kubenswrapper[4904]: I1205 20:33:23.525966 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0a41473-f3de-440f-89be-9fddf77f6148-kube-api-access-9rztw" (OuterVolumeSpecName: "kube-api-access-9rztw") pod "f0a41473-f3de-440f-89be-9fddf77f6148" (UID: "f0a41473-f3de-440f-89be-9fddf77f6148"). InnerVolumeSpecName "kube-api-access-9rztw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:33:23 crc kubenswrapper[4904]: I1205 20:33:23.526363 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0a41473-f3de-440f-89be-9fddf77f6148-scripts" (OuterVolumeSpecName: "scripts") pod "f0a41473-f3de-440f-89be-9fddf77f6148" (UID: "f0a41473-f3de-440f-89be-9fddf77f6148"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:23 crc kubenswrapper[4904]: I1205 20:33:23.531756 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0a41473-f3de-440f-89be-9fddf77f6148-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "f0a41473-f3de-440f-89be-9fddf77f6148" (UID: "f0a41473-f3de-440f-89be-9fddf77f6148"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:23 crc kubenswrapper[4904]: I1205 20:33:23.600672 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0a41473-f3de-440f-89be-9fddf77f6148-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0a41473-f3de-440f-89be-9fddf77f6148" (UID: "f0a41473-f3de-440f-89be-9fddf77f6148"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:23 crc kubenswrapper[4904]: I1205 20:33:23.619250 4904 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0a41473-f3de-440f-89be-9fddf77f6148-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:23 crc kubenswrapper[4904]: I1205 20:33:23.619285 4904 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f0a41473-f3de-440f-89be-9fddf77f6148-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:23 crc kubenswrapper[4904]: I1205 20:33:23.619298 4904 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0a41473-f3de-440f-89be-9fddf77f6148-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:23 crc kubenswrapper[4904]: I1205 20:33:23.619311 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rztw\" (UniqueName: \"kubernetes.io/projected/f0a41473-f3de-440f-89be-9fddf77f6148-kube-api-access-9rztw\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:23 crc kubenswrapper[4904]: I1205 20:33:23.619323 4904 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f0a41473-f3de-440f-89be-9fddf77f6148-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:23 crc kubenswrapper[4904]: I1205 20:33:23.619424 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0a41473-f3de-440f-89be-9fddf77f6148-config-data" (OuterVolumeSpecName: "config-data") pod "f0a41473-f3de-440f-89be-9fddf77f6148" (UID: "f0a41473-f3de-440f-89be-9fddf77f6148"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:23 crc kubenswrapper[4904]: I1205 20:33:23.721421 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0a41473-f3de-440f-89be-9fddf77f6148-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:23 crc kubenswrapper[4904]: I1205 20:33:23.933675 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-k6l4d" event={"ID":"b72d1aa8-3933-4153-89ac-a4ffe0667268","Type":"ContainerStarted","Data":"6a062818e034002eee32678ff30187d7a38d293c79de2a5c2b33c719e5f5af70"} Dec 05 20:33:23 crc kubenswrapper[4904]: I1205 20:33:23.965869 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"28e509d6-d15b-44e6-9afa-05a347c2a7a5","Type":"ContainerStarted","Data":"caa0206c24fe9ac1e7d232dcee952c27d17c80d21785ed9cb6434f05ae73b63c"} Dec 05 20:33:23 crc kubenswrapper[4904]: I1205 20:33:23.995327 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-k6l4d" podStartSLOduration=3.948416138 podStartE2EDuration="1m2.995306921s" podCreationTimestamp="2025-12-05 20:32:21 +0000 UTC" firstStartedPulling="2025-12-05 20:32:23.824575481 +0000 UTC m=+1242.635791590" lastFinishedPulling="2025-12-05 20:33:22.871466264 +0000 UTC m=+1301.682682373" observedRunningTime="2025-12-05 20:33:23.963278368 +0000 UTC m=+1302.774494487" watchObservedRunningTime="2025-12-05 20:33:23.995306921 +0000 UTC m=+1302.806523020" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.028392 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=3.028371111 podStartE2EDuration="3.028371111s" podCreationTimestamp="2025-12-05 20:33:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:33:24.018905402 +0000 UTC m=+1302.830121511" watchObservedRunningTime="2025-12-05 20:33:24.028371111 +0000 UTC m=+1302.839587220" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.044683 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c6d7bc49b-2f688" event={"ID":"4879cccb-c711-458b-8ae5-895ec70f6536","Type":"ContainerStarted","Data":"f1b3ba68ada6d0fd608f9f163a2acebff85f5f51891ef9a95791d506ce58acfc"} Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.044725 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c6d7bc49b-2f688" event={"ID":"4879cccb-c711-458b-8ae5-895ec70f6536","Type":"ContainerStarted","Data":"32786f773f0c90466b8d5c29793a295cedcd88c205921ca1ab40480db20cdf6c"} Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.044734 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c6d7bc49b-2f688" event={"ID":"4879cccb-c711-458b-8ae5-895ec70f6536","Type":"ContainerStarted","Data":"0121398bab52f79038c62e0c6e4218c2e6cf58945e5da7dfc66d2151b41d96f5"} Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.045795 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-c6d7bc49b-2f688" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.045823 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-c6d7bc49b-2f688" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.074762 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-wh6jz" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.076098 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-wh6jz" event={"ID":"f0a41473-f3de-440f-89be-9fddf77f6148","Type":"ContainerDied","Data":"66192db94a93fb0ef363e7e41e36cb6dd48c7d9fd5ee92e62353ca4b8acb5f61"} Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.076141 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66192db94a93fb0ef363e7e41e36cb6dd48c7d9fd5ee92e62353ca4b8acb5f61" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.093502 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 20:33:24 crc kubenswrapper[4904]: E1205 20:33:24.096307 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0a41473-f3de-440f-89be-9fddf77f6148" containerName="cinder-db-sync" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.096360 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0a41473-f3de-440f-89be-9fddf77f6148" containerName="cinder-db-sync" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.096855 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0a41473-f3de-440f-89be-9fddf77f6148" containerName="cinder-db-sync" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.106360 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.111881 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.112100 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.112486 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.112654 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-wx7jh" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.117976 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.153099 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8976c9f0-073f-425d-a9b9-f66c96578f31-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8976c9f0-073f-425d-a9b9-f66c96578f31\") " pod="openstack/cinder-scheduler-0" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.153461 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8976c9f0-073f-425d-a9b9-f66c96578f31-config-data\") pod \"cinder-scheduler-0\" (UID: \"8976c9f0-073f-425d-a9b9-f66c96578f31\") " pod="openstack/cinder-scheduler-0" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.153548 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s4z5\" (UniqueName: \"kubernetes.io/projected/8976c9f0-073f-425d-a9b9-f66c96578f31-kube-api-access-2s4z5\") pod \"cinder-scheduler-0\" (UID: \"8976c9f0-073f-425d-a9b9-f66c96578f31\") " pod="openstack/cinder-scheduler-0" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 
20:33:24.153609 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8976c9f0-073f-425d-a9b9-f66c96578f31-scripts\") pod \"cinder-scheduler-0\" (UID: \"8976c9f0-073f-425d-a9b9-f66c96578f31\") " pod="openstack/cinder-scheduler-0" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.153695 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8976c9f0-073f-425d-a9b9-f66c96578f31-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8976c9f0-073f-425d-a9b9-f66c96578f31\") " pod="openstack/cinder-scheduler-0" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.153878 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8976c9f0-073f-425d-a9b9-f66c96578f31-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8976c9f0-073f-425d-a9b9-f66c96578f31\") " pod="openstack/cinder-scheduler-0" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.160564 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-c6d7bc49b-2f688" podStartSLOduration=3.1605454809999998 podStartE2EDuration="3.160545481s" podCreationTimestamp="2025-12-05 20:33:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:33:24.080046251 +0000 UTC m=+1302.891262360" watchObservedRunningTime="2025-12-05 20:33:24.160545481 +0000 UTC m=+1302.971761590" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.187211 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55fdc8d759-tm87z"] Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.190645 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55fdc8d759-tm87z" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.214655 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55fdc8d759-tm87z"] Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.256190 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef666e4f-797c-4868-b3b4-2b834851d840-dns-svc\") pod \"dnsmasq-dns-55fdc8d759-tm87z\" (UID: \"ef666e4f-797c-4868-b3b4-2b834851d840\") " pod="openstack/dnsmasq-dns-55fdc8d759-tm87z" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.256278 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef666e4f-797c-4868-b3b4-2b834851d840-ovsdbserver-sb\") pod \"dnsmasq-dns-55fdc8d759-tm87z\" (UID: \"ef666e4f-797c-4868-b3b4-2b834851d840\") " pod="openstack/dnsmasq-dns-55fdc8d759-tm87z" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.256320 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8976c9f0-073f-425d-a9b9-f66c96578f31-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8976c9f0-073f-425d-a9b9-f66c96578f31\") " pod="openstack/cinder-scheduler-0" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.256365 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8976c9f0-073f-425d-a9b9-f66c96578f31-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8976c9f0-073f-425d-a9b9-f66c96578f31\") " pod="openstack/cinder-scheduler-0" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.256422 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8976c9f0-073f-425d-a9b9-f66c96578f31-config-data\") pod \"cinder-scheduler-0\" (UID: \"8976c9f0-073f-425d-a9b9-f66c96578f31\") " pod="openstack/cinder-scheduler-0" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.256445 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef666e4f-797c-4868-b3b4-2b834851d840-ovsdbserver-nb\") pod \"dnsmasq-dns-55fdc8d759-tm87z\" (UID: \"ef666e4f-797c-4868-b3b4-2b834851d840\") " pod="openstack/dnsmasq-dns-55fdc8d759-tm87z" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.256481 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ef666e4f-797c-4868-b3b4-2b834851d840-dns-swift-storage-0\") pod \"dnsmasq-dns-55fdc8d759-tm87z\" (UID: \"ef666e4f-797c-4868-b3b4-2b834851d840\") " pod="openstack/dnsmasq-dns-55fdc8d759-tm87z" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.256508 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s4z5\" (UniqueName: \"kubernetes.io/projected/8976c9f0-073f-425d-a9b9-f66c96578f31-kube-api-access-2s4z5\") pod \"cinder-scheduler-0\" (UID: \"8976c9f0-073f-425d-a9b9-f66c96578f31\") " pod="openstack/cinder-scheduler-0" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.256543 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sbvz\" (UniqueName: 
\"kubernetes.io/projected/ef666e4f-797c-4868-b3b4-2b834851d840-kube-api-access-9sbvz\") pod \"dnsmasq-dns-55fdc8d759-tm87z\" (UID: \"ef666e4f-797c-4868-b3b4-2b834851d840\") " pod="openstack/dnsmasq-dns-55fdc8d759-tm87z" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.256565 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8976c9f0-073f-425d-a9b9-f66c96578f31-scripts\") pod \"cinder-scheduler-0\" (UID: \"8976c9f0-073f-425d-a9b9-f66c96578f31\") " pod="openstack/cinder-scheduler-0" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.256594 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8976c9f0-073f-425d-a9b9-f66c96578f31-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8976c9f0-073f-425d-a9b9-f66c96578f31\") " pod="openstack/cinder-scheduler-0" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.256626 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef666e4f-797c-4868-b3b4-2b834851d840-config\") pod \"dnsmasq-dns-55fdc8d759-tm87z\" (UID: \"ef666e4f-797c-4868-b3b4-2b834851d840\") " pod="openstack/dnsmasq-dns-55fdc8d759-tm87z" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.256755 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8976c9f0-073f-425d-a9b9-f66c96578f31-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8976c9f0-073f-425d-a9b9-f66c96578f31\") " pod="openstack/cinder-scheduler-0" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.264330 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8976c9f0-073f-425d-a9b9-f66c96578f31-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8976c9f0-073f-425d-a9b9-f66c96578f31\") " pod="openstack/cinder-scheduler-0" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.266883 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8976c9f0-073f-425d-a9b9-f66c96578f31-scripts\") pod \"cinder-scheduler-0\" (UID: \"8976c9f0-073f-425d-a9b9-f66c96578f31\") " pod="openstack/cinder-scheduler-0" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.271289 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8976c9f0-073f-425d-a9b9-f66c96578f31-config-data\") pod \"cinder-scheduler-0\" (UID: \"8976c9f0-073f-425d-a9b9-f66c96578f31\") " pod="openstack/cinder-scheduler-0" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.278784 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8976c9f0-073f-425d-a9b9-f66c96578f31-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8976c9f0-073f-425d-a9b9-f66c96578f31\") " pod="openstack/cinder-scheduler-0" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.282285 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s4z5\" (UniqueName: \"kubernetes.io/projected/8976c9f0-073f-425d-a9b9-f66c96578f31-kube-api-access-2s4z5\") pod \"cinder-scheduler-0\" (UID: \"8976c9f0-073f-425d-a9b9-f66c96578f31\") " pod="openstack/cinder-scheduler-0" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 
20:33:24.358242 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sbvz\" (UniqueName: \"kubernetes.io/projected/ef666e4f-797c-4868-b3b4-2b834851d840-kube-api-access-9sbvz\") pod \"dnsmasq-dns-55fdc8d759-tm87z\" (UID: \"ef666e4f-797c-4868-b3b4-2b834851d840\") " pod="openstack/dnsmasq-dns-55fdc8d759-tm87z" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.358309 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef666e4f-797c-4868-b3b4-2b834851d840-config\") pod \"dnsmasq-dns-55fdc8d759-tm87z\" (UID: \"ef666e4f-797c-4868-b3b4-2b834851d840\") " pod="openstack/dnsmasq-dns-55fdc8d759-tm87z" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.358336 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef666e4f-797c-4868-b3b4-2b834851d840-dns-svc\") pod \"dnsmasq-dns-55fdc8d759-tm87z\" (UID: \"ef666e4f-797c-4868-b3b4-2b834851d840\") " pod="openstack/dnsmasq-dns-55fdc8d759-tm87z" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.358379 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef666e4f-797c-4868-b3b4-2b834851d840-ovsdbserver-sb\") pod \"dnsmasq-dns-55fdc8d759-tm87z\" (UID: \"ef666e4f-797c-4868-b3b4-2b834851d840\") " pod="openstack/dnsmasq-dns-55fdc8d759-tm87z" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.358451 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef666e4f-797c-4868-b3b4-2b834851d840-ovsdbserver-nb\") pod \"dnsmasq-dns-55fdc8d759-tm87z\" (UID: \"ef666e4f-797c-4868-b3b4-2b834851d840\") " pod="openstack/dnsmasq-dns-55fdc8d759-tm87z" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.358470 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ef666e4f-797c-4868-b3b4-2b834851d840-dns-swift-storage-0\") pod \"dnsmasq-dns-55fdc8d759-tm87z\" (UID: \"ef666e4f-797c-4868-b3b4-2b834851d840\") " pod="openstack/dnsmasq-dns-55fdc8d759-tm87z" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.359371 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ef666e4f-797c-4868-b3b4-2b834851d840-dns-swift-storage-0\") pod \"dnsmasq-dns-55fdc8d759-tm87z\" (UID: \"ef666e4f-797c-4868-b3b4-2b834851d840\") " pod="openstack/dnsmasq-dns-55fdc8d759-tm87z" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.360131 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef666e4f-797c-4868-b3b4-2b834851d840-config\") pod \"dnsmasq-dns-55fdc8d759-tm87z\" (UID: \"ef666e4f-797c-4868-b3b4-2b834851d840\") " pod="openstack/dnsmasq-dns-55fdc8d759-tm87z" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.360728 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef666e4f-797c-4868-b3b4-2b834851d840-dns-svc\") pod \"dnsmasq-dns-55fdc8d759-tm87z\" (UID: \"ef666e4f-797c-4868-b3b4-2b834851d840\") " pod="openstack/dnsmasq-dns-55fdc8d759-tm87z" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.361439 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef666e4f-797c-4868-b3b4-2b834851d840-ovsdbserver-nb\") pod \"dnsmasq-dns-55fdc8d759-tm87z\" (UID: \"ef666e4f-797c-4868-b3b4-2b834851d840\") " pod="openstack/dnsmasq-dns-55fdc8d759-tm87z" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.361468 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef666e4f-797c-4868-b3b4-2b834851d840-ovsdbserver-sb\") pod \"dnsmasq-dns-55fdc8d759-tm87z\" (UID: \"ef666e4f-797c-4868-b3b4-2b834851d840\") " pod="openstack/dnsmasq-dns-55fdc8d759-tm87z" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.396576 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sbvz\" (UniqueName: \"kubernetes.io/projected/ef666e4f-797c-4868-b3b4-2b834851d840-kube-api-access-9sbvz\") pod \"dnsmasq-dns-55fdc8d759-tm87z\" (UID: \"ef666e4f-797c-4868-b3b4-2b834851d840\") " pod="openstack/dnsmasq-dns-55fdc8d759-tm87z" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.414777 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.416901 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.423439 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.460726 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a21673c-0387-4a14-baf3-855e80f0f7ef-scripts\") pod \"cinder-api-0\" (UID: \"8a21673c-0387-4a14-baf3-855e80f0f7ef\") " pod="openstack/cinder-api-0" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.460789 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwl94\" (UniqueName: \"kubernetes.io/projected/8a21673c-0387-4a14-baf3-855e80f0f7ef-kube-api-access-xwl94\") pod \"cinder-api-0\" (UID: \"8a21673c-0387-4a14-baf3-855e80f0f7ef\") " pod="openstack/cinder-api-0" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.460828 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a21673c-0387-4a14-baf3-855e80f0f7ef-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8a21673c-0387-4a14-baf3-855e80f0f7ef\") " pod="openstack/cinder-api-0" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.460892 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a21673c-0387-4a14-baf3-855e80f0f7ef-logs\") pod \"cinder-api-0\" (UID: \"8a21673c-0387-4a14-baf3-855e80f0f7ef\") " pod="openstack/cinder-api-0" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.460976 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a21673c-0387-4a14-baf3-855e80f0f7ef-config-data-custom\") pod \"cinder-api-0\" (UID: \"8a21673c-0387-4a14-baf3-855e80f0f7ef\") " pod="openstack/cinder-api-0" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.460993 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/8a21673c-0387-4a14-baf3-855e80f0f7ef-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8a21673c-0387-4a14-baf3-855e80f0f7ef\") " pod="openstack/cinder-api-0" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.461033 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a21673c-0387-4a14-baf3-855e80f0f7ef-config-data\") pod \"cinder-api-0\" (UID: \"8a21673c-0387-4a14-baf3-855e80f0f7ef\") " pod="openstack/cinder-api-0" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.467855 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.467933 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.533463 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fdc8d759-tm87z" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.570910 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a21673c-0387-4a14-baf3-855e80f0f7ef-logs\") pod \"cinder-api-0\" (UID: \"8a21673c-0387-4a14-baf3-855e80f0f7ef\") " pod="openstack/cinder-api-0" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.571020 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a21673c-0387-4a14-baf3-855e80f0f7ef-config-data-custom\") pod \"cinder-api-0\" (UID: \"8a21673c-0387-4a14-baf3-855e80f0f7ef\") " pod="openstack/cinder-api-0" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.571040 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a21673c-0387-4a14-baf3-855e80f0f7ef-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8a21673c-0387-4a14-baf3-855e80f0f7ef\") " pod="openstack/cinder-api-0" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.571092 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a21673c-0387-4a14-baf3-855e80f0f7ef-config-data\") pod \"cinder-api-0\" (UID: \"8a21673c-0387-4a14-baf3-855e80f0f7ef\") " pod="openstack/cinder-api-0" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.571117 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a21673c-0387-4a14-baf3-855e80f0f7ef-scripts\") pod \"cinder-api-0\" (UID: \"8a21673c-0387-4a14-baf3-855e80f0f7ef\") " pod="openstack/cinder-api-0" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.571136 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwl94\" (UniqueName: \"kubernetes.io/projected/8a21673c-0387-4a14-baf3-855e80f0f7ef-kube-api-access-xwl94\") pod \"cinder-api-0\" (UID: \"8a21673c-0387-4a14-baf3-855e80f0f7ef\") " pod="openstack/cinder-api-0" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.571165 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a21673c-0387-4a14-baf3-855e80f0f7ef-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8a21673c-0387-4a14-baf3-855e80f0f7ef\") " pod="openstack/cinder-api-0" Dec 05 
20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.571232 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a21673c-0387-4a14-baf3-855e80f0f7ef-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8a21673c-0387-4a14-baf3-855e80f0f7ef\") " pod="openstack/cinder-api-0" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.571574 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a21673c-0387-4a14-baf3-855e80f0f7ef-logs\") pod \"cinder-api-0\" (UID: \"8a21673c-0387-4a14-baf3-855e80f0f7ef\") " pod="openstack/cinder-api-0" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.584782 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a21673c-0387-4a14-baf3-855e80f0f7ef-config-data-custom\") pod \"cinder-api-0\" (UID: \"8a21673c-0387-4a14-baf3-855e80f0f7ef\") " pod="openstack/cinder-api-0" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.585608 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a21673c-0387-4a14-baf3-855e80f0f7ef-scripts\") pod \"cinder-api-0\" (UID: \"8a21673c-0387-4a14-baf3-855e80f0f7ef\") " pod="openstack/cinder-api-0" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.592170 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a21673c-0387-4a14-baf3-855e80f0f7ef-config-data\") pod \"cinder-api-0\" (UID: \"8a21673c-0387-4a14-baf3-855e80f0f7ef\") " pod="openstack/cinder-api-0" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.597828 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a21673c-0387-4a14-baf3-855e80f0f7ef-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8a21673c-0387-4a14-baf3-855e80f0f7ef\") " pod="openstack/cinder-api-0" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.613725 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwl94\" (UniqueName: \"kubernetes.io/projected/8a21673c-0387-4a14-baf3-855e80f0f7ef-kube-api-access-xwl94\") pod \"cinder-api-0\" (UID: \"8a21673c-0387-4a14-baf3-855e80f0f7ef\") " pod="openstack/cinder-api-0" Dec 05 20:33:24 crc kubenswrapper[4904]: I1205 20:33:24.761580 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 05 20:33:25 crc kubenswrapper[4904]: I1205 20:33:25.284406 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 20:33:25 crc kubenswrapper[4904]: I1205 20:33:25.488590 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55fdc8d759-tm87z"] Dec 05 20:33:25 crc kubenswrapper[4904]: I1205 20:33:25.619687 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 05 20:33:26 crc kubenswrapper[4904]: I1205 20:33:26.111605 4904 generic.go:334] "Generic (PLEG): container finished" podID="ef666e4f-797c-4868-b3b4-2b834851d840" containerID="2705918eb5aa450c2a5dbe27b4aa1860d764800bc30e280fbaa8ac9e0ab701d5" exitCode=0 Dec 05 20:33:26 crc kubenswrapper[4904]: I1205 20:33:26.111975 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fdc8d759-tm87z" event={"ID":"ef666e4f-797c-4868-b3b4-2b834851d840","Type":"ContainerDied","Data":"2705918eb5aa450c2a5dbe27b4aa1860d764800bc30e280fbaa8ac9e0ab701d5"} Dec 05 20:33:26 crc kubenswrapper[4904]: I1205 20:33:26.112009 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fdc8d759-tm87z" event={"ID":"ef666e4f-797c-4868-b3b4-2b834851d840","Type":"ContainerStarted","Data":"cdf349082d92a762d62ec1d3c8f023e4c8c7bd7acbc26c9b2c38e45b88d08111"} Dec 05 20:33:26 crc kubenswrapper[4904]: I1205 20:33:26.120739 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8a21673c-0387-4a14-baf3-855e80f0f7ef","Type":"ContainerStarted","Data":"69c0c0e6b46bceca5a6ce6a6f6bf237c9d0bf4ddfe10b7962f4b91b75119b753"} Dec 05 20:33:26 crc kubenswrapper[4904]: I1205 20:33:26.122661 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8976c9f0-073f-425d-a9b9-f66c96578f31","Type":"ContainerStarted","Data":"4e60ab67612428eab3ccfadc36e27b8ee4715093cbfcce697da1fdf1e6965768"} Dec 05 20:33:26 crc kubenswrapper[4904]: I1205 20:33:26.155733 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Dec 05 20:33:27 crc kubenswrapper[4904]: I1205 20:33:27.142824 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fdc8d759-tm87z" event={"ID":"ef666e4f-797c-4868-b3b4-2b834851d840","Type":"ContainerStarted","Data":"d8298e8d4f535eb1c80fee20a54265920e6dfe2ab1bb870af5013230452f5244"} Dec 05 20:33:27 crc kubenswrapper[4904]: I1205 20:33:27.143389 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55fdc8d759-tm87z" Dec 05 20:33:27 crc kubenswrapper[4904]: I1205 20:33:27.146518 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8a21673c-0387-4a14-baf3-855e80f0f7ef","Type":"ContainerStarted","Data":"6099eb544e77746db8535e34cae41677c607c4049f87a7d656fca93a82ffbc98"} Dec 05 20:33:27 crc kubenswrapper[4904]: I1205 20:33:27.168945 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55fdc8d759-tm87z" podStartSLOduration=3.168927951 podStartE2EDuration="3.168927951s" podCreationTimestamp="2025-12-05 20:33:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:33:27.164189606 +0000 UTC m=+1305.975405725" watchObservedRunningTime="2025-12-05 20:33:27.168927951 +0000 UTC m=+1305.980144060" Dec 05 20:33:27 crc 
kubenswrapper[4904]: I1205 20:33:27.377673 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 05 20:33:28 crc kubenswrapper[4904]: I1205 20:33:28.158704 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8a21673c-0387-4a14-baf3-855e80f0f7ef","Type":"ContainerStarted","Data":"c719b67471a502d2d311b49f52e0a50172206b215f7601e29b2acf07f5624359"} Dec 05 20:33:28 crc kubenswrapper[4904]: I1205 20:33:28.159030 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 05 20:33:28 crc kubenswrapper[4904]: I1205 20:33:28.158936 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="8a21673c-0387-4a14-baf3-855e80f0f7ef" containerName="cinder-api" containerID="cri-o://c719b67471a502d2d311b49f52e0a50172206b215f7601e29b2acf07f5624359" gracePeriod=30 Dec 05 20:33:28 crc kubenswrapper[4904]: I1205 20:33:28.158898 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="8a21673c-0387-4a14-baf3-855e80f0f7ef" containerName="cinder-api-log" containerID="cri-o://6099eb544e77746db8535e34cae41677c607c4049f87a7d656fca93a82ffbc98" gracePeriod=30 Dec 05 20:33:28 crc kubenswrapper[4904]: I1205 20:33:28.168747 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8976c9f0-073f-425d-a9b9-f66c96578f31","Type":"ContainerStarted","Data":"6c9d379a0f5432797dc4f6682c381a6ac6e3d4d22ca7351ebdf6b7af42a519f7"} Dec 05 20:33:28 crc kubenswrapper[4904]: I1205 20:33:28.168794 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8976c9f0-073f-425d-a9b9-f66c96578f31","Type":"ContainerStarted","Data":"2970e40d837f7480736d855ef553136c29d13d721220fdbf2fb47edff1fa15b1"} Dec 05 20:33:28 crc kubenswrapper[4904]: I1205 20:33:28.190199 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.190179147 podStartE2EDuration="4.190179147s" podCreationTimestamp="2025-12-05 20:33:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:33:28.179888966 +0000 UTC m=+1306.991105085" watchObservedRunningTime="2025-12-05 20:33:28.190179147 +0000 UTC m=+1307.001395256" Dec 05 20:33:28 crc kubenswrapper[4904]: I1205 20:33:28.217749 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.305547598 podStartE2EDuration="4.217729192s" podCreationTimestamp="2025-12-05 20:33:24 +0000 UTC" firstStartedPulling="2025-12-05 20:33:25.338860822 +0000 UTC m=+1304.150076931" lastFinishedPulling="2025-12-05 20:33:26.251042416 +0000 UTC m=+1305.062258525" observedRunningTime="2025-12-05 20:33:28.211876398 +0000 UTC m=+1307.023092507" watchObservedRunningTime="2025-12-05 20:33:28.217729192 +0000 UTC m=+1307.028945301" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.144882 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.181291 4904 generic.go:334] "Generic (PLEG): container finished" podID="8a21673c-0387-4a14-baf3-855e80f0f7ef" containerID="c719b67471a502d2d311b49f52e0a50172206b215f7601e29b2acf07f5624359" exitCode=0 Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.181356 4904 generic.go:334] "Generic (PLEG): container finished" podID="8a21673c-0387-4a14-baf3-855e80f0f7ef" containerID="6099eb544e77746db8535e34cae41677c607c4049f87a7d656fca93a82ffbc98" exitCode=143 Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.181435 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8a21673c-0387-4a14-baf3-855e80f0f7ef","Type":"ContainerDied","Data":"c719b67471a502d2d311b49f52e0a50172206b215f7601e29b2acf07f5624359"} Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.181493 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8a21673c-0387-4a14-baf3-855e80f0f7ef","Type":"ContainerDied","Data":"6099eb544e77746db8535e34cae41677c607c4049f87a7d656fca93a82ffbc98"} Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.181510 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8a21673c-0387-4a14-baf3-855e80f0f7ef","Type":"ContainerDied","Data":"69c0c0e6b46bceca5a6ce6a6f6bf237c9d0bf4ddfe10b7962f4b91b75119b753"} Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.181529 4904 scope.go:117] "RemoveContainer" containerID="c719b67471a502d2d311b49f52e0a50172206b215f7601e29b2acf07f5624359" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.181762 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.185692 4904 generic.go:334] "Generic (PLEG): container finished" podID="28e509d6-d15b-44e6-9afa-05a347c2a7a5" containerID="caa0206c24fe9ac1e7d232dcee952c27d17c80d21785ed9cb6434f05ae73b63c" exitCode=1 Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.186975 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"28e509d6-d15b-44e6-9afa-05a347c2a7a5","Type":"ContainerDied","Data":"caa0206c24fe9ac1e7d232dcee952c27d17c80d21785ed9cb6434f05ae73b63c"} Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.187542 4904 scope.go:117] "RemoveContainer" containerID="caa0206c24fe9ac1e7d232dcee952c27d17c80d21785ed9cb6434f05ae73b63c" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.194967 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a21673c-0387-4a14-baf3-855e80f0f7ef-logs\") pod \"8a21673c-0387-4a14-baf3-855e80f0f7ef\" (UID: \"8a21673c-0387-4a14-baf3-855e80f0f7ef\") " Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.195069 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a21673c-0387-4a14-baf3-855e80f0f7ef-scripts\") pod \"8a21673c-0387-4a14-baf3-855e80f0f7ef\" (UID: \"8a21673c-0387-4a14-baf3-855e80f0f7ef\") " Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.195098 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a21673c-0387-4a14-baf3-855e80f0f7ef-etc-machine-id\") pod \"8a21673c-0387-4a14-baf3-855e80f0f7ef\" (UID: 
\"8a21673c-0387-4a14-baf3-855e80f0f7ef\") " Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.195163 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a21673c-0387-4a14-baf3-855e80f0f7ef-config-data-custom\") pod \"8a21673c-0387-4a14-baf3-855e80f0f7ef\" (UID: \"8a21673c-0387-4a14-baf3-855e80f0f7ef\") " Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.195249 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwl94\" (UniqueName: \"kubernetes.io/projected/8a21673c-0387-4a14-baf3-855e80f0f7ef-kube-api-access-xwl94\") pod \"8a21673c-0387-4a14-baf3-855e80f0f7ef\" (UID: \"8a21673c-0387-4a14-baf3-855e80f0f7ef\") " Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.195322 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a21673c-0387-4a14-baf3-855e80f0f7ef-config-data\") pod \"8a21673c-0387-4a14-baf3-855e80f0f7ef\" (UID: \"8a21673c-0387-4a14-baf3-855e80f0f7ef\") " Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.195349 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a21673c-0387-4a14-baf3-855e80f0f7ef-combined-ca-bundle\") pod \"8a21673c-0387-4a14-baf3-855e80f0f7ef\" (UID: \"8a21673c-0387-4a14-baf3-855e80f0f7ef\") " Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.196380 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a21673c-0387-4a14-baf3-855e80f0f7ef-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8a21673c-0387-4a14-baf3-855e80f0f7ef" (UID: "8a21673c-0387-4a14-baf3-855e80f0f7ef"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.196924 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a21673c-0387-4a14-baf3-855e80f0f7ef-logs" (OuterVolumeSpecName: "logs") pod "8a21673c-0387-4a14-baf3-855e80f0f7ef" (UID: "8a21673c-0387-4a14-baf3-855e80f0f7ef"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.204308 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a21673c-0387-4a14-baf3-855e80f0f7ef-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8a21673c-0387-4a14-baf3-855e80f0f7ef" (UID: "8a21673c-0387-4a14-baf3-855e80f0f7ef"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.218928 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a21673c-0387-4a14-baf3-855e80f0f7ef-kube-api-access-xwl94" (OuterVolumeSpecName: "kube-api-access-xwl94") pod "8a21673c-0387-4a14-baf3-855e80f0f7ef" (UID: "8a21673c-0387-4a14-baf3-855e80f0f7ef"). InnerVolumeSpecName "kube-api-access-xwl94". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.235899 4904 scope.go:117] "RemoveContainer" containerID="6099eb544e77746db8535e34cae41677c607c4049f87a7d656fca93a82ffbc98" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.241489 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a21673c-0387-4a14-baf3-855e80f0f7ef-scripts" (OuterVolumeSpecName: "scripts") pod "8a21673c-0387-4a14-baf3-855e80f0f7ef" (UID: "8a21673c-0387-4a14-baf3-855e80f0f7ef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.244188 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a21673c-0387-4a14-baf3-855e80f0f7ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a21673c-0387-4a14-baf3-855e80f0f7ef" (UID: "8a21673c-0387-4a14-baf3-855e80f0f7ef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.274964 4904 scope.go:117] "RemoveContainer" containerID="c719b67471a502d2d311b49f52e0a50172206b215f7601e29b2acf07f5624359" Dec 05 20:33:29 crc kubenswrapper[4904]: E1205 20:33:29.280376 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c719b67471a502d2d311b49f52e0a50172206b215f7601e29b2acf07f5624359\": container with ID starting with c719b67471a502d2d311b49f52e0a50172206b215f7601e29b2acf07f5624359 not found: ID does not exist" containerID="c719b67471a502d2d311b49f52e0a50172206b215f7601e29b2acf07f5624359" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.280444 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c719b67471a502d2d311b49f52e0a50172206b215f7601e29b2acf07f5624359"} err="failed to get container status \"c719b67471a502d2d311b49f52e0a50172206b215f7601e29b2acf07f5624359\": rpc error: code = NotFound desc = could not find container \"c719b67471a502d2d311b49f52e0a50172206b215f7601e29b2acf07f5624359\": container with ID starting with c719b67471a502d2d311b49f52e0a50172206b215f7601e29b2acf07f5624359 not found: ID does not exist" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.280476 4904 scope.go:117] "RemoveContainer" containerID="6099eb544e77746db8535e34cae41677c607c4049f87a7d656fca93a82ffbc98" Dec 05 20:33:29 crc kubenswrapper[4904]: E1205 20:33:29.282101 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6099eb544e77746db8535e34cae41677c607c4049f87a7d656fca93a82ffbc98\": container with ID starting with 6099eb544e77746db8535e34cae41677c607c4049f87a7d656fca93a82ffbc98 not found: ID does not exist" containerID="6099eb544e77746db8535e34cae41677c607c4049f87a7d656fca93a82ffbc98" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.282159 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6099eb544e77746db8535e34cae41677c607c4049f87a7d656fca93a82ffbc98"} err="failed to get container status \"6099eb544e77746db8535e34cae41677c607c4049f87a7d656fca93a82ffbc98\": rpc error: code = NotFound desc = could not find container \"6099eb544e77746db8535e34cae41677c607c4049f87a7d656fca93a82ffbc98\": container with ID starting with 6099eb544e77746db8535e34cae41677c607c4049f87a7d656fca93a82ffbc98 not found: 
ID does not exist" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.282198 4904 scope.go:117] "RemoveContainer" containerID="c719b67471a502d2d311b49f52e0a50172206b215f7601e29b2acf07f5624359" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.282587 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c719b67471a502d2d311b49f52e0a50172206b215f7601e29b2acf07f5624359"} err="failed to get container status \"c719b67471a502d2d311b49f52e0a50172206b215f7601e29b2acf07f5624359\": rpc error: code = NotFound desc = could not find container \"c719b67471a502d2d311b49f52e0a50172206b215f7601e29b2acf07f5624359\": container with ID starting with c719b67471a502d2d311b49f52e0a50172206b215f7601e29b2acf07f5624359 not found: ID does not exist" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.282609 4904 scope.go:117] "RemoveContainer" containerID="6099eb544e77746db8535e34cae41677c607c4049f87a7d656fca93a82ffbc98" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.285478 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6099eb544e77746db8535e34cae41677c607c4049f87a7d656fca93a82ffbc98"} err="failed to get container status \"6099eb544e77746db8535e34cae41677c607c4049f87a7d656fca93a82ffbc98\": rpc error: code = NotFound desc = could not find container \"6099eb544e77746db8535e34cae41677c607c4049f87a7d656fca93a82ffbc98\": container with ID starting with 6099eb544e77746db8535e34cae41677c607c4049f87a7d656fca93a82ffbc98 not found: ID does not exist" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.298138 4904 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a21673c-0387-4a14-baf3-855e80f0f7ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.298168 4904 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a21673c-0387-4a14-baf3-855e80f0f7ef-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.298177 4904 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a21673c-0387-4a14-baf3-855e80f0f7ef-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.298186 4904 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a21673c-0387-4a14-baf3-855e80f0f7ef-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.298196 4904 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a21673c-0387-4a14-baf3-855e80f0f7ef-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.298203 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwl94\" (UniqueName: \"kubernetes.io/projected/8a21673c-0387-4a14-baf3-855e80f0f7ef-kube-api-access-xwl94\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.316211 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a21673c-0387-4a14-baf3-855e80f0f7ef-config-data" (OuterVolumeSpecName: "config-data") pod "8a21673c-0387-4a14-baf3-855e80f0f7ef" (UID: "8a21673c-0387-4a14-baf3-855e80f0f7ef"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.399658 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a21673c-0387-4a14-baf3-855e80f0f7ef-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.469119 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.526392 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.536548 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.598235 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 05 20:33:29 crc kubenswrapper[4904]: E1205 20:33:29.598717 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a21673c-0387-4a14-baf3-855e80f0f7ef" containerName="cinder-api" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.598744 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a21673c-0387-4a14-baf3-855e80f0f7ef" containerName="cinder-api" Dec 05 20:33:29 crc kubenswrapper[4904]: E1205 20:33:29.598789 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a21673c-0387-4a14-baf3-855e80f0f7ef" containerName="cinder-api-log" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.598798 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a21673c-0387-4a14-baf3-855e80f0f7ef" containerName="cinder-api-log" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.599040 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a21673c-0387-4a14-baf3-855e80f0f7ef" containerName="cinder-api" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.599096 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a21673c-0387-4a14-baf3-855e80f0f7ef" containerName="cinder-api-log" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.600324 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.602154 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.602218 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.604820 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.611772 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.703256 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a21673c-0387-4a14-baf3-855e80f0f7ef" path="/var/lib/kubelet/pods/8a21673c-0387-4a14-baf3-855e80f0f7ef/volumes" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.705222 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dadf2169-1b54-4e50-adff-71504b526259-config-data\") pod \"cinder-api-0\" (UID: \"dadf2169-1b54-4e50-adff-71504b526259\") " pod="openstack/cinder-api-0" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.705295 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dadf2169-1b54-4e50-adff-71504b526259-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"dadf2169-1b54-4e50-adff-71504b526259\") " pod="openstack/cinder-api-0" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.705355 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dadf2169-1b54-4e50-adff-71504b526259-config-data-custom\") pod \"cinder-api-0\" (UID: \"dadf2169-1b54-4e50-adff-71504b526259\") " pod="openstack/cinder-api-0" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.705383 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dadf2169-1b54-4e50-adff-71504b526259-scripts\") pod \"cinder-api-0\" (UID: \"dadf2169-1b54-4e50-adff-71504b526259\") " pod="openstack/cinder-api-0" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.705404 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dadf2169-1b54-4e50-adff-71504b526259-logs\") pod \"cinder-api-0\" (UID: \"dadf2169-1b54-4e50-adff-71504b526259\") " pod="openstack/cinder-api-0" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.705450 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dadf2169-1b54-4e50-adff-71504b526259-etc-machine-id\") pod \"cinder-api-0\" (UID: \"dadf2169-1b54-4e50-adff-71504b526259\") " pod="openstack/cinder-api-0" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.705480 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmdrk\" (UniqueName: \"kubernetes.io/projected/dadf2169-1b54-4e50-adff-71504b526259-kube-api-access-zmdrk\") pod \"cinder-api-0\" (UID: \"dadf2169-1b54-4e50-adff-71504b526259\") " 
pod="openstack/cinder-api-0" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.705514 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dadf2169-1b54-4e50-adff-71504b526259-public-tls-certs\") pod \"cinder-api-0\" (UID: \"dadf2169-1b54-4e50-adff-71504b526259\") " pod="openstack/cinder-api-0" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.705537 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dadf2169-1b54-4e50-adff-71504b526259-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"dadf2169-1b54-4e50-adff-71504b526259\") " pod="openstack/cinder-api-0" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.807404 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dadf2169-1b54-4e50-adff-71504b526259-public-tls-certs\") pod \"cinder-api-0\" (UID: \"dadf2169-1b54-4e50-adff-71504b526259\") " pod="openstack/cinder-api-0" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.807661 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dadf2169-1b54-4e50-adff-71504b526259-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"dadf2169-1b54-4e50-adff-71504b526259\") " pod="openstack/cinder-api-0" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.807738 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dadf2169-1b54-4e50-adff-71504b526259-config-data\") pod \"cinder-api-0\" (UID: \"dadf2169-1b54-4e50-adff-71504b526259\") " pod="openstack/cinder-api-0" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.807804 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dadf2169-1b54-4e50-adff-71504b526259-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"dadf2169-1b54-4e50-adff-71504b526259\") " pod="openstack/cinder-api-0" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.807888 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dadf2169-1b54-4e50-adff-71504b526259-config-data-custom\") pod \"cinder-api-0\" (UID: \"dadf2169-1b54-4e50-adff-71504b526259\") " pod="openstack/cinder-api-0" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.807915 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dadf2169-1b54-4e50-adff-71504b526259-scripts\") pod \"cinder-api-0\" (UID: \"dadf2169-1b54-4e50-adff-71504b526259\") " pod="openstack/cinder-api-0" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.807936 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dadf2169-1b54-4e50-adff-71504b526259-logs\") pod \"cinder-api-0\" (UID: \"dadf2169-1b54-4e50-adff-71504b526259\") " pod="openstack/cinder-api-0" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.808031 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dadf2169-1b54-4e50-adff-71504b526259-etc-machine-id\") pod \"cinder-api-0\" (UID: 
\"dadf2169-1b54-4e50-adff-71504b526259\") " pod="openstack/cinder-api-0" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.808104 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmdrk\" (UniqueName: \"kubernetes.io/projected/dadf2169-1b54-4e50-adff-71504b526259-kube-api-access-zmdrk\") pod \"cinder-api-0\" (UID: \"dadf2169-1b54-4e50-adff-71504b526259\") " pod="openstack/cinder-api-0" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.808536 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dadf2169-1b54-4e50-adff-71504b526259-etc-machine-id\") pod \"cinder-api-0\" (UID: \"dadf2169-1b54-4e50-adff-71504b526259\") " pod="openstack/cinder-api-0" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.809662 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dadf2169-1b54-4e50-adff-71504b526259-logs\") pod \"cinder-api-0\" (UID: \"dadf2169-1b54-4e50-adff-71504b526259\") " pod="openstack/cinder-api-0" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.816950 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dadf2169-1b54-4e50-adff-71504b526259-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"dadf2169-1b54-4e50-adff-71504b526259\") " pod="openstack/cinder-api-0" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.822099 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dadf2169-1b54-4e50-adff-71504b526259-public-tls-certs\") pod \"cinder-api-0\" (UID: \"dadf2169-1b54-4e50-adff-71504b526259\") " pod="openstack/cinder-api-0" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.822554 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dadf2169-1b54-4e50-adff-71504b526259-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"dadf2169-1b54-4e50-adff-71504b526259\") " pod="openstack/cinder-api-0" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.823049 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dadf2169-1b54-4e50-adff-71504b526259-config-data-custom\") pod \"cinder-api-0\" (UID: \"dadf2169-1b54-4e50-adff-71504b526259\") " pod="openstack/cinder-api-0" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.824689 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmdrk\" (UniqueName: \"kubernetes.io/projected/dadf2169-1b54-4e50-adff-71504b526259-kube-api-access-zmdrk\") pod \"cinder-api-0\" (UID: \"dadf2169-1b54-4e50-adff-71504b526259\") " pod="openstack/cinder-api-0" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.825082 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dadf2169-1b54-4e50-adff-71504b526259-config-data\") pod \"cinder-api-0\" (UID: \"dadf2169-1b54-4e50-adff-71504b526259\") " pod="openstack/cinder-api-0" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.828513 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dadf2169-1b54-4e50-adff-71504b526259-scripts\") pod \"cinder-api-0\" (UID: \"dadf2169-1b54-4e50-adff-71504b526259\") " 
pod="openstack/cinder-api-0" Dec 05 20:33:29 crc kubenswrapper[4904]: I1205 20:33:29.918715 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 05 20:33:30 crc kubenswrapper[4904]: I1205 20:33:30.199761 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"28e509d6-d15b-44e6-9afa-05a347c2a7a5","Type":"ContainerStarted","Data":"a00259f9a634df7782c63e78e4add8b555e20a36adcc54a889e7bf30dc526538"} Dec 05 20:33:31 crc kubenswrapper[4904]: I1205 20:33:31.156340 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Dec 05 20:33:31 crc kubenswrapper[4904]: I1205 20:33:31.193990 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" Dec 05 20:33:31 crc kubenswrapper[4904]: I1205 20:33:31.253326 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Dec 05 20:33:31 crc kubenswrapper[4904]: I1205 20:33:31.809450 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Dec 05 20:33:31 crc kubenswrapper[4904]: I1205 20:33:31.845883 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Dec 05 20:33:32 crc kubenswrapper[4904]: I1205 20:33:32.230988 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Dec 05 20:33:32 crc kubenswrapper[4904]: I1205 20:33:32.286230 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Dec 05 20:33:33 crc kubenswrapper[4904]: I1205 20:33:33.496845 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6867ddbddb-4lg6w" Dec 05 20:33:33 crc kubenswrapper[4904]: I1205 20:33:33.507421 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-d867d46cb-9mdx2" Dec 05 20:33:34 crc kubenswrapper[4904]: I1205 20:33:34.269137 4904 generic.go:334] "Generic (PLEG): container finished" podID="b72d1aa8-3933-4153-89ac-a4ffe0667268" containerID="6a062818e034002eee32678ff30187d7a38d293c79de2a5c2b33c719e5f5af70" exitCode=0 Dec 05 20:33:34 crc kubenswrapper[4904]: I1205 20:33:34.269503 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-k6l4d" event={"ID":"b72d1aa8-3933-4153-89ac-a4ffe0667268","Type":"ContainerDied","Data":"6a062818e034002eee32678ff30187d7a38d293c79de2a5c2b33c719e5f5af70"} Dec 05 20:33:34 crc kubenswrapper[4904]: I1205 20:33:34.289407 4904 generic.go:334] "Generic (PLEG): container finished" podID="28e509d6-d15b-44e6-9afa-05a347c2a7a5" containerID="a00259f9a634df7782c63e78e4add8b555e20a36adcc54a889e7bf30dc526538" exitCode=1 Dec 05 20:33:34 crc kubenswrapper[4904]: I1205 20:33:34.289460 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"28e509d6-d15b-44e6-9afa-05a347c2a7a5","Type":"ContainerDied","Data":"a00259f9a634df7782c63e78e4add8b555e20a36adcc54a889e7bf30dc526538"} Dec 05 20:33:34 crc kubenswrapper[4904]: I1205 20:33:34.289497 4904 scope.go:117] "RemoveContainer" containerID="caa0206c24fe9ac1e7d232dcee952c27d17c80d21785ed9cb6434f05ae73b63c" Dec 05 20:33:34 crc kubenswrapper[4904]: I1205 20:33:34.290177 4904 scope.go:117] "RemoveContainer" 
containerID="a00259f9a634df7782c63e78e4add8b555e20a36adcc54a889e7bf30dc526538" Dec 05 20:33:34 crc kubenswrapper[4904]: E1205 20:33:34.290539 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(28e509d6-d15b-44e6-9afa-05a347c2a7a5)\"" pod="openstack/watcher-decision-engine-0" podUID="28e509d6-d15b-44e6-9afa-05a347c2a7a5" Dec 05 20:33:34 crc kubenswrapper[4904]: I1205 20:33:34.536303 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55fdc8d759-tm87z" Dec 05 20:33:34 crc kubenswrapper[4904]: I1205 20:33:34.610179 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7569b6c979-qznqm"] Dec 05 20:33:34 crc kubenswrapper[4904]: I1205 20:33:34.610463 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7569b6c979-qznqm" podUID="310cb505-37fe-420a-8012-3840d7ede2f0" containerName="dnsmasq-dns" containerID="cri-o://2659f9b6fae95519af45cb197d6d99eace636a1728e5add817cc8abfe59b6322" gracePeriod=10 Dec 05 20:33:34 crc kubenswrapper[4904]: I1205 20:33:34.694234 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 05 20:33:34 crc kubenswrapper[4904]: I1205 20:33:34.760460 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 20:33:35 crc kubenswrapper[4904]: I1205 20:33:35.304910 4904 generic.go:334] "Generic (PLEG): container finished" podID="310cb505-37fe-420a-8012-3840d7ede2f0" containerID="2659f9b6fae95519af45cb197d6d99eace636a1728e5add817cc8abfe59b6322" exitCode=0 Dec 05 20:33:35 crc kubenswrapper[4904]: I1205 20:33:35.305282 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="8976c9f0-073f-425d-a9b9-f66c96578f31" containerName="cinder-scheduler" containerID="cri-o://2970e40d837f7480736d855ef553136c29d13d721220fdbf2fb47edff1fa15b1" gracePeriod=30 Dec 05 20:33:35 crc kubenswrapper[4904]: I1205 20:33:35.305664 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7569b6c979-qznqm" event={"ID":"310cb505-37fe-420a-8012-3840d7ede2f0","Type":"ContainerDied","Data":"2659f9b6fae95519af45cb197d6d99eace636a1728e5add817cc8abfe59b6322"} Dec 05 20:33:35 crc kubenswrapper[4904]: I1205 20:33:35.306678 4904 scope.go:117] "RemoveContainer" containerID="a00259f9a634df7782c63e78e4add8b555e20a36adcc54a889e7bf30dc526538" Dec 05 20:33:35 crc kubenswrapper[4904]: E1205 20:33:35.307104 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(28e509d6-d15b-44e6-9afa-05a347c2a7a5)\"" pod="openstack/watcher-decision-engine-0" podUID="28e509d6-d15b-44e6-9afa-05a347c2a7a5" Dec 05 20:33:35 crc kubenswrapper[4904]: I1205 20:33:35.307653 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="8976c9f0-073f-425d-a9b9-f66c96578f31" containerName="probe" containerID="cri-o://6c9d379a0f5432797dc4f6682c381a6ac6e3d4d22ca7351ebdf6b7af42a519f7" gracePeriod=30 Dec 05 20:33:35 crc kubenswrapper[4904]: I1205 20:33:35.534676 4904 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/horizon-6867ddbddb-4lg6w" Dec 05 20:33:35 crc kubenswrapper[4904]: I1205 20:33:35.693702 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-d867d46cb-9mdx2" Dec 05 20:33:35 crc kubenswrapper[4904]: I1205 20:33:35.755192 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6867ddbddb-4lg6w"] Dec 05 20:33:36 crc kubenswrapper[4904]: I1205 20:33:36.314171 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6867ddbddb-4lg6w" podUID="499a09a1-c3aa-4fa2-95a8-0d2896a1d978" containerName="horizon-log" containerID="cri-o://0916c660d49c3a8297e43c89f40cbc411d8c61a91596fba1bdc6342cd2bb8323" gracePeriod=30 Dec 05 20:33:36 crc kubenswrapper[4904]: I1205 20:33:36.314753 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6867ddbddb-4lg6w" podUID="499a09a1-c3aa-4fa2-95a8-0d2896a1d978" containerName="horizon" containerID="cri-o://a9ac4d704d56fa9e65fdcd098de894d40f8d4c621808f3eac80c3507dbabdb36" gracePeriod=30 Dec 05 20:33:37 crc kubenswrapper[4904]: I1205 20:33:37.082085 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-k6l4d" Dec 05 20:33:37 crc kubenswrapper[4904]: I1205 20:33:37.210753 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vgsp\" (UniqueName: \"kubernetes.io/projected/b72d1aa8-3933-4153-89ac-a4ffe0667268-kube-api-access-6vgsp\") pod \"b72d1aa8-3933-4153-89ac-a4ffe0667268\" (UID: \"b72d1aa8-3933-4153-89ac-a4ffe0667268\") " Dec 05 20:33:37 crc kubenswrapper[4904]: I1205 20:33:37.210847 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b72d1aa8-3933-4153-89ac-a4ffe0667268-db-sync-config-data\") pod \"b72d1aa8-3933-4153-89ac-a4ffe0667268\" (UID: \"b72d1aa8-3933-4153-89ac-a4ffe0667268\") " Dec 05 20:33:37 crc kubenswrapper[4904]: I1205 20:33:37.210943 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b72d1aa8-3933-4153-89ac-a4ffe0667268-combined-ca-bundle\") pod \"b72d1aa8-3933-4153-89ac-a4ffe0667268\" (UID: \"b72d1aa8-3933-4153-89ac-a4ffe0667268\") " Dec 05 20:33:37 crc kubenswrapper[4904]: I1205 20:33:37.216675 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b72d1aa8-3933-4153-89ac-a4ffe0667268-kube-api-access-6vgsp" (OuterVolumeSpecName: "kube-api-access-6vgsp") pod "b72d1aa8-3933-4153-89ac-a4ffe0667268" (UID: "b72d1aa8-3933-4153-89ac-a4ffe0667268"). InnerVolumeSpecName "kube-api-access-6vgsp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:33:37 crc kubenswrapper[4904]: I1205 20:33:37.217246 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b72d1aa8-3933-4153-89ac-a4ffe0667268-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b72d1aa8-3933-4153-89ac-a4ffe0667268" (UID: "b72d1aa8-3933-4153-89ac-a4ffe0667268"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:37 crc kubenswrapper[4904]: I1205 20:33:37.255885 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b72d1aa8-3933-4153-89ac-a4ffe0667268-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b72d1aa8-3933-4153-89ac-a4ffe0667268" (UID: "b72d1aa8-3933-4153-89ac-a4ffe0667268"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:37 crc kubenswrapper[4904]: I1205 20:33:37.313493 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vgsp\" (UniqueName: \"kubernetes.io/projected/b72d1aa8-3933-4153-89ac-a4ffe0667268-kube-api-access-6vgsp\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:37 crc kubenswrapper[4904]: I1205 20:33:37.313534 4904 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b72d1aa8-3933-4153-89ac-a4ffe0667268-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:37 crc kubenswrapper[4904]: I1205 20:33:37.313543 4904 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b72d1aa8-3933-4153-89ac-a4ffe0667268-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:37 crc kubenswrapper[4904]: I1205 20:33:37.329259 4904 generic.go:334] "Generic (PLEG): container finished" podID="fa95b130-3627-4fa6-9bf8-1760d3b24843" containerID="cf07c1ff4f13d09142b75fdfc765208a644a4a1c894791c04780aa390503f481" exitCode=137 Dec 05 20:33:37 crc kubenswrapper[4904]: I1205 20:33:37.329338 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"fa95b130-3627-4fa6-9bf8-1760d3b24843","Type":"ContainerDied","Data":"cf07c1ff4f13d09142b75fdfc765208a644a4a1c894791c04780aa390503f481"} Dec 05 20:33:37 crc kubenswrapper[4904]: I1205 20:33:37.331889 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-k6l4d" Dec 05 20:33:37 crc kubenswrapper[4904]: I1205 20:33:37.331880 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-k6l4d" event={"ID":"b72d1aa8-3933-4153-89ac-a4ffe0667268","Type":"ContainerDied","Data":"0323224a151a438a0b3d8cb2253948ab6572fefe2052619b9271fedc646898a4"} Dec 05 20:33:37 crc kubenswrapper[4904]: I1205 20:33:37.332087 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0323224a151a438a0b3d8cb2253948ab6572fefe2052619b9271fedc646898a4" Dec 05 20:33:37 crc kubenswrapper[4904]: I1205 20:33:37.334235 4904 generic.go:334] "Generic (PLEG): container finished" podID="499a09a1-c3aa-4fa2-95a8-0d2896a1d978" containerID="a9ac4d704d56fa9e65fdcd098de894d40f8d4c621808f3eac80c3507dbabdb36" exitCode=0 Dec 05 20:33:37 crc kubenswrapper[4904]: I1205 20:33:37.334301 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6867ddbddb-4lg6w" event={"ID":"499a09a1-c3aa-4fa2-95a8-0d2896a1d978","Type":"ContainerDied","Data":"a9ac4d704d56fa9e65fdcd098de894d40f8d4c621808f3eac80c3507dbabdb36"} Dec 05 20:33:37 crc kubenswrapper[4904]: I1205 20:33:37.342518 4904 generic.go:334] "Generic (PLEG): container finished" podID="8976c9f0-073f-425d-a9b9-f66c96578f31" containerID="6c9d379a0f5432797dc4f6682c381a6ac6e3d4d22ca7351ebdf6b7af42a519f7" exitCode=0 Dec 05 20:33:37 crc kubenswrapper[4904]: I1205 20:33:37.342553 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8976c9f0-073f-425d-a9b9-f66c96578f31","Type":"ContainerDied","Data":"6c9d379a0f5432797dc4f6682c381a6ac6e3d4d22ca7351ebdf6b7af42a519f7"} Dec 05 20:33:37 crc kubenswrapper[4904]: I1205 20:33:37.556575 4904 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="fa95b130-3627-4fa6-9bf8-1760d3b24843" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.154:9322/\": dial tcp 10.217.0.154:9322: connect: connection refused" Dec 05 20:33:38 crc kubenswrapper[4904]: E1205 20:33:38.015483 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="81a8e33f-1179-4192-9d21-b0f520c41656" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.016601 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7569b6c979-qznqm" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.127671 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xthv5\" (UniqueName: \"kubernetes.io/projected/310cb505-37fe-420a-8012-3840d7ede2f0-kube-api-access-xthv5\") pod \"310cb505-37fe-420a-8012-3840d7ede2f0\" (UID: \"310cb505-37fe-420a-8012-3840d7ede2f0\") " Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.127721 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/310cb505-37fe-420a-8012-3840d7ede2f0-ovsdbserver-sb\") pod \"310cb505-37fe-420a-8012-3840d7ede2f0\" (UID: \"310cb505-37fe-420a-8012-3840d7ede2f0\") " Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.127827 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/310cb505-37fe-420a-8012-3840d7ede2f0-ovsdbserver-nb\") pod \"310cb505-37fe-420a-8012-3840d7ede2f0\" (UID: \"310cb505-37fe-420a-8012-3840d7ede2f0\") " Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.127853 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/310cb505-37fe-420a-8012-3840d7ede2f0-config\") pod \"310cb505-37fe-420a-8012-3840d7ede2f0\" (UID: \"310cb505-37fe-420a-8012-3840d7ede2f0\") " Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.127879 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/310cb505-37fe-420a-8012-3840d7ede2f0-dns-swift-storage-0\") pod \"310cb505-37fe-420a-8012-3840d7ede2f0\" (UID: \"310cb505-37fe-420a-8012-3840d7ede2f0\") " Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.128012 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/310cb505-37fe-420a-8012-3840d7ede2f0-dns-svc\") pod \"310cb505-37fe-420a-8012-3840d7ede2f0\" (UID: \"310cb505-37fe-420a-8012-3840d7ede2f0\") " Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.168215 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/310cb505-37fe-420a-8012-3840d7ede2f0-kube-api-access-xthv5" (OuterVolumeSpecName: "kube-api-access-xthv5") pod "310cb505-37fe-420a-8012-3840d7ede2f0" (UID: "310cb505-37fe-420a-8012-3840d7ede2f0"). InnerVolumeSpecName "kube-api-access-xthv5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.200679 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/310cb505-37fe-420a-8012-3840d7ede2f0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "310cb505-37fe-420a-8012-3840d7ede2f0" (UID: "310cb505-37fe-420a-8012-3840d7ede2f0"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.231370 4904 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/310cb505-37fe-420a-8012-3840d7ede2f0-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.231389 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xthv5\" (UniqueName: \"kubernetes.io/projected/310cb505-37fe-420a-8012-3840d7ede2f0-kube-api-access-xthv5\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.232842 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/310cb505-37fe-420a-8012-3840d7ede2f0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "310cb505-37fe-420a-8012-3840d7ede2f0" (UID: "310cb505-37fe-420a-8012-3840d7ede2f0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.255598 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/310cb505-37fe-420a-8012-3840d7ede2f0-config" (OuterVolumeSpecName: "config") pod "310cb505-37fe-420a-8012-3840d7ede2f0" (UID: "310cb505-37fe-420a-8012-3840d7ede2f0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.256673 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/310cb505-37fe-420a-8012-3840d7ede2f0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "310cb505-37fe-420a-8012-3840d7ede2f0" (UID: "310cb505-37fe-420a-8012-3840d7ede2f0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.311430 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/310cb505-37fe-420a-8012-3840d7ede2f0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "310cb505-37fe-420a-8012-3840d7ede2f0" (UID: "310cb505-37fe-420a-8012-3840d7ede2f0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.334869 4904 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/310cb505-37fe-420a-8012-3840d7ede2f0-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.334893 4904 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/310cb505-37fe-420a-8012-3840d7ede2f0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.334904 4904 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/310cb505-37fe-420a-8012-3840d7ede2f0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.334913 4904 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/310cb505-37fe-420a-8012-3840d7ede2f0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.344350 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.371357 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.376987 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6d8548b4f5-cmq7k"] Dec 05 20:33:38 crc kubenswrapper[4904]: E1205 20:33:38.377429 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="310cb505-37fe-420a-8012-3840d7ede2f0" containerName="dnsmasq-dns" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.377442 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="310cb505-37fe-420a-8012-3840d7ede2f0" containerName="dnsmasq-dns" Dec 05 20:33:38 crc kubenswrapper[4904]: E1205 20:33:38.377450 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="310cb505-37fe-420a-8012-3840d7ede2f0" containerName="init" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.377456 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="310cb505-37fe-420a-8012-3840d7ede2f0" containerName="init" Dec 05 20:33:38 crc kubenswrapper[4904]: E1205 20:33:38.377470 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8976c9f0-073f-425d-a9b9-f66c96578f31" containerName="probe" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.377477 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="8976c9f0-073f-425d-a9b9-f66c96578f31" containerName="probe" Dec 05 20:33:38 crc kubenswrapper[4904]: E1205 20:33:38.377488 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa95b130-3627-4fa6-9bf8-1760d3b24843" containerName="watcher-api-log" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.377495 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa95b130-3627-4fa6-9bf8-1760d3b24843" containerName="watcher-api-log" Dec 05 20:33:38 crc kubenswrapper[4904]: E1205 20:33:38.377510 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b72d1aa8-3933-4153-89ac-a4ffe0667268" containerName="barbican-db-sync" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.377516 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="b72d1aa8-3933-4153-89ac-a4ffe0667268" containerName="barbican-db-sync" Dec 05 20:33:38 crc kubenswrapper[4904]: E1205 20:33:38.377539 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8976c9f0-073f-425d-a9b9-f66c96578f31" containerName="cinder-scheduler" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.377544 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="8976c9f0-073f-425d-a9b9-f66c96578f31" containerName="cinder-scheduler" Dec 05 20:33:38 crc kubenswrapper[4904]: E1205 20:33:38.377560 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa95b130-3627-4fa6-9bf8-1760d3b24843" containerName="watcher-api" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.377566 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa95b130-3627-4fa6-9bf8-1760d3b24843" containerName="watcher-api" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.377741 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="8976c9f0-073f-425d-a9b9-f66c96578f31" containerName="cinder-scheduler" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.377752 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa95b130-3627-4fa6-9bf8-1760d3b24843" containerName="watcher-api" Dec 05 20:33:38 crc 
kubenswrapper[4904]: I1205 20:33:38.377759 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa95b130-3627-4fa6-9bf8-1760d3b24843" containerName="watcher-api-log" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.377773 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="310cb505-37fe-420a-8012-3840d7ede2f0" containerName="dnsmasq-dns" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.377781 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="b72d1aa8-3933-4153-89ac-a4ffe0667268" containerName="barbican-db-sync" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.377789 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="8976c9f0-073f-425d-a9b9-f66c96578f31" containerName="probe" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.378744 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6d8548b4f5-cmq7k" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.388820 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81a8e33f-1179-4192-9d21-b0f520c41656","Type":"ContainerStarted","Data":"dff968e734be0fd0f4ef4c83170804ebea5cc8e309c1b604bcd46a28277a53a3"} Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.388878 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6d8548b4f5-cmq7k"] Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.389089 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="81a8e33f-1179-4192-9d21-b0f520c41656" containerName="ceilometer-notification-agent" containerID="cri-o://5eb179fbfe30f07744b88d2fb3bdfd23b7f5a02ed5ac324ec2c9cfb9333fbaa8" gracePeriod=30 Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.389291 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.389350 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="81a8e33f-1179-4192-9d21-b0f520c41656" containerName="proxy-httpd" containerID="cri-o://dff968e734be0fd0f4ef4c83170804ebea5cc8e309c1b604bcd46a28277a53a3" gracePeriod=30 Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.389402 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="81a8e33f-1179-4192-9d21-b0f520c41656" containerName="sg-core" containerID="cri-o://c80d41f49bbce44078ae017a703ab69465e7e3cab4f393288fb4a4b5e4ae8490" gracePeriod=30 Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.389474 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.389702 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.389993 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-m695p" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.417130 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7569b6c979-qznqm" event={"ID":"310cb505-37fe-420a-8012-3840d7ede2f0","Type":"ContainerDied","Data":"07828654c6248622c37291eeaa395a4a36b155f1ff8dc4766b0cea1724a4bb83"} Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.417179 4904 scope.go:117] 
"RemoveContainer" containerID="2659f9b6fae95519af45cb197d6d99eace636a1728e5add817cc8abfe59b6322" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.417252 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7569b6c979-qznqm" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.426549 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.427002 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"fa95b130-3627-4fa6-9bf8-1760d3b24843","Type":"ContainerDied","Data":"852cad448e2f432f65fdb4baeb48cc80136eeea5ba57394a7e20ab4d6a9580fc"} Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.436270 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa95b130-3627-4fa6-9bf8-1760d3b24843-config-data\") pod \"fa95b130-3627-4fa6-9bf8-1760d3b24843\" (UID: \"fa95b130-3627-4fa6-9bf8-1760d3b24843\") " Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.436325 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/fa95b130-3627-4fa6-9bf8-1760d3b24843-custom-prometheus-ca\") pod \"fa95b130-3627-4fa6-9bf8-1760d3b24843\" (UID: \"fa95b130-3627-4fa6-9bf8-1760d3b24843\") " Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.436360 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa95b130-3627-4fa6-9bf8-1760d3b24843-logs\") pod \"fa95b130-3627-4fa6-9bf8-1760d3b24843\" (UID: \"fa95b130-3627-4fa6-9bf8-1760d3b24843\") " Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.436477 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa95b130-3627-4fa6-9bf8-1760d3b24843-combined-ca-bundle\") pod \"fa95b130-3627-4fa6-9bf8-1760d3b24843\" (UID: \"fa95b130-3627-4fa6-9bf8-1760d3b24843\") " Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.436555 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpj4d\" (UniqueName: \"kubernetes.io/projected/fa95b130-3627-4fa6-9bf8-1760d3b24843-kube-api-access-bpj4d\") pod \"fa95b130-3627-4fa6-9bf8-1760d3b24843\" (UID: \"fa95b130-3627-4fa6-9bf8-1760d3b24843\") " Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.437546 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa95b130-3627-4fa6-9bf8-1760d3b24843-logs" (OuterVolumeSpecName: "logs") pod "fa95b130-3627-4fa6-9bf8-1760d3b24843" (UID: "fa95b130-3627-4fa6-9bf8-1760d3b24843"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.439757 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-8fb8b568-c7bl2"] Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.442430 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-8fb8b568-c7bl2" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.458880 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.459462 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa95b130-3627-4fa6-9bf8-1760d3b24843-kube-api-access-bpj4d" (OuterVolumeSpecName: "kube-api-access-bpj4d") pod "fa95b130-3627-4fa6-9bf8-1760d3b24843" (UID: "fa95b130-3627-4fa6-9bf8-1760d3b24843"). InnerVolumeSpecName "kube-api-access-bpj4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.500355 4904 scope.go:117] "RemoveContainer" containerID="50febebccef027b0b1e35cf9c55da24f1d28e1776eb602871a84d06d6caaad6a" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.519922 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-8fb8b568-c7bl2"] Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.538782 4904 generic.go:334] "Generic (PLEG): container finished" podID="8976c9f0-073f-425d-a9b9-f66c96578f31" containerID="2970e40d837f7480736d855ef553136c29d13d721220fdbf2fb47edff1fa15b1" exitCode=0 Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.538894 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8976c9f0-073f-425d-a9b9-f66c96578f31","Type":"ContainerDied","Data":"2970e40d837f7480736d855ef553136c29d13d721220fdbf2fb47edff1fa15b1"} Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.538995 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8976c9f0-073f-425d-a9b9-f66c96578f31","Type":"ContainerDied","Data":"4e60ab67612428eab3ccfadc36e27b8ee4715093cbfcce697da1fdf1e6965768"} Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.539122 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.543910 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8976c9f0-073f-425d-a9b9-f66c96578f31-etc-machine-id\") pod \"8976c9f0-073f-425d-a9b9-f66c96578f31\" (UID: \"8976c9f0-073f-425d-a9b9-f66c96578f31\") " Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.544132 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2s4z5\" (UniqueName: \"kubernetes.io/projected/8976c9f0-073f-425d-a9b9-f66c96578f31-kube-api-access-2s4z5\") pod \"8976c9f0-073f-425d-a9b9-f66c96578f31\" (UID: \"8976c9f0-073f-425d-a9b9-f66c96578f31\") " Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.544287 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8976c9f0-073f-425d-a9b9-f66c96578f31-config-data\") pod \"8976c9f0-073f-425d-a9b9-f66c96578f31\" (UID: \"8976c9f0-073f-425d-a9b9-f66c96578f31\") " Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.544434 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8976c9f0-073f-425d-a9b9-f66c96578f31-config-data-custom\") pod \"8976c9f0-073f-425d-a9b9-f66c96578f31\" (UID: \"8976c9f0-073f-425d-a9b9-f66c96578f31\") " Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.544635 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8976c9f0-073f-425d-a9b9-f66c96578f31-combined-ca-bundle\") pod \"8976c9f0-073f-425d-a9b9-f66c96578f31\" (UID: \"8976c9f0-073f-425d-a9b9-f66c96578f31\") " Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.555313 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8976c9f0-073f-425d-a9b9-f66c96578f31-scripts\") pod \"8976c9f0-073f-425d-a9b9-f66c96578f31\" (UID: \"8976c9f0-073f-425d-a9b9-f66c96578f31\") " Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.557249 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7af9740e-f05b-4d6d-9075-b7038018de84-logs\") pod \"barbican-worker-6d8548b4f5-cmq7k\" (UID: \"7af9740e-f05b-4d6d-9075-b7038018de84\") " pod="openstack/barbican-worker-6d8548b4f5-cmq7k" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.567322 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpfnx\" (UniqueName: \"kubernetes.io/projected/83ab1170-b41e-4a13-b227-8b67b86587cc-kube-api-access-wpfnx\") pod \"barbican-keystone-listener-8fb8b568-c7bl2\" (UID: \"83ab1170-b41e-4a13-b227-8b67b86587cc\") " pod="openstack/barbican-keystone-listener-8fb8b568-c7bl2" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.567572 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83ab1170-b41e-4a13-b227-8b67b86587cc-config-data\") pod \"barbican-keystone-listener-8fb8b568-c7bl2\" (UID: \"83ab1170-b41e-4a13-b227-8b67b86587cc\") " pod="openstack/barbican-keystone-listener-8fb8b568-c7bl2" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.543975 4904 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8976c9f0-073f-425d-a9b9-f66c96578f31-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8976c9f0-073f-425d-a9b9-f66c96578f31" (UID: "8976c9f0-073f-425d-a9b9-f66c96578f31"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.567921 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7af9740e-f05b-4d6d-9075-b7038018de84-combined-ca-bundle\") pod \"barbican-worker-6d8548b4f5-cmq7k\" (UID: \"7af9740e-f05b-4d6d-9075-b7038018de84\") " pod="openstack/barbican-worker-6d8548b4f5-cmq7k" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.581579 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83ab1170-b41e-4a13-b227-8b67b86587cc-combined-ca-bundle\") pod \"barbican-keystone-listener-8fb8b568-c7bl2\" (UID: \"83ab1170-b41e-4a13-b227-8b67b86587cc\") " pod="openstack/barbican-keystone-listener-8fb8b568-c7bl2" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.581706 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/83ab1170-b41e-4a13-b227-8b67b86587cc-config-data-custom\") pod \"barbican-keystone-listener-8fb8b568-c7bl2\" (UID: \"83ab1170-b41e-4a13-b227-8b67b86587cc\") " pod="openstack/barbican-keystone-listener-8fb8b568-c7bl2" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.581847 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7af9740e-f05b-4d6d-9075-b7038018de84-config-data\") pod \"barbican-worker-6d8548b4f5-cmq7k\" (UID: \"7af9740e-f05b-4d6d-9075-b7038018de84\") " pod="openstack/barbican-worker-6d8548b4f5-cmq7k" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.581940 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83ab1170-b41e-4a13-b227-8b67b86587cc-logs\") pod \"barbican-keystone-listener-8fb8b568-c7bl2\" (UID: \"83ab1170-b41e-4a13-b227-8b67b86587cc\") " pod="openstack/barbican-keystone-listener-8fb8b568-c7bl2" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.582093 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvddf\" (UniqueName: \"kubernetes.io/projected/7af9740e-f05b-4d6d-9075-b7038018de84-kube-api-access-rvddf\") pod \"barbican-worker-6d8548b4f5-cmq7k\" (UID: \"7af9740e-f05b-4d6d-9075-b7038018de84\") " pod="openstack/barbican-worker-6d8548b4f5-cmq7k" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.582208 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7af9740e-f05b-4d6d-9075-b7038018de84-config-data-custom\") pod \"barbican-worker-6d8548b4f5-cmq7k\" (UID: \"7af9740e-f05b-4d6d-9075-b7038018de84\") " pod="openstack/barbican-worker-6d8548b4f5-cmq7k" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.582344 4904 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa95b130-3627-4fa6-9bf8-1760d3b24843-logs\") on node \"crc\" 
DevicePath \"\"" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.582414 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpj4d\" (UniqueName: \"kubernetes.io/projected/fa95b130-3627-4fa6-9bf8-1760d3b24843-kube-api-access-bpj4d\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.582479 4904 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8976c9f0-073f-425d-a9b9-f66c96578f31-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.593183 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.596021 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa95b130-3627-4fa6-9bf8-1760d3b24843-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "fa95b130-3627-4fa6-9bf8-1760d3b24843" (UID: "fa95b130-3627-4fa6-9bf8-1760d3b24843"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.596197 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8976c9f0-073f-425d-a9b9-f66c96578f31-kube-api-access-2s4z5" (OuterVolumeSpecName: "kube-api-access-2s4z5") pod "8976c9f0-073f-425d-a9b9-f66c96578f31" (UID: "8976c9f0-073f-425d-a9b9-f66c96578f31"). InnerVolumeSpecName "kube-api-access-2s4z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.597236 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8976c9f0-073f-425d-a9b9-f66c96578f31-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8976c9f0-073f-425d-a9b9-f66c96578f31" (UID: "8976c9f0-073f-425d-a9b9-f66c96578f31"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.597812 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5c784f7484-wz74b" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.603032 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8976c9f0-073f-425d-a9b9-f66c96578f31-scripts" (OuterVolumeSpecName: "scripts") pod "8976c9f0-073f-425d-a9b9-f66c96578f31" (UID: "8976c9f0-073f-425d-a9b9-f66c96578f31"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.618287 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa95b130-3627-4fa6-9bf8-1760d3b24843-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fa95b130-3627-4fa6-9bf8-1760d3b24843" (UID: "fa95b130-3627-4fa6-9bf8-1760d3b24843"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.624190 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa95b130-3627-4fa6-9bf8-1760d3b24843-config-data" (OuterVolumeSpecName: "config-data") pod "fa95b130-3627-4fa6-9bf8-1760d3b24843" (UID: "fa95b130-3627-4fa6-9bf8-1760d3b24843"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.625149 4904 scope.go:117] "RemoveContainer" containerID="cf07c1ff4f13d09142b75fdfc765208a644a4a1c894791c04780aa390503f481" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.632193 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-658fd98679-9cr7m"] Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.639139 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-658fd98679-9cr7m" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.674262 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-658fd98679-9cr7m"] Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.691518 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7af9740e-f05b-4d6d-9075-b7038018de84-config-data-custom\") pod \"barbican-worker-6d8548b4f5-cmq7k\" (UID: \"7af9740e-f05b-4d6d-9075-b7038018de84\") " pod="openstack/barbican-worker-6d8548b4f5-cmq7k" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.692725 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fab0d2a9-0213-464a-b0a9-3e7338d7f40d-ovsdbserver-nb\") pod \"dnsmasq-dns-658fd98679-9cr7m\" (UID: \"fab0d2a9-0213-464a-b0a9-3e7338d7f40d\") " pod="openstack/dnsmasq-dns-658fd98679-9cr7m" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.692986 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7af9740e-f05b-4d6d-9075-b7038018de84-logs\") pod \"barbican-worker-6d8548b4f5-cmq7k\" (UID: \"7af9740e-f05b-4d6d-9075-b7038018de84\") " pod="openstack/barbican-worker-6d8548b4f5-cmq7k" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.693132 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fab0d2a9-0213-464a-b0a9-3e7338d7f40d-dns-svc\") pod \"dnsmasq-dns-658fd98679-9cr7m\" (UID: \"fab0d2a9-0213-464a-b0a9-3e7338d7f40d\") " pod="openstack/dnsmasq-dns-658fd98679-9cr7m" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.695615 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7569b6c979-qznqm"] Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.701903 4904 scope.go:117] "RemoveContainer" containerID="30e14b92445197a4e38e382b251c4ecbc6bcefaca264e118c6c56ba34e8733a5" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.706555 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpfnx\" (UniqueName: \"kubernetes.io/projected/83ab1170-b41e-4a13-b227-8b67b86587cc-kube-api-access-wpfnx\") pod \"barbican-keystone-listener-8fb8b568-c7bl2\" (UID: \"83ab1170-b41e-4a13-b227-8b67b86587cc\") " pod="openstack/barbican-keystone-listener-8fb8b568-c7bl2" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.706633 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fab0d2a9-0213-464a-b0a9-3e7338d7f40d-ovsdbserver-sb\") pod \"dnsmasq-dns-658fd98679-9cr7m\" (UID: \"fab0d2a9-0213-464a-b0a9-3e7338d7f40d\") " pod="openstack/dnsmasq-dns-658fd98679-9cr7m" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 
20:33:38.706692 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83ab1170-b41e-4a13-b227-8b67b86587cc-config-data\") pod \"barbican-keystone-listener-8fb8b568-c7bl2\" (UID: \"83ab1170-b41e-4a13-b227-8b67b86587cc\") " pod="openstack/barbican-keystone-listener-8fb8b568-c7bl2" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.708592 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fab0d2a9-0213-464a-b0a9-3e7338d7f40d-dns-swift-storage-0\") pod \"dnsmasq-dns-658fd98679-9cr7m\" (UID: \"fab0d2a9-0213-464a-b0a9-3e7338d7f40d\") " pod="openstack/dnsmasq-dns-658fd98679-9cr7m" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.710045 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7af9740e-f05b-4d6d-9075-b7038018de84-combined-ca-bundle\") pod \"barbican-worker-6d8548b4f5-cmq7k\" (UID: \"7af9740e-f05b-4d6d-9075-b7038018de84\") " pod="openstack/barbican-worker-6d8548b4f5-cmq7k" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.710168 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83ab1170-b41e-4a13-b227-8b67b86587cc-combined-ca-bundle\") pod \"barbican-keystone-listener-8fb8b568-c7bl2\" (UID: \"83ab1170-b41e-4a13-b227-8b67b86587cc\") " pod="openstack/barbican-keystone-listener-8fb8b568-c7bl2" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.710192 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/83ab1170-b41e-4a13-b227-8b67b86587cc-config-data-custom\") pod \"barbican-keystone-listener-8fb8b568-c7bl2\" (UID: \"83ab1170-b41e-4a13-b227-8b67b86587cc\") " pod="openstack/barbican-keystone-listener-8fb8b568-c7bl2" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.710252 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7af9740e-f05b-4d6d-9075-b7038018de84-config-data\") pod \"barbican-worker-6d8548b4f5-cmq7k\" (UID: \"7af9740e-f05b-4d6d-9075-b7038018de84\") " pod="openstack/barbican-worker-6d8548b4f5-cmq7k" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.710286 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83ab1170-b41e-4a13-b227-8b67b86587cc-logs\") pod \"barbican-keystone-listener-8fb8b568-c7bl2\" (UID: \"83ab1170-b41e-4a13-b227-8b67b86587cc\") " pod="openstack/barbican-keystone-listener-8fb8b568-c7bl2" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.710357 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wld77\" (UniqueName: \"kubernetes.io/projected/fab0d2a9-0213-464a-b0a9-3e7338d7f40d-kube-api-access-wld77\") pod \"dnsmasq-dns-658fd98679-9cr7m\" (UID: \"fab0d2a9-0213-464a-b0a9-3e7338d7f40d\") " pod="openstack/dnsmasq-dns-658fd98679-9cr7m" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.711210 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fab0d2a9-0213-464a-b0a9-3e7338d7f40d-config\") pod \"dnsmasq-dns-658fd98679-9cr7m\" (UID: 
\"fab0d2a9-0213-464a-b0a9-3e7338d7f40d\") " pod="openstack/dnsmasq-dns-658fd98679-9cr7m" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.711245 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvddf\" (UniqueName: \"kubernetes.io/projected/7af9740e-f05b-4d6d-9075-b7038018de84-kube-api-access-rvddf\") pod \"barbican-worker-6d8548b4f5-cmq7k\" (UID: \"7af9740e-f05b-4d6d-9075-b7038018de84\") " pod="openstack/barbican-worker-6d8548b4f5-cmq7k" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.712558 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2s4z5\" (UniqueName: \"kubernetes.io/projected/8976c9f0-073f-425d-a9b9-f66c96578f31-kube-api-access-2s4z5\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.713266 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7af9740e-f05b-4d6d-9075-b7038018de84-logs\") pod \"barbican-worker-6d8548b4f5-cmq7k\" (UID: \"7af9740e-f05b-4d6d-9075-b7038018de84\") " pod="openstack/barbican-worker-6d8548b4f5-cmq7k" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.715932 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa95b130-3627-4fa6-9bf8-1760d3b24843-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.715983 4904 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8976c9f0-073f-425d-a9b9-f66c96578f31-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.715984 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83ab1170-b41e-4a13-b227-8b67b86587cc-logs\") pod \"barbican-keystone-listener-8fb8b568-c7bl2\" (UID: \"83ab1170-b41e-4a13-b227-8b67b86587cc\") " pod="openstack/barbican-keystone-listener-8fb8b568-c7bl2" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.715994 4904 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/fa95b130-3627-4fa6-9bf8-1760d3b24843-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.716036 4904 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8976c9f0-073f-425d-a9b9-f66c96578f31-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.716047 4904 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa95b130-3627-4fa6-9bf8-1760d3b24843-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.717492 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7af9740e-f05b-4d6d-9075-b7038018de84-combined-ca-bundle\") pod \"barbican-worker-6d8548b4f5-cmq7k\" (UID: \"7af9740e-f05b-4d6d-9075-b7038018de84\") " pod="openstack/barbican-worker-6d8548b4f5-cmq7k" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.723974 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7569b6c979-qznqm"] Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.724447 4904 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/83ab1170-b41e-4a13-b227-8b67b86587cc-config-data-custom\") pod \"barbican-keystone-listener-8fb8b568-c7bl2\" (UID: \"83ab1170-b41e-4a13-b227-8b67b86587cc\") " pod="openstack/barbican-keystone-listener-8fb8b568-c7bl2" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.726285 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7af9740e-f05b-4d6d-9075-b7038018de84-config-data-custom\") pod \"barbican-worker-6d8548b4f5-cmq7k\" (UID: \"7af9740e-f05b-4d6d-9075-b7038018de84\") " pod="openstack/barbican-worker-6d8548b4f5-cmq7k" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.733051 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83ab1170-b41e-4a13-b227-8b67b86587cc-combined-ca-bundle\") pod \"barbican-keystone-listener-8fb8b568-c7bl2\" (UID: \"83ab1170-b41e-4a13-b227-8b67b86587cc\") " pod="openstack/barbican-keystone-listener-8fb8b568-c7bl2" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.733587 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83ab1170-b41e-4a13-b227-8b67b86587cc-config-data\") pod \"barbican-keystone-listener-8fb8b568-c7bl2\" (UID: \"83ab1170-b41e-4a13-b227-8b67b86587cc\") " pod="openstack/barbican-keystone-listener-8fb8b568-c7bl2" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.740195 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpfnx\" (UniqueName: \"kubernetes.io/projected/83ab1170-b41e-4a13-b227-8b67b86587cc-kube-api-access-wpfnx\") pod \"barbican-keystone-listener-8fb8b568-c7bl2\" (UID: \"83ab1170-b41e-4a13-b227-8b67b86587cc\") " pod="openstack/barbican-keystone-listener-8fb8b568-c7bl2" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.743309 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvddf\" (UniqueName: \"kubernetes.io/projected/7af9740e-f05b-4d6d-9075-b7038018de84-kube-api-access-rvddf\") pod \"barbican-worker-6d8548b4f5-cmq7k\" (UID: \"7af9740e-f05b-4d6d-9075-b7038018de84\") " pod="openstack/barbican-worker-6d8548b4f5-cmq7k" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.746104 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7af9740e-f05b-4d6d-9075-b7038018de84-config-data\") pod \"barbican-worker-6d8548b4f5-cmq7k\" (UID: \"7af9740e-f05b-4d6d-9075-b7038018de84\") " pod="openstack/barbican-worker-6d8548b4f5-cmq7k" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.748307 4904 scope.go:117] "RemoveContainer" containerID="6c9d379a0f5432797dc4f6682c381a6ac6e3d4d22ca7351ebdf6b7af42a519f7" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.752812 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8976c9f0-073f-425d-a9b9-f66c96578f31-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8976c9f0-073f-425d-a9b9-f66c96578f31" (UID: "8976c9f0-073f-425d-a9b9-f66c96578f31"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.777811 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-597c5664fd-tbwsr"] Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.813889 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-8fb8b568-c7bl2" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.827153 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wld77\" (UniqueName: \"kubernetes.io/projected/fab0d2a9-0213-464a-b0a9-3e7338d7f40d-kube-api-access-wld77\") pod \"dnsmasq-dns-658fd98679-9cr7m\" (UID: \"fab0d2a9-0213-464a-b0a9-3e7338d7f40d\") " pod="openstack/dnsmasq-dns-658fd98679-9cr7m" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.827332 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fab0d2a9-0213-464a-b0a9-3e7338d7f40d-config\") pod \"dnsmasq-dns-658fd98679-9cr7m\" (UID: \"fab0d2a9-0213-464a-b0a9-3e7338d7f40d\") " pod="openstack/dnsmasq-dns-658fd98679-9cr7m" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.827530 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fab0d2a9-0213-464a-b0a9-3e7338d7f40d-ovsdbserver-nb\") pod \"dnsmasq-dns-658fd98679-9cr7m\" (UID: \"fab0d2a9-0213-464a-b0a9-3e7338d7f40d\") " pod="openstack/dnsmasq-dns-658fd98679-9cr7m" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.827697 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fab0d2a9-0213-464a-b0a9-3e7338d7f40d-dns-svc\") pod \"dnsmasq-dns-658fd98679-9cr7m\" (UID: \"fab0d2a9-0213-464a-b0a9-3e7338d7f40d\") " pod="openstack/dnsmasq-dns-658fd98679-9cr7m" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.834369 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fab0d2a9-0213-464a-b0a9-3e7338d7f40d-ovsdbserver-sb\") pod \"dnsmasq-dns-658fd98679-9cr7m\" (UID: \"fab0d2a9-0213-464a-b0a9-3e7338d7f40d\") " pod="openstack/dnsmasq-dns-658fd98679-9cr7m" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.834434 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fab0d2a9-0213-464a-b0a9-3e7338d7f40d-dns-swift-storage-0\") pod \"dnsmasq-dns-658fd98679-9cr7m\" (UID: \"fab0d2a9-0213-464a-b0a9-3e7338d7f40d\") " pod="openstack/dnsmasq-dns-658fd98679-9cr7m" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.834790 4904 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8976c9f0-073f-425d-a9b9-f66c96578f31-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.833676 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fab0d2a9-0213-464a-b0a9-3e7338d7f40d-config\") pod \"dnsmasq-dns-658fd98679-9cr7m\" (UID: \"fab0d2a9-0213-464a-b0a9-3e7338d7f40d\") " pod="openstack/dnsmasq-dns-658fd98679-9cr7m" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.836400 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/fab0d2a9-0213-464a-b0a9-3e7338d7f40d-ovsdbserver-sb\") pod \"dnsmasq-dns-658fd98679-9cr7m\" (UID: \"fab0d2a9-0213-464a-b0a9-3e7338d7f40d\") " pod="openstack/dnsmasq-dns-658fd98679-9cr7m" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.834034 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fab0d2a9-0213-464a-b0a9-3e7338d7f40d-dns-svc\") pod \"dnsmasq-dns-658fd98679-9cr7m\" (UID: \"fab0d2a9-0213-464a-b0a9-3e7338d7f40d\") " pod="openstack/dnsmasq-dns-658fd98679-9cr7m" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.836967 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fab0d2a9-0213-464a-b0a9-3e7338d7f40d-dns-swift-storage-0\") pod \"dnsmasq-dns-658fd98679-9cr7m\" (UID: \"fab0d2a9-0213-464a-b0a9-3e7338d7f40d\") " pod="openstack/dnsmasq-dns-658fd98679-9cr7m" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.840233 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-597c5664fd-tbwsr" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.840695 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fab0d2a9-0213-464a-b0a9-3e7338d7f40d-ovsdbserver-nb\") pod \"dnsmasq-dns-658fd98679-9cr7m\" (UID: \"fab0d2a9-0213-464a-b0a9-3e7338d7f40d\") " pod="openstack/dnsmasq-dns-658fd98679-9cr7m" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.846409 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-597c5664fd-tbwsr"] Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.846833 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wld77\" (UniqueName: \"kubernetes.io/projected/fab0d2a9-0213-464a-b0a9-3e7338d7f40d-kube-api-access-wld77\") pod \"dnsmasq-dns-658fd98679-9cr7m\" (UID: \"fab0d2a9-0213-464a-b0a9-3e7338d7f40d\") " pod="openstack/dnsmasq-dns-658fd98679-9cr7m" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.847845 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.878425 4904 scope.go:117] "RemoveContainer" containerID="2970e40d837f7480736d855ef553136c29d13d721220fdbf2fb47edff1fa15b1" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.901327 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.917724 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.937609 4904 scope.go:117] "RemoveContainer" containerID="6c9d379a0f5432797dc4f6682c381a6ac6e3d4d22ca7351ebdf6b7af42a519f7" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.938011 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fa79fda-f8c4-4866-8a82-5e2bc4cdb317-combined-ca-bundle\") pod \"barbican-api-597c5664fd-tbwsr\" (UID: \"7fa79fda-f8c4-4866-8a82-5e2bc4cdb317\") " pod="openstack/barbican-api-597c5664fd-tbwsr" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.938086 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfvqt\" (UniqueName: 
\"kubernetes.io/projected/7fa79fda-f8c4-4866-8a82-5e2bc4cdb317-kube-api-access-dfvqt\") pod \"barbican-api-597c5664fd-tbwsr\" (UID: \"7fa79fda-f8c4-4866-8a82-5e2bc4cdb317\") " pod="openstack/barbican-api-597c5664fd-tbwsr" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.938160 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fa79fda-f8c4-4866-8a82-5e2bc4cdb317-logs\") pod \"barbican-api-597c5664fd-tbwsr\" (UID: \"7fa79fda-f8c4-4866-8a82-5e2bc4cdb317\") " pod="openstack/barbican-api-597c5664fd-tbwsr" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.938220 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fa79fda-f8c4-4866-8a82-5e2bc4cdb317-config-data\") pod \"barbican-api-597c5664fd-tbwsr\" (UID: \"7fa79fda-f8c4-4866-8a82-5e2bc4cdb317\") " pod="openstack/barbican-api-597c5664fd-tbwsr" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.938245 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7fa79fda-f8c4-4866-8a82-5e2bc4cdb317-config-data-custom\") pod \"barbican-api-597c5664fd-tbwsr\" (UID: \"7fa79fda-f8c4-4866-8a82-5e2bc4cdb317\") " pod="openstack/barbican-api-597c5664fd-tbwsr" Dec 05 20:33:38 crc kubenswrapper[4904]: E1205 20:33:38.938340 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c9d379a0f5432797dc4f6682c381a6ac6e3d4d22ca7351ebdf6b7af42a519f7\": container with ID starting with 6c9d379a0f5432797dc4f6682c381a6ac6e3d4d22ca7351ebdf6b7af42a519f7 not found: ID does not exist" containerID="6c9d379a0f5432797dc4f6682c381a6ac6e3d4d22ca7351ebdf6b7af42a519f7" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.938421 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c9d379a0f5432797dc4f6682c381a6ac6e3d4d22ca7351ebdf6b7af42a519f7"} err="failed to get container status \"6c9d379a0f5432797dc4f6682c381a6ac6e3d4d22ca7351ebdf6b7af42a519f7\": rpc error: code = NotFound desc = could not find container \"6c9d379a0f5432797dc4f6682c381a6ac6e3d4d22ca7351ebdf6b7af42a519f7\": container with ID starting with 6c9d379a0f5432797dc4f6682c381a6ac6e3d4d22ca7351ebdf6b7af42a519f7 not found: ID does not exist" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.938494 4904 scope.go:117] "RemoveContainer" containerID="2970e40d837f7480736d855ef553136c29d13d721220fdbf2fb47edff1fa15b1" Dec 05 20:33:38 crc kubenswrapper[4904]: E1205 20:33:38.945807 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2970e40d837f7480736d855ef553136c29d13d721220fdbf2fb47edff1fa15b1\": container with ID starting with 2970e40d837f7480736d855ef553136c29d13d721220fdbf2fb47edff1fa15b1 not found: ID does not exist" containerID="2970e40d837f7480736d855ef553136c29d13d721220fdbf2fb47edff1fa15b1" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.945855 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2970e40d837f7480736d855ef553136c29d13d721220fdbf2fb47edff1fa15b1"} err="failed to get container status \"2970e40d837f7480736d855ef553136c29d13d721220fdbf2fb47edff1fa15b1\": rpc error: code = NotFound desc = could not find container 
\"2970e40d837f7480736d855ef553136c29d13d721220fdbf2fb47edff1fa15b1\": container with ID starting with 2970e40d837f7480736d855ef553136c29d13d721220fdbf2fb47edff1fa15b1 not found: ID does not exist" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.945903 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.948309 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.954388 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.954868 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8976c9f0-073f-425d-a9b9-f66c96578f31-config-data" (OuterVolumeSpecName: "config-data") pod "8976c9f0-073f-425d-a9b9-f66c96578f31" (UID: "8976c9f0-073f-425d-a9b9-f66c96578f31"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:38 crc kubenswrapper[4904]: I1205 20:33:38.991032 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.017032 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-658fd98679-9cr7m" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.017185 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6d8548b4f5-cmq7k" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.044306 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fa79fda-f8c4-4866-8a82-5e2bc4cdb317-combined-ca-bundle\") pod \"barbican-api-597c5664fd-tbwsr\" (UID: \"7fa79fda-f8c4-4866-8a82-5e2bc4cdb317\") " pod="openstack/barbican-api-597c5664fd-tbwsr" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.044379 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfvqt\" (UniqueName: \"kubernetes.io/projected/7fa79fda-f8c4-4866-8a82-5e2bc4cdb317-kube-api-access-dfvqt\") pod \"barbican-api-597c5664fd-tbwsr\" (UID: \"7fa79fda-f8c4-4866-8a82-5e2bc4cdb317\") " pod="openstack/barbican-api-597c5664fd-tbwsr" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.044454 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmdb6\" (UniqueName: \"kubernetes.io/projected/3420fce6-9733-45e2-8167-a91c39e372af-kube-api-access-kmdb6\") pod \"watcher-api-0\" (UID: \"3420fce6-9733-45e2-8167-a91c39e372af\") " pod="openstack/watcher-api-0" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.044495 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3420fce6-9733-45e2-8167-a91c39e372af-logs\") pod \"watcher-api-0\" (UID: \"3420fce6-9733-45e2-8167-a91c39e372af\") " pod="openstack/watcher-api-0" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.044583 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3420fce6-9733-45e2-8167-a91c39e372af-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"3420fce6-9733-45e2-8167-a91c39e372af\") " 
pod="openstack/watcher-api-0" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.044610 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3420fce6-9733-45e2-8167-a91c39e372af-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"3420fce6-9733-45e2-8167-a91c39e372af\") " pod="openstack/watcher-api-0" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.044647 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fa79fda-f8c4-4866-8a82-5e2bc4cdb317-logs\") pod \"barbican-api-597c5664fd-tbwsr\" (UID: \"7fa79fda-f8c4-4866-8a82-5e2bc4cdb317\") " pod="openstack/barbican-api-597c5664fd-tbwsr" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.044713 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fa79fda-f8c4-4866-8a82-5e2bc4cdb317-config-data\") pod \"barbican-api-597c5664fd-tbwsr\" (UID: \"7fa79fda-f8c4-4866-8a82-5e2bc4cdb317\") " pod="openstack/barbican-api-597c5664fd-tbwsr" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.044753 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7fa79fda-f8c4-4866-8a82-5e2bc4cdb317-config-data-custom\") pod \"barbican-api-597c5664fd-tbwsr\" (UID: \"7fa79fda-f8c4-4866-8a82-5e2bc4cdb317\") " pod="openstack/barbican-api-597c5664fd-tbwsr" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.044800 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3420fce6-9733-45e2-8167-a91c39e372af-config-data\") pod \"watcher-api-0\" (UID: \"3420fce6-9733-45e2-8167-a91c39e372af\") " pod="openstack/watcher-api-0" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.044856 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8976c9f0-073f-425d-a9b9-f66c96578f31-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.046874 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fa79fda-f8c4-4866-8a82-5e2bc4cdb317-logs\") pod \"barbican-api-597c5664fd-tbwsr\" (UID: \"7fa79fda-f8c4-4866-8a82-5e2bc4cdb317\") " pod="openstack/barbican-api-597c5664fd-tbwsr" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.069871 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fa79fda-f8c4-4866-8a82-5e2bc4cdb317-combined-ca-bundle\") pod \"barbican-api-597c5664fd-tbwsr\" (UID: \"7fa79fda-f8c4-4866-8a82-5e2bc4cdb317\") " pod="openstack/barbican-api-597c5664fd-tbwsr" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.070110 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfvqt\" (UniqueName: \"kubernetes.io/projected/7fa79fda-f8c4-4866-8a82-5e2bc4cdb317-kube-api-access-dfvqt\") pod \"barbican-api-597c5664fd-tbwsr\" (UID: \"7fa79fda-f8c4-4866-8a82-5e2bc4cdb317\") " pod="openstack/barbican-api-597c5664fd-tbwsr" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.075026 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7fa79fda-f8c4-4866-8a82-5e2bc4cdb317-config-data\") pod \"barbican-api-597c5664fd-tbwsr\" (UID: \"7fa79fda-f8c4-4866-8a82-5e2bc4cdb317\") " pod="openstack/barbican-api-597c5664fd-tbwsr" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.089898 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7fa79fda-f8c4-4866-8a82-5e2bc4cdb317-config-data-custom\") pod \"barbican-api-597c5664fd-tbwsr\" (UID: \"7fa79fda-f8c4-4866-8a82-5e2bc4cdb317\") " pod="openstack/barbican-api-597c5664fd-tbwsr" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.148446 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmdb6\" (UniqueName: \"kubernetes.io/projected/3420fce6-9733-45e2-8167-a91c39e372af-kube-api-access-kmdb6\") pod \"watcher-api-0\" (UID: \"3420fce6-9733-45e2-8167-a91c39e372af\") " pod="openstack/watcher-api-0" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.148531 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3420fce6-9733-45e2-8167-a91c39e372af-logs\") pod \"watcher-api-0\" (UID: \"3420fce6-9733-45e2-8167-a91c39e372af\") " pod="openstack/watcher-api-0" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.148584 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3420fce6-9733-45e2-8167-a91c39e372af-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"3420fce6-9733-45e2-8167-a91c39e372af\") " pod="openstack/watcher-api-0" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.148608 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3420fce6-9733-45e2-8167-a91c39e372af-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"3420fce6-9733-45e2-8167-a91c39e372af\") " pod="openstack/watcher-api-0" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.148725 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3420fce6-9733-45e2-8167-a91c39e372af-config-data\") pod \"watcher-api-0\" (UID: \"3420fce6-9733-45e2-8167-a91c39e372af\") " pod="openstack/watcher-api-0" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.149451 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3420fce6-9733-45e2-8167-a91c39e372af-logs\") pod \"watcher-api-0\" (UID: \"3420fce6-9733-45e2-8167-a91c39e372af\") " pod="openstack/watcher-api-0" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.154699 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3420fce6-9733-45e2-8167-a91c39e372af-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"3420fce6-9733-45e2-8167-a91c39e372af\") " pod="openstack/watcher-api-0" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.154837 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3420fce6-9733-45e2-8167-a91c39e372af-config-data\") pod \"watcher-api-0\" (UID: \"3420fce6-9733-45e2-8167-a91c39e372af\") " pod="openstack/watcher-api-0" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.160049 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3420fce6-9733-45e2-8167-a91c39e372af-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"3420fce6-9733-45e2-8167-a91c39e372af\") " pod="openstack/watcher-api-0" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.184792 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmdb6\" (UniqueName: \"kubernetes.io/projected/3420fce6-9733-45e2-8167-a91c39e372af-kube-api-access-kmdb6\") pod \"watcher-api-0\" (UID: \"3420fce6-9733-45e2-8167-a91c39e372af\") " pod="openstack/watcher-api-0" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.202507 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-597c5664fd-tbwsr" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.288109 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.293519 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.299988 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.320911 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.322752 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.327313 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.335847 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.353679 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmwwv\" (UniqueName: \"kubernetes.io/projected/60174921-7963-4d63-83e7-8b702d3e9dd2-kube-api-access-qmwwv\") pod \"cinder-scheduler-0\" (UID: \"60174921-7963-4d63-83e7-8b702d3e9dd2\") " pod="openstack/cinder-scheduler-0" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.353723 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60174921-7963-4d63-83e7-8b702d3e9dd2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"60174921-7963-4d63-83e7-8b702d3e9dd2\") " pod="openstack/cinder-scheduler-0" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.353757 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60174921-7963-4d63-83e7-8b702d3e9dd2-config-data\") pod \"cinder-scheduler-0\" (UID: \"60174921-7963-4d63-83e7-8b702d3e9dd2\") " pod="openstack/cinder-scheduler-0" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.353783 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60174921-7963-4d63-83e7-8b702d3e9dd2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"60174921-7963-4d63-83e7-8b702d3e9dd2\") " pod="openstack/cinder-scheduler-0" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.353813 4904 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60174921-7963-4d63-83e7-8b702d3e9dd2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"60174921-7963-4d63-83e7-8b702d3e9dd2\") " pod="openstack/cinder-scheduler-0" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.353874 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60174921-7963-4d63-83e7-8b702d3e9dd2-scripts\") pod \"cinder-scheduler-0\" (UID: \"60174921-7963-4d63-83e7-8b702d3e9dd2\") " pod="openstack/cinder-scheduler-0" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.455899 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmwwv\" (UniqueName: \"kubernetes.io/projected/60174921-7963-4d63-83e7-8b702d3e9dd2-kube-api-access-qmwwv\") pod \"cinder-scheduler-0\" (UID: \"60174921-7963-4d63-83e7-8b702d3e9dd2\") " pod="openstack/cinder-scheduler-0" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.456276 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60174921-7963-4d63-83e7-8b702d3e9dd2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"60174921-7963-4d63-83e7-8b702d3e9dd2\") " pod="openstack/cinder-scheduler-0" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.456311 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60174921-7963-4d63-83e7-8b702d3e9dd2-config-data\") pod \"cinder-scheduler-0\" (UID: \"60174921-7963-4d63-83e7-8b702d3e9dd2\") " pod="openstack/cinder-scheduler-0" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.456340 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60174921-7963-4d63-83e7-8b702d3e9dd2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"60174921-7963-4d63-83e7-8b702d3e9dd2\") " pod="openstack/cinder-scheduler-0" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.456368 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60174921-7963-4d63-83e7-8b702d3e9dd2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"60174921-7963-4d63-83e7-8b702d3e9dd2\") " pod="openstack/cinder-scheduler-0" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.457001 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60174921-7963-4d63-83e7-8b702d3e9dd2-scripts\") pod \"cinder-scheduler-0\" (UID: \"60174921-7963-4d63-83e7-8b702d3e9dd2\") " pod="openstack/cinder-scheduler-0" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.457568 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60174921-7963-4d63-83e7-8b702d3e9dd2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"60174921-7963-4d63-83e7-8b702d3e9dd2\") " pod="openstack/cinder-scheduler-0" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.459690 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60174921-7963-4d63-83e7-8b702d3e9dd2-config-data-custom\") pod \"cinder-scheduler-0\" 
(UID: \"60174921-7963-4d63-83e7-8b702d3e9dd2\") " pod="openstack/cinder-scheduler-0" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.461939 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60174921-7963-4d63-83e7-8b702d3e9dd2-scripts\") pod \"cinder-scheduler-0\" (UID: \"60174921-7963-4d63-83e7-8b702d3e9dd2\") " pod="openstack/cinder-scheduler-0" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.464999 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60174921-7963-4d63-83e7-8b702d3e9dd2-config-data\") pod \"cinder-scheduler-0\" (UID: \"60174921-7963-4d63-83e7-8b702d3e9dd2\") " pod="openstack/cinder-scheduler-0" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.467456 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60174921-7963-4d63-83e7-8b702d3e9dd2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"60174921-7963-4d63-83e7-8b702d3e9dd2\") " pod="openstack/cinder-scheduler-0" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.486436 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmwwv\" (UniqueName: \"kubernetes.io/projected/60174921-7963-4d63-83e7-8b702d3e9dd2-kube-api-access-qmwwv\") pod \"cinder-scheduler-0\" (UID: \"60174921-7963-4d63-83e7-8b702d3e9dd2\") " pod="openstack/cinder-scheduler-0" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.544834 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-8fb8b568-c7bl2"] Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.591361 4904 generic.go:334] "Generic (PLEG): container finished" podID="70255ecd-6213-48cd-bd22-e6e14bbc497d" containerID="416a767194406302c2c55b712023912e302ab5e79d5dfb6ebf25bba6819f4308" exitCode=137 Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.591442 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-cf6d474f7-dfz4j" event={"ID":"70255ecd-6213-48cd-bd22-e6e14bbc497d","Type":"ContainerDied","Data":"416a767194406302c2c55b712023912e302ab5e79d5dfb6ebf25bba6819f4308"} Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.599503 4904 generic.go:334] "Generic (PLEG): container finished" podID="81a8e33f-1179-4192-9d21-b0f520c41656" containerID="dff968e734be0fd0f4ef4c83170804ebea5cc8e309c1b604bcd46a28277a53a3" exitCode=0 Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.599527 4904 generic.go:334] "Generic (PLEG): container finished" podID="81a8e33f-1179-4192-9d21-b0f520c41656" containerID="c80d41f49bbce44078ae017a703ab69465e7e3cab4f393288fb4a4b5e4ae8490" exitCode=2 Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.599558 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81a8e33f-1179-4192-9d21-b0f520c41656","Type":"ContainerDied","Data":"dff968e734be0fd0f4ef4c83170804ebea5cc8e309c1b604bcd46a28277a53a3"} Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.599580 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81a8e33f-1179-4192-9d21-b0f520c41656","Type":"ContainerDied","Data":"c80d41f49bbce44078ae017a703ab69465e7e3cab4f393288fb4a4b5e4ae8490"} Dec 05 20:33:39 crc kubenswrapper[4904]: W1205 20:33:39.600827 4904 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83ab1170_b41e_4a13_b227_8b67b86587cc.slice/crio-7bf831141f7e47d3ed652017f82ae3df90cffb40b6a82e05dfa79f0f6f138d45 WatchSource:0}: Error finding container 7bf831141f7e47d3ed652017f82ae3df90cffb40b6a82e05dfa79f0f6f138d45: Status 404 returned error can't find the container with id 7bf831141f7e47d3ed652017f82ae3df90cffb40b6a82e05dfa79f0f6f138d45 Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.601226 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dadf2169-1b54-4e50-adff-71504b526259","Type":"ContainerStarted","Data":"6189663809ddb730efd8b4ca771fe2a1b4c6e22d3075ed952702941cf030ad67"} Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.706830 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.709843 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="310cb505-37fe-420a-8012-3840d7ede2f0" path="/var/lib/kubelet/pods/310cb505-37fe-420a-8012-3840d7ede2f0/volumes" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.718620 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8976c9f0-073f-425d-a9b9-f66c96578f31" path="/var/lib/kubelet/pods/8976c9f0-073f-425d-a9b9-f66c96578f31/volumes" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.719537 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa95b130-3627-4fa6-9bf8-1760d3b24843" path="/var/lib/kubelet/pods/fa95b130-3627-4fa6-9bf8-1760d3b24843/volumes" Dec 05 20:33:39 crc kubenswrapper[4904]: I1205 20:33:39.968925 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6d8548b4f5-cmq7k"] Dec 05 20:33:39 crc kubenswrapper[4904]: W1205 20:33:39.990280 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7af9740e_f05b_4d6d_9075_b7038018de84.slice/crio-bf1e28357a390196987ee01b17e16554fb72825c7fd041081074614040677921 WatchSource:0}: Error finding container bf1e28357a390196987ee01b17e16554fb72825c7fd041081074614040677921: Status 404 returned error can't find the container with id bf1e28357a390196987ee01b17e16554fb72825c7fd041081074614040677921 Dec 05 20:33:40 crc kubenswrapper[4904]: I1205 20:33:40.113468 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-658fd98679-9cr7m"] Dec 05 20:33:40 crc kubenswrapper[4904]: W1205 20:33:40.149677 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfab0d2a9_0213_464a_b0a9_3e7338d7f40d.slice/crio-30c4e5830a294a7190d0a7efcb49f441a09ec0249d089a42981bf09c66ba8d29 WatchSource:0}: Error finding container 30c4e5830a294a7190d0a7efcb49f441a09ec0249d089a42981bf09c66ba8d29: Status 404 returned error can't find the container with id 30c4e5830a294a7190d0a7efcb49f441a09ec0249d089a42981bf09c66ba8d29 Dec 05 20:33:40 crc kubenswrapper[4904]: I1205 20:33:40.323288 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-597c5664fd-tbwsr"] Dec 05 20:33:40 crc kubenswrapper[4904]: I1205 20:33:40.330687 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-cf6d474f7-dfz4j" Dec 05 20:33:40 crc kubenswrapper[4904]: W1205 20:33:40.347477 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7fa79fda_f8c4_4866_8a82_5e2bc4cdb317.slice/crio-87a3ad10ba7e7138a50f0020c99c1d0a819946a80561bcf8040af52ecf079ead WatchSource:0}: Error finding container 87a3ad10ba7e7138a50f0020c99c1d0a819946a80561bcf8040af52ecf079ead: Status 404 returned error can't find the container with id 87a3ad10ba7e7138a50f0020c99c1d0a819946a80561bcf8040af52ecf079ead Dec 05 20:33:40 crc kubenswrapper[4904]: I1205 20:33:40.360844 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Dec 05 20:33:40 crc kubenswrapper[4904]: I1205 20:33:40.380701 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70255ecd-6213-48cd-bd22-e6e14bbc497d-scripts\") pod \"70255ecd-6213-48cd-bd22-e6e14bbc497d\" (UID: \"70255ecd-6213-48cd-bd22-e6e14bbc497d\") " Dec 05 20:33:40 crc kubenswrapper[4904]: I1205 20:33:40.380784 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/70255ecd-6213-48cd-bd22-e6e14bbc497d-horizon-secret-key\") pod \"70255ecd-6213-48cd-bd22-e6e14bbc497d\" (UID: \"70255ecd-6213-48cd-bd22-e6e14bbc497d\") " Dec 05 20:33:40 crc kubenswrapper[4904]: I1205 20:33:40.380844 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gwww\" (UniqueName: \"kubernetes.io/projected/70255ecd-6213-48cd-bd22-e6e14bbc497d-kube-api-access-4gwww\") pod \"70255ecd-6213-48cd-bd22-e6e14bbc497d\" (UID: \"70255ecd-6213-48cd-bd22-e6e14bbc497d\") " Dec 05 20:33:40 crc kubenswrapper[4904]: I1205 20:33:40.380969 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70255ecd-6213-48cd-bd22-e6e14bbc497d-logs\") pod \"70255ecd-6213-48cd-bd22-e6e14bbc497d\" (UID: \"70255ecd-6213-48cd-bd22-e6e14bbc497d\") " Dec 05 20:33:40 crc kubenswrapper[4904]: I1205 20:33:40.381018 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70255ecd-6213-48cd-bd22-e6e14bbc497d-config-data\") pod \"70255ecd-6213-48cd-bd22-e6e14bbc497d\" (UID: \"70255ecd-6213-48cd-bd22-e6e14bbc497d\") " Dec 05 20:33:40 crc kubenswrapper[4904]: I1205 20:33:40.382510 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70255ecd-6213-48cd-bd22-e6e14bbc497d-logs" (OuterVolumeSpecName: "logs") pod "70255ecd-6213-48cd-bd22-e6e14bbc497d" (UID: "70255ecd-6213-48cd-bd22-e6e14bbc497d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:33:40 crc kubenswrapper[4904]: I1205 20:33:40.387404 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70255ecd-6213-48cd-bd22-e6e14bbc497d-kube-api-access-4gwww" (OuterVolumeSpecName: "kube-api-access-4gwww") pod "70255ecd-6213-48cd-bd22-e6e14bbc497d" (UID: "70255ecd-6213-48cd-bd22-e6e14bbc497d"). InnerVolumeSpecName "kube-api-access-4gwww". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:33:40 crc kubenswrapper[4904]: I1205 20:33:40.389393 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70255ecd-6213-48cd-bd22-e6e14bbc497d-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "70255ecd-6213-48cd-bd22-e6e14bbc497d" (UID: "70255ecd-6213-48cd-bd22-e6e14bbc497d"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:40 crc kubenswrapper[4904]: I1205 20:33:40.411127 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70255ecd-6213-48cd-bd22-e6e14bbc497d-config-data" (OuterVolumeSpecName: "config-data") pod "70255ecd-6213-48cd-bd22-e6e14bbc497d" (UID: "70255ecd-6213-48cd-bd22-e6e14bbc497d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:33:40 crc kubenswrapper[4904]: I1205 20:33:40.469689 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70255ecd-6213-48cd-bd22-e6e14bbc497d-scripts" (OuterVolumeSpecName: "scripts") pod "70255ecd-6213-48cd-bd22-e6e14bbc497d" (UID: "70255ecd-6213-48cd-bd22-e6e14bbc497d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:33:40 crc kubenswrapper[4904]: I1205 20:33:40.483776 4904 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70255ecd-6213-48cd-bd22-e6e14bbc497d-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:40 crc kubenswrapper[4904]: I1205 20:33:40.483814 4904 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/70255ecd-6213-48cd-bd22-e6e14bbc497d-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:40 crc kubenswrapper[4904]: I1205 20:33:40.483830 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gwww\" (UniqueName: \"kubernetes.io/projected/70255ecd-6213-48cd-bd22-e6e14bbc497d-kube-api-access-4gwww\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:40 crc kubenswrapper[4904]: I1205 20:33:40.483840 4904 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70255ecd-6213-48cd-bd22-e6e14bbc497d-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:40 crc kubenswrapper[4904]: I1205 20:33:40.483852 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70255ecd-6213-48cd-bd22-e6e14bbc497d-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:40 crc kubenswrapper[4904]: I1205 20:33:40.576941 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 20:33:40 crc kubenswrapper[4904]: W1205 20:33:40.600841 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60174921_7963_4d63_83e7_8b702d3e9dd2.slice/crio-a406d3f09d41ba18505191c872a12e3b8ab8ec38c4225139967f2da0dd7f40c3 WatchSource:0}: Error finding container a406d3f09d41ba18505191c872a12e3b8ab8ec38c4225139967f2da0dd7f40c3: Status 404 returned error can't find the container with id a406d3f09d41ba18505191c872a12e3b8ab8ec38c4225139967f2da0dd7f40c3 Dec 05 20:33:40 crc kubenswrapper[4904]: I1205 20:33:40.616367 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-597c5664fd-tbwsr" 
event={"ID":"7fa79fda-f8c4-4866-8a82-5e2bc4cdb317","Type":"ContainerStarted","Data":"87a3ad10ba7e7138a50f0020c99c1d0a819946a80561bcf8040af52ecf079ead"} Dec 05 20:33:40 crc kubenswrapper[4904]: I1205 20:33:40.619149 4904 generic.go:334] "Generic (PLEG): container finished" podID="70255ecd-6213-48cd-bd22-e6e14bbc497d" containerID="99d86d8381f7ca45382704af8af4b6ac85f3a92a9057ef6d1984ede8b3ab507f" exitCode=137 Dec 05 20:33:40 crc kubenswrapper[4904]: I1205 20:33:40.619203 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-cf6d474f7-dfz4j" Dec 05 20:33:40 crc kubenswrapper[4904]: I1205 20:33:40.619216 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-cf6d474f7-dfz4j" event={"ID":"70255ecd-6213-48cd-bd22-e6e14bbc497d","Type":"ContainerDied","Data":"99d86d8381f7ca45382704af8af4b6ac85f3a92a9057ef6d1984ede8b3ab507f"} Dec 05 20:33:40 crc kubenswrapper[4904]: I1205 20:33:40.619247 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-cf6d474f7-dfz4j" event={"ID":"70255ecd-6213-48cd-bd22-e6e14bbc497d","Type":"ContainerDied","Data":"f22feab61a5298c4ea9d05d764626891bdca6c9f729ed83ae55744fadc6122dc"} Dec 05 20:33:40 crc kubenswrapper[4904]: I1205 20:33:40.619263 4904 scope.go:117] "RemoveContainer" containerID="416a767194406302c2c55b712023912e302ab5e79d5dfb6ebf25bba6819f4308" Dec 05 20:33:40 crc kubenswrapper[4904]: I1205 20:33:40.623460 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8fb8b568-c7bl2" event={"ID":"83ab1170-b41e-4a13-b227-8b67b86587cc","Type":"ContainerStarted","Data":"7bf831141f7e47d3ed652017f82ae3df90cffb40b6a82e05dfa79f0f6f138d45"} Dec 05 20:33:40 crc kubenswrapper[4904]: I1205 20:33:40.625236 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d8548b4f5-cmq7k" event={"ID":"7af9740e-f05b-4d6d-9075-b7038018de84","Type":"ContainerStarted","Data":"bf1e28357a390196987ee01b17e16554fb72825c7fd041081074614040677921"} Dec 05 20:33:40 crc kubenswrapper[4904]: I1205 20:33:40.629643 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"60174921-7963-4d63-83e7-8b702d3e9dd2","Type":"ContainerStarted","Data":"a406d3f09d41ba18505191c872a12e3b8ab8ec38c4225139967f2da0dd7f40c3"} Dec 05 20:33:40 crc kubenswrapper[4904]: I1205 20:33:40.635012 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dadf2169-1b54-4e50-adff-71504b526259","Type":"ContainerStarted","Data":"3ce656ed78691b1887639e3e02039748d4a43a325e0f9eb4db0e7333291d8f14"} Dec 05 20:33:40 crc kubenswrapper[4904]: I1205 20:33:40.638395 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"3420fce6-9733-45e2-8167-a91c39e372af","Type":"ContainerStarted","Data":"93fc59475144f4440d0c606bb53dc644487806f3100cf2e603e498113ab787be"} Dec 05 20:33:40 crc kubenswrapper[4904]: I1205 20:33:40.642243 4904 generic.go:334] "Generic (PLEG): container finished" podID="fab0d2a9-0213-464a-b0a9-3e7338d7f40d" containerID="5b956ca2d2422268ca9a210f5efdadc3fa55be15dccc8c6b3995d66f340fe3d5" exitCode=0 Dec 05 20:33:40 crc kubenswrapper[4904]: I1205 20:33:40.642342 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658fd98679-9cr7m" event={"ID":"fab0d2a9-0213-464a-b0a9-3e7338d7f40d","Type":"ContainerDied","Data":"5b956ca2d2422268ca9a210f5efdadc3fa55be15dccc8c6b3995d66f340fe3d5"} Dec 05 20:33:40 crc kubenswrapper[4904]: I1205 
20:33:40.642377 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658fd98679-9cr7m" event={"ID":"fab0d2a9-0213-464a-b0a9-3e7338d7f40d","Type":"ContainerStarted","Data":"30c4e5830a294a7190d0a7efcb49f441a09ec0249d089a42981bf09c66ba8d29"} Dec 05 20:33:40 crc kubenswrapper[4904]: I1205 20:33:40.690132 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-cf6d474f7-dfz4j"] Dec 05 20:33:40 crc kubenswrapper[4904]: I1205 20:33:40.700267 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-cf6d474f7-dfz4j"] Dec 05 20:33:40 crc kubenswrapper[4904]: I1205 20:33:40.726003 4904 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6867ddbddb-4lg6w" podUID="499a09a1-c3aa-4fa2-95a8-0d2896a1d978" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.159:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.159:8443: connect: connection refused" Dec 05 20:33:40 crc kubenswrapper[4904]: I1205 20:33:40.847337 4904 scope.go:117] "RemoveContainer" containerID="99d86d8381f7ca45382704af8af4b6ac85f3a92a9057ef6d1984ede8b3ab507f" Dec 05 20:33:40 crc kubenswrapper[4904]: I1205 20:33:40.893432 4904 scope.go:117] "RemoveContainer" containerID="416a767194406302c2c55b712023912e302ab5e79d5dfb6ebf25bba6819f4308" Dec 05 20:33:40 crc kubenswrapper[4904]: E1205 20:33:40.893937 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"416a767194406302c2c55b712023912e302ab5e79d5dfb6ebf25bba6819f4308\": container with ID starting with 416a767194406302c2c55b712023912e302ab5e79d5dfb6ebf25bba6819f4308 not found: ID does not exist" containerID="416a767194406302c2c55b712023912e302ab5e79d5dfb6ebf25bba6819f4308" Dec 05 20:33:40 crc kubenswrapper[4904]: I1205 20:33:40.893992 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"416a767194406302c2c55b712023912e302ab5e79d5dfb6ebf25bba6819f4308"} err="failed to get container status \"416a767194406302c2c55b712023912e302ab5e79d5dfb6ebf25bba6819f4308\": rpc error: code = NotFound desc = could not find container \"416a767194406302c2c55b712023912e302ab5e79d5dfb6ebf25bba6819f4308\": container with ID starting with 416a767194406302c2c55b712023912e302ab5e79d5dfb6ebf25bba6819f4308 not found: ID does not exist" Dec 05 20:33:40 crc kubenswrapper[4904]: I1205 20:33:40.894026 4904 scope.go:117] "RemoveContainer" containerID="99d86d8381f7ca45382704af8af4b6ac85f3a92a9057ef6d1984ede8b3ab507f" Dec 05 20:33:40 crc kubenswrapper[4904]: E1205 20:33:40.894679 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99d86d8381f7ca45382704af8af4b6ac85f3a92a9057ef6d1984ede8b3ab507f\": container with ID starting with 99d86d8381f7ca45382704af8af4b6ac85f3a92a9057ef6d1984ede8b3ab507f not found: ID does not exist" containerID="99d86d8381f7ca45382704af8af4b6ac85f3a92a9057ef6d1984ede8b3ab507f" Dec 05 20:33:40 crc kubenswrapper[4904]: I1205 20:33:40.894708 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99d86d8381f7ca45382704af8af4b6ac85f3a92a9057ef6d1984ede8b3ab507f"} err="failed to get container status \"99d86d8381f7ca45382704af8af4b6ac85f3a92a9057ef6d1984ede8b3ab507f\": rpc error: code = NotFound desc = could not find container \"99d86d8381f7ca45382704af8af4b6ac85f3a92a9057ef6d1984ede8b3ab507f\": container with ID starting with 
99d86d8381f7ca45382704af8af4b6ac85f3a92a9057ef6d1984ede8b3ab507f not found: ID does not exist" Dec 05 20:33:41 crc kubenswrapper[4904]: I1205 20:33:41.719009 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70255ecd-6213-48cd-bd22-e6e14bbc497d" path="/var/lib/kubelet/pods/70255ecd-6213-48cd-bd22-e6e14bbc497d/volumes" Dec 05 20:33:41 crc kubenswrapper[4904]: I1205 20:33:41.730006 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"60174921-7963-4d63-83e7-8b702d3e9dd2","Type":"ContainerStarted","Data":"e19363c4df734d3c7399b68d27f00eddaf3ddc694eb93f8b682bea368860cedb"} Dec 05 20:33:41 crc kubenswrapper[4904]: I1205 20:33:41.762741 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dadf2169-1b54-4e50-adff-71504b526259","Type":"ContainerStarted","Data":"2c14a78170fe152f04e34559a52ef0a3c78c5da28dbf78bc1a5abc5705717d91"} Dec 05 20:33:41 crc kubenswrapper[4904]: I1205 20:33:41.764569 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 05 20:33:41 crc kubenswrapper[4904]: I1205 20:33:41.766517 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"3420fce6-9733-45e2-8167-a91c39e372af","Type":"ContainerStarted","Data":"c4d13fcb4fee0443ab536165c27f431bedb1abd70ba2af7e23bb12d9cc88dea9"} Dec 05 20:33:41 crc kubenswrapper[4904]: I1205 20:33:41.791822 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-66d5cfdcdd-d59d6"] Dec 05 20:33:41 crc kubenswrapper[4904]: E1205 20:33:41.792321 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70255ecd-6213-48cd-bd22-e6e14bbc497d" containerName="horizon-log" Dec 05 20:33:41 crc kubenswrapper[4904]: I1205 20:33:41.792344 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="70255ecd-6213-48cd-bd22-e6e14bbc497d" containerName="horizon-log" Dec 05 20:33:41 crc kubenswrapper[4904]: E1205 20:33:41.792370 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70255ecd-6213-48cd-bd22-e6e14bbc497d" containerName="horizon" Dec 05 20:33:41 crc kubenswrapper[4904]: I1205 20:33:41.792379 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="70255ecd-6213-48cd-bd22-e6e14bbc497d" containerName="horizon" Dec 05 20:33:41 crc kubenswrapper[4904]: I1205 20:33:41.792640 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="70255ecd-6213-48cd-bd22-e6e14bbc497d" containerName="horizon-log" Dec 05 20:33:41 crc kubenswrapper[4904]: I1205 20:33:41.792662 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="70255ecd-6213-48cd-bd22-e6e14bbc497d" containerName="horizon" Dec 05 20:33:41 crc kubenswrapper[4904]: I1205 20:33:41.794862 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-66d5cfdcdd-d59d6" Dec 05 20:33:41 crc kubenswrapper[4904]: I1205 20:33:41.811269 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 05 20:33:41 crc kubenswrapper[4904]: I1205 20:33:41.811498 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 05 20:33:41 crc kubenswrapper[4904]: I1205 20:33:41.811645 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Dec 05 20:33:41 crc kubenswrapper[4904]: I1205 20:33:41.812399 4904 scope.go:117] "RemoveContainer" containerID="a00259f9a634df7782c63e78e4add8b555e20a36adcc54a889e7bf30dc526538" Dec 05 20:33:41 crc kubenswrapper[4904]: E1205 20:33:41.812689 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(28e509d6-d15b-44e6-9afa-05a347c2a7a5)\"" pod="openstack/watcher-decision-engine-0" podUID="28e509d6-d15b-44e6-9afa-05a347c2a7a5" Dec 05 20:33:41 crc kubenswrapper[4904]: I1205 20:33:41.819849 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f786b88c-ea37-4a02-bdd1-1f9feca9993a-combined-ca-bundle\") pod \"barbican-api-66d5cfdcdd-d59d6\" (UID: \"f786b88c-ea37-4a02-bdd1-1f9feca9993a\") " pod="openstack/barbican-api-66d5cfdcdd-d59d6" Dec 05 20:33:41 crc kubenswrapper[4904]: I1205 20:33:41.819933 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f786b88c-ea37-4a02-bdd1-1f9feca9993a-public-tls-certs\") pod \"barbican-api-66d5cfdcdd-d59d6\" (UID: \"f786b88c-ea37-4a02-bdd1-1f9feca9993a\") " pod="openstack/barbican-api-66d5cfdcdd-d59d6" Dec 05 20:33:41 crc kubenswrapper[4904]: I1205 20:33:41.819953 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5qhw\" (UniqueName: \"kubernetes.io/projected/f786b88c-ea37-4a02-bdd1-1f9feca9993a-kube-api-access-z5qhw\") pod \"barbican-api-66d5cfdcdd-d59d6\" (UID: \"f786b88c-ea37-4a02-bdd1-1f9feca9993a\") " pod="openstack/barbican-api-66d5cfdcdd-d59d6" Dec 05 20:33:41 crc kubenswrapper[4904]: I1205 20:33:41.819988 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f786b88c-ea37-4a02-bdd1-1f9feca9993a-logs\") pod \"barbican-api-66d5cfdcdd-d59d6\" (UID: \"f786b88c-ea37-4a02-bdd1-1f9feca9993a\") " pod="openstack/barbican-api-66d5cfdcdd-d59d6" Dec 05 20:33:41 crc kubenswrapper[4904]: I1205 20:33:41.820030 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f786b88c-ea37-4a02-bdd1-1f9feca9993a-config-data-custom\") pod \"barbican-api-66d5cfdcdd-d59d6\" (UID: \"f786b88c-ea37-4a02-bdd1-1f9feca9993a\") " pod="openstack/barbican-api-66d5cfdcdd-d59d6" Dec 05 20:33:41 crc kubenswrapper[4904]: I1205 20:33:41.820074 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f786b88c-ea37-4a02-bdd1-1f9feca9993a-config-data\") pod 
\"barbican-api-66d5cfdcdd-d59d6\" (UID: \"f786b88c-ea37-4a02-bdd1-1f9feca9993a\") " pod="openstack/barbican-api-66d5cfdcdd-d59d6" Dec 05 20:33:41 crc kubenswrapper[4904]: I1205 20:33:41.820094 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f786b88c-ea37-4a02-bdd1-1f9feca9993a-internal-tls-certs\") pod \"barbican-api-66d5cfdcdd-d59d6\" (UID: \"f786b88c-ea37-4a02-bdd1-1f9feca9993a\") " pod="openstack/barbican-api-66d5cfdcdd-d59d6" Dec 05 20:33:41 crc kubenswrapper[4904]: I1205 20:33:41.829303 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-66d5cfdcdd-d59d6"] Dec 05 20:33:41 crc kubenswrapper[4904]: I1205 20:33:41.871927 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-597c5664fd-tbwsr" event={"ID":"7fa79fda-f8c4-4866-8a82-5e2bc4cdb317","Type":"ContainerStarted","Data":"550efeb669df02f7db7cd10983dc0bcf775f15ed4def68be1b3dbd72322cab36"} Dec 05 20:33:41 crc kubenswrapper[4904]: I1205 20:33:41.922800 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f786b88c-ea37-4a02-bdd1-1f9feca9993a-combined-ca-bundle\") pod \"barbican-api-66d5cfdcdd-d59d6\" (UID: \"f786b88c-ea37-4a02-bdd1-1f9feca9993a\") " pod="openstack/barbican-api-66d5cfdcdd-d59d6" Dec 05 20:33:41 crc kubenswrapper[4904]: I1205 20:33:41.923010 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f786b88c-ea37-4a02-bdd1-1f9feca9993a-public-tls-certs\") pod \"barbican-api-66d5cfdcdd-d59d6\" (UID: \"f786b88c-ea37-4a02-bdd1-1f9feca9993a\") " pod="openstack/barbican-api-66d5cfdcdd-d59d6" Dec 05 20:33:41 crc kubenswrapper[4904]: I1205 20:33:41.923079 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5qhw\" (UniqueName: \"kubernetes.io/projected/f786b88c-ea37-4a02-bdd1-1f9feca9993a-kube-api-access-z5qhw\") pod \"barbican-api-66d5cfdcdd-d59d6\" (UID: \"f786b88c-ea37-4a02-bdd1-1f9feca9993a\") " pod="openstack/barbican-api-66d5cfdcdd-d59d6" Dec 05 20:33:41 crc kubenswrapper[4904]: I1205 20:33:41.923133 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f786b88c-ea37-4a02-bdd1-1f9feca9993a-logs\") pod \"barbican-api-66d5cfdcdd-d59d6\" (UID: \"f786b88c-ea37-4a02-bdd1-1f9feca9993a\") " pod="openstack/barbican-api-66d5cfdcdd-d59d6" Dec 05 20:33:41 crc kubenswrapper[4904]: I1205 20:33:41.923237 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f786b88c-ea37-4a02-bdd1-1f9feca9993a-config-data-custom\") pod \"barbican-api-66d5cfdcdd-d59d6\" (UID: \"f786b88c-ea37-4a02-bdd1-1f9feca9993a\") " pod="openstack/barbican-api-66d5cfdcdd-d59d6" Dec 05 20:33:41 crc kubenswrapper[4904]: I1205 20:33:41.923316 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f786b88c-ea37-4a02-bdd1-1f9feca9993a-config-data\") pod \"barbican-api-66d5cfdcdd-d59d6\" (UID: \"f786b88c-ea37-4a02-bdd1-1f9feca9993a\") " pod="openstack/barbican-api-66d5cfdcdd-d59d6" Dec 05 20:33:41 crc kubenswrapper[4904]: I1205 20:33:41.923363 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f786b88c-ea37-4a02-bdd1-1f9feca9993a-internal-tls-certs\") pod \"barbican-api-66d5cfdcdd-d59d6\" (UID: \"f786b88c-ea37-4a02-bdd1-1f9feca9993a\") " pod="openstack/barbican-api-66d5cfdcdd-d59d6" Dec 05 20:33:41 crc kubenswrapper[4904]: I1205 20:33:41.925607 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f786b88c-ea37-4a02-bdd1-1f9feca9993a-logs\") pod \"barbican-api-66d5cfdcdd-d59d6\" (UID: \"f786b88c-ea37-4a02-bdd1-1f9feca9993a\") " pod="openstack/barbican-api-66d5cfdcdd-d59d6" Dec 05 20:33:41 crc kubenswrapper[4904]: I1205 20:33:41.934226 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f786b88c-ea37-4a02-bdd1-1f9feca9993a-config-data\") pod \"barbican-api-66d5cfdcdd-d59d6\" (UID: \"f786b88c-ea37-4a02-bdd1-1f9feca9993a\") " pod="openstack/barbican-api-66d5cfdcdd-d59d6" Dec 05 20:33:41 crc kubenswrapper[4904]: I1205 20:33:41.938686 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f786b88c-ea37-4a02-bdd1-1f9feca9993a-combined-ca-bundle\") pod \"barbican-api-66d5cfdcdd-d59d6\" (UID: \"f786b88c-ea37-4a02-bdd1-1f9feca9993a\") " pod="openstack/barbican-api-66d5cfdcdd-d59d6" Dec 05 20:33:41 crc kubenswrapper[4904]: I1205 20:33:41.941030 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f786b88c-ea37-4a02-bdd1-1f9feca9993a-internal-tls-certs\") pod \"barbican-api-66d5cfdcdd-d59d6\" (UID: \"f786b88c-ea37-4a02-bdd1-1f9feca9993a\") " pod="openstack/barbican-api-66d5cfdcdd-d59d6" Dec 05 20:33:41 crc kubenswrapper[4904]: I1205 20:33:41.951694 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f786b88c-ea37-4a02-bdd1-1f9feca9993a-public-tls-certs\") pod \"barbican-api-66d5cfdcdd-d59d6\" (UID: \"f786b88c-ea37-4a02-bdd1-1f9feca9993a\") " pod="openstack/barbican-api-66d5cfdcdd-d59d6" Dec 05 20:33:41 crc kubenswrapper[4904]: I1205 20:33:41.977027 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f786b88c-ea37-4a02-bdd1-1f9feca9993a-config-data-custom\") pod \"barbican-api-66d5cfdcdd-d59d6\" (UID: \"f786b88c-ea37-4a02-bdd1-1f9feca9993a\") " pod="openstack/barbican-api-66d5cfdcdd-d59d6" Dec 05 20:33:41 crc kubenswrapper[4904]: I1205 20:33:41.977484 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5qhw\" (UniqueName: \"kubernetes.io/projected/f786b88c-ea37-4a02-bdd1-1f9feca9993a-kube-api-access-z5qhw\") pod \"barbican-api-66d5cfdcdd-d59d6\" (UID: \"f786b88c-ea37-4a02-bdd1-1f9feca9993a\") " pod="openstack/barbican-api-66d5cfdcdd-d59d6" Dec 05 20:33:42 crc kubenswrapper[4904]: I1205 20:33:42.102162 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=13.102137289 podStartE2EDuration="13.102137289s" podCreationTimestamp="2025-12-05 20:33:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:33:42.092934817 +0000 UTC m=+1320.904150946" watchObservedRunningTime="2025-12-05 20:33:42.102137289 +0000 UTC m=+1320.913353398" Dec 05 20:33:42 crc kubenswrapper[4904]: I1205 20:33:42.103256 4904 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/neutron-78cf6bb7c7-zmmjl" Dec 05 20:33:42 crc kubenswrapper[4904]: I1205 20:33:42.114169 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-66d5cfdcdd-d59d6" Dec 05 20:33:42 crc kubenswrapper[4904]: I1205 20:33:42.237457 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5c784f7484-wz74b"] Dec 05 20:33:42 crc kubenswrapper[4904]: I1205 20:33:42.238255 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5c784f7484-wz74b" podUID="eabc2178-31e6-49e7-a78c-58dd769c751c" containerName="neutron-httpd" containerID="cri-o://b5be99277c88811d304bbae5405f89cf3c2044e5d84e95d8345e25557ca499b9" gracePeriod=30 Dec 05 20:33:42 crc kubenswrapper[4904]: I1205 20:33:42.237923 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5c784f7484-wz74b" podUID="eabc2178-31e6-49e7-a78c-58dd769c751c" containerName="neutron-api" containerID="cri-o://249c27d8712414588823e8bbf71d00a39ef7a2665c4aa1924a6d570149efd9ea" gracePeriod=30 Dec 05 20:33:42 crc kubenswrapper[4904]: I1205 20:33:42.882140 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658fd98679-9cr7m" event={"ID":"fab0d2a9-0213-464a-b0a9-3e7338d7f40d","Type":"ContainerStarted","Data":"3e1426e49827abe7d0fa40fcc9bf0c9463a37dedfc122c3c212bb7ab9b77088b"} Dec 05 20:33:42 crc kubenswrapper[4904]: I1205 20:33:42.882229 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-658fd98679-9cr7m" Dec 05 20:33:42 crc kubenswrapper[4904]: I1205 20:33:42.884268 4904 generic.go:334] "Generic (PLEG): container finished" podID="eabc2178-31e6-49e7-a78c-58dd769c751c" containerID="b5be99277c88811d304bbae5405f89cf3c2044e5d84e95d8345e25557ca499b9" exitCode=0 Dec 05 20:33:42 crc kubenswrapper[4904]: I1205 20:33:42.884352 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c784f7484-wz74b" event={"ID":"eabc2178-31e6-49e7-a78c-58dd769c751c","Type":"ContainerDied","Data":"b5be99277c88811d304bbae5405f89cf3c2044e5d84e95d8345e25557ca499b9"} Dec 05 20:33:42 crc kubenswrapper[4904]: I1205 20:33:42.902476 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-658fd98679-9cr7m" podStartSLOduration=4.902461199 podStartE2EDuration="4.902461199s" podCreationTimestamp="2025-12-05 20:33:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:33:42.899817449 +0000 UTC m=+1321.711033568" watchObservedRunningTime="2025-12-05 20:33:42.902461199 +0000 UTC m=+1321.713677308" Dec 05 20:33:43 crc kubenswrapper[4904]: I1205 20:33:43.741416 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-66d5cfdcdd-d59d6"] Dec 05 20:33:43 crc kubenswrapper[4904]: I1205 20:33:43.909637 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-597c5664fd-tbwsr" event={"ID":"7fa79fda-f8c4-4866-8a82-5e2bc4cdb317","Type":"ContainerStarted","Data":"212d16e294e79b973fc8cd25119302971c0695123c34cad8dcb555eef0b2d043"} Dec 05 20:33:43 crc kubenswrapper[4904]: I1205 20:33:43.910711 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-597c5664fd-tbwsr" Dec 05 20:33:43 crc kubenswrapper[4904]: I1205 20:33:43.910740 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/barbican-api-597c5664fd-tbwsr" Dec 05 20:33:43 crc kubenswrapper[4904]: I1205 20:33:43.912123 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8fb8b568-c7bl2" event={"ID":"83ab1170-b41e-4a13-b227-8b67b86587cc","Type":"ContainerStarted","Data":"82c15b26b81b5b589f1f1a952c3c545d148152f83bcabd8100a4d275b3eed633"} Dec 05 20:33:43 crc kubenswrapper[4904]: I1205 20:33:43.914051 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d8548b4f5-cmq7k" event={"ID":"7af9740e-f05b-4d6d-9075-b7038018de84","Type":"ContainerStarted","Data":"72572f72642e5214e5246dedb00eaf75a534d9cbd4a464cebbe277fb202143eb"} Dec 05 20:33:43 crc kubenswrapper[4904]: I1205 20:33:43.916663 4904 generic.go:334] "Generic (PLEG): container finished" podID="81a8e33f-1179-4192-9d21-b0f520c41656" containerID="5eb179fbfe30f07744b88d2fb3bdfd23b7f5a02ed5ac324ec2c9cfb9333fbaa8" exitCode=0 Dec 05 20:33:43 crc kubenswrapper[4904]: I1205 20:33:43.916724 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81a8e33f-1179-4192-9d21-b0f520c41656","Type":"ContainerDied","Data":"5eb179fbfe30f07744b88d2fb3bdfd23b7f5a02ed5ac324ec2c9cfb9333fbaa8"} Dec 05 20:33:43 crc kubenswrapper[4904]: I1205 20:33:43.919661 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"3420fce6-9733-45e2-8167-a91c39e372af","Type":"ContainerStarted","Data":"cd0939fb41199c51ecd8b69fd3b98d8721c30967fb755ec989c72bb7b6bb72d7"} Dec 05 20:33:43 crc kubenswrapper[4904]: I1205 20:33:43.920164 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Dec 05 20:33:43 crc kubenswrapper[4904]: I1205 20:33:43.932334 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-66d5cfdcdd-d59d6" event={"ID":"f786b88c-ea37-4a02-bdd1-1f9feca9993a","Type":"ContainerStarted","Data":"3bb448753dbf9c17e72582a95b279dde7fa19997135cad04e1ddfafcefaaa4b0"} Dec 05 20:33:43 crc kubenswrapper[4904]: I1205 20:33:43.944107 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-597c5664fd-tbwsr" podStartSLOduration=5.944088861 podStartE2EDuration="5.944088861s" podCreationTimestamp="2025-12-05 20:33:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:33:43.940372434 +0000 UTC m=+1322.751588563" watchObservedRunningTime="2025-12-05 20:33:43.944088861 +0000 UTC m=+1322.755304970" Dec 05 20:33:43 crc kubenswrapper[4904]: I1205 20:33:43.969519 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=5.968378511 podStartE2EDuration="5.968378511s" podCreationTimestamp="2025-12-05 20:33:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:33:43.962910877 +0000 UTC m=+1322.774126996" watchObservedRunningTime="2025-12-05 20:33:43.968378511 +0000 UTC m=+1322.779594620" Dec 05 20:33:44 crc kubenswrapper[4904]: I1205 20:33:44.057072 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 20:33:44 crc kubenswrapper[4904]: I1205 20:33:44.095640 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81a8e33f-1179-4192-9d21-b0f520c41656-log-httpd\") pod \"81a8e33f-1179-4192-9d21-b0f520c41656\" (UID: \"81a8e33f-1179-4192-9d21-b0f520c41656\") " Dec 05 20:33:44 crc kubenswrapper[4904]: I1205 20:33:44.095701 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81a8e33f-1179-4192-9d21-b0f520c41656-config-data\") pod \"81a8e33f-1179-4192-9d21-b0f520c41656\" (UID: \"81a8e33f-1179-4192-9d21-b0f520c41656\") " Dec 05 20:33:44 crc kubenswrapper[4904]: I1205 20:33:44.095796 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81a8e33f-1179-4192-9d21-b0f520c41656-run-httpd\") pod \"81a8e33f-1179-4192-9d21-b0f520c41656\" (UID: \"81a8e33f-1179-4192-9d21-b0f520c41656\") " Dec 05 20:33:44 crc kubenswrapper[4904]: I1205 20:33:44.095869 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/81a8e33f-1179-4192-9d21-b0f520c41656-sg-core-conf-yaml\") pod \"81a8e33f-1179-4192-9d21-b0f520c41656\" (UID: \"81a8e33f-1179-4192-9d21-b0f520c41656\") " Dec 05 20:33:44 crc kubenswrapper[4904]: I1205 20:33:44.095976 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81a8e33f-1179-4192-9d21-b0f520c41656-combined-ca-bundle\") pod \"81a8e33f-1179-4192-9d21-b0f520c41656\" (UID: \"81a8e33f-1179-4192-9d21-b0f520c41656\") " Dec 05 20:33:44 crc kubenswrapper[4904]: I1205 20:33:44.096045 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crhbm\" (UniqueName: \"kubernetes.io/projected/81a8e33f-1179-4192-9d21-b0f520c41656-kube-api-access-crhbm\") pod \"81a8e33f-1179-4192-9d21-b0f520c41656\" (UID: \"81a8e33f-1179-4192-9d21-b0f520c41656\") " Dec 05 20:33:44 crc kubenswrapper[4904]: I1205 20:33:44.096156 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81a8e33f-1179-4192-9d21-b0f520c41656-scripts\") pod \"81a8e33f-1179-4192-9d21-b0f520c41656\" (UID: \"81a8e33f-1179-4192-9d21-b0f520c41656\") " Dec 05 20:33:44 crc kubenswrapper[4904]: I1205 20:33:44.097937 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81a8e33f-1179-4192-9d21-b0f520c41656-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "81a8e33f-1179-4192-9d21-b0f520c41656" (UID: "81a8e33f-1179-4192-9d21-b0f520c41656"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:33:44 crc kubenswrapper[4904]: I1205 20:33:44.098686 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81a8e33f-1179-4192-9d21-b0f520c41656-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "81a8e33f-1179-4192-9d21-b0f520c41656" (UID: "81a8e33f-1179-4192-9d21-b0f520c41656"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:33:44 crc kubenswrapper[4904]: I1205 20:33:44.102467 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81a8e33f-1179-4192-9d21-b0f520c41656-scripts" (OuterVolumeSpecName: "scripts") pod "81a8e33f-1179-4192-9d21-b0f520c41656" (UID: "81a8e33f-1179-4192-9d21-b0f520c41656"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:44 crc kubenswrapper[4904]: I1205 20:33:44.103643 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81a8e33f-1179-4192-9d21-b0f520c41656-kube-api-access-crhbm" (OuterVolumeSpecName: "kube-api-access-crhbm") pod "81a8e33f-1179-4192-9d21-b0f520c41656" (UID: "81a8e33f-1179-4192-9d21-b0f520c41656"). InnerVolumeSpecName "kube-api-access-crhbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:33:44 crc kubenswrapper[4904]: I1205 20:33:44.159205 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81a8e33f-1179-4192-9d21-b0f520c41656-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "81a8e33f-1179-4192-9d21-b0f520c41656" (UID: "81a8e33f-1179-4192-9d21-b0f520c41656"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:44 crc kubenswrapper[4904]: I1205 20:33:44.207544 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crhbm\" (UniqueName: \"kubernetes.io/projected/81a8e33f-1179-4192-9d21-b0f520c41656-kube-api-access-crhbm\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:44 crc kubenswrapper[4904]: I1205 20:33:44.207584 4904 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81a8e33f-1179-4192-9d21-b0f520c41656-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:44 crc kubenswrapper[4904]: I1205 20:33:44.207596 4904 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81a8e33f-1179-4192-9d21-b0f520c41656-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:44 crc kubenswrapper[4904]: I1205 20:33:44.207606 4904 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81a8e33f-1179-4192-9d21-b0f520c41656-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:44 crc kubenswrapper[4904]: I1205 20:33:44.207617 4904 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/81a8e33f-1179-4192-9d21-b0f520c41656-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:44 crc kubenswrapper[4904]: I1205 20:33:44.214611 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81a8e33f-1179-4192-9d21-b0f520c41656-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "81a8e33f-1179-4192-9d21-b0f520c41656" (UID: "81a8e33f-1179-4192-9d21-b0f520c41656"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:44 crc kubenswrapper[4904]: I1205 20:33:44.295973 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Dec 05 20:33:44 crc kubenswrapper[4904]: I1205 20:33:44.296255 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81a8e33f-1179-4192-9d21-b0f520c41656-config-data" (OuterVolumeSpecName: "config-data") pod "81a8e33f-1179-4192-9d21-b0f520c41656" (UID: "81a8e33f-1179-4192-9d21-b0f520c41656"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:44 crc kubenswrapper[4904]: I1205 20:33:44.310628 4904 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81a8e33f-1179-4192-9d21-b0f520c41656-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:44 crc kubenswrapper[4904]: I1205 20:33:44.310662 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81a8e33f-1179-4192-9d21-b0f520c41656-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:44 crc kubenswrapper[4904]: I1205 20:33:44.945598 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-66d5cfdcdd-d59d6" event={"ID":"f786b88c-ea37-4a02-bdd1-1f9feca9993a","Type":"ContainerStarted","Data":"428de35f00b7ed4e9a9d11b3bcc323793e0067fd0ae05c2a7ef308c489ec2a19"} Dec 05 20:33:44 crc kubenswrapper[4904]: I1205 20:33:44.945868 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-66d5cfdcdd-d59d6" Dec 05 20:33:44 crc kubenswrapper[4904]: I1205 20:33:44.945884 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-66d5cfdcdd-d59d6" event={"ID":"f786b88c-ea37-4a02-bdd1-1f9feca9993a","Type":"ContainerStarted","Data":"0b8ca98b8004bf7cd02300853bf06ddfc90a5dd70972dbefef9d8797237d1d9e"} Dec 05 20:33:44 crc kubenswrapper[4904]: I1205 20:33:44.945899 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-66d5cfdcdd-d59d6" Dec 05 20:33:44 crc kubenswrapper[4904]: I1205 20:33:44.961136 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8fb8b568-c7bl2" event={"ID":"83ab1170-b41e-4a13-b227-8b67b86587cc","Type":"ContainerStarted","Data":"aab8630a6b2f03169d6bd3fa24fca512a5fffb6c5b9e1e0586779903fd535046"} Dec 05 20:33:44 crc kubenswrapper[4904]: I1205 20:33:44.970599 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d8548b4f5-cmq7k" event={"ID":"7af9740e-f05b-4d6d-9075-b7038018de84","Type":"ContainerStarted","Data":"3684920e95f614665ab5456815413381a269747fff607cf9030dcb9dd0745fa3"} Dec 05 20:33:44 crc kubenswrapper[4904]: I1205 20:33:44.976395 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"60174921-7963-4d63-83e7-8b702d3e9dd2","Type":"ContainerStarted","Data":"af47f4bbfcfef98699b704337c61c36b328c2e2f160d2386f38b0a5c23141e3e"} Dec 05 20:33:44 crc kubenswrapper[4904]: I1205 20:33:44.993761 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81a8e33f-1179-4192-9d21-b0f520c41656","Type":"ContainerDied","Data":"f3c09fff7467111eb288365f5fa840b91d63c88891f5a9b9aebe8f70f37bf9a9"} Dec 05 20:33:44 crc kubenswrapper[4904]: I1205 20:33:44.993826 4904 scope.go:117] "RemoveContainer" 
containerID="dff968e734be0fd0f4ef4c83170804ebea5cc8e309c1b604bcd46a28277a53a3" Dec 05 20:33:44 crc kubenswrapper[4904]: I1205 20:33:44.994078 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 20:33:44 crc kubenswrapper[4904]: I1205 20:33:44.996801 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-66d5cfdcdd-d59d6" podStartSLOduration=3.996777635 podStartE2EDuration="3.996777635s" podCreationTimestamp="2025-12-05 20:33:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:33:44.987867411 +0000 UTC m=+1323.799083520" watchObservedRunningTime="2025-12-05 20:33:44.996777635 +0000 UTC m=+1323.807993744" Dec 05 20:33:45 crc kubenswrapper[4904]: I1205 20:33:45.013567 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6d8548b4f5-cmq7k" podStartSLOduration=3.777579415 podStartE2EDuration="7.013549207s" podCreationTimestamp="2025-12-05 20:33:38 +0000 UTC" firstStartedPulling="2025-12-05 20:33:40.014245022 +0000 UTC m=+1318.825461131" lastFinishedPulling="2025-12-05 20:33:43.250214814 +0000 UTC m=+1322.061430923" observedRunningTime="2025-12-05 20:33:45.0095343 +0000 UTC m=+1323.820750419" watchObservedRunningTime="2025-12-05 20:33:45.013549207 +0000 UTC m=+1323.824765306" Dec 05 20:33:45 crc kubenswrapper[4904]: I1205 20:33:45.057802 4904 scope.go:117] "RemoveContainer" containerID="c80d41f49bbce44078ae017a703ab69465e7e3cab4f393288fb4a4b5e4ae8490" Dec 05 20:33:45 crc kubenswrapper[4904]: I1205 20:33:45.099157 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-8fb8b568-c7bl2" podStartSLOduration=3.505818349 podStartE2EDuration="7.099134319s" podCreationTimestamp="2025-12-05 20:33:38 +0000 UTC" firstStartedPulling="2025-12-05 20:33:39.615586237 +0000 UTC m=+1318.426802346" lastFinishedPulling="2025-12-05 20:33:43.208902207 +0000 UTC m=+1322.020118316" observedRunningTime="2025-12-05 20:33:45.035339341 +0000 UTC m=+1323.846555450" watchObservedRunningTime="2025-12-05 20:33:45.099134319 +0000 UTC m=+1323.910350428" Dec 05 20:33:45 crc kubenswrapper[4904]: I1205 20:33:45.114887 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=6.114845653 podStartE2EDuration="6.114845653s" podCreationTimestamp="2025-12-05 20:33:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:33:45.06799274 +0000 UTC m=+1323.879208859" watchObservedRunningTime="2025-12-05 20:33:45.114845653 +0000 UTC m=+1323.926061762" Dec 05 20:33:45 crc kubenswrapper[4904]: I1205 20:33:45.159781 4904 scope.go:117] "RemoveContainer" containerID="5eb179fbfe30f07744b88d2fb3bdfd23b7f5a02ed5ac324ec2c9cfb9333fbaa8" Dec 05 20:33:45 crc kubenswrapper[4904]: I1205 20:33:45.234130 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:33:45 crc kubenswrapper[4904]: I1205 20:33:45.237495 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:33:45 crc kubenswrapper[4904]: I1205 20:33:45.248213 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:33:45 crc kubenswrapper[4904]: E1205 20:33:45.248802 4904 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="81a8e33f-1179-4192-9d21-b0f520c41656" containerName="ceilometer-notification-agent" Dec 05 20:33:45 crc kubenswrapper[4904]: I1205 20:33:45.248831 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="81a8e33f-1179-4192-9d21-b0f520c41656" containerName="ceilometer-notification-agent" Dec 05 20:33:45 crc kubenswrapper[4904]: E1205 20:33:45.248856 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81a8e33f-1179-4192-9d21-b0f520c41656" containerName="sg-core" Dec 05 20:33:45 crc kubenswrapper[4904]: I1205 20:33:45.248865 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="81a8e33f-1179-4192-9d21-b0f520c41656" containerName="sg-core" Dec 05 20:33:45 crc kubenswrapper[4904]: E1205 20:33:45.248877 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81a8e33f-1179-4192-9d21-b0f520c41656" containerName="proxy-httpd" Dec 05 20:33:45 crc kubenswrapper[4904]: I1205 20:33:45.248885 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="81a8e33f-1179-4192-9d21-b0f520c41656" containerName="proxy-httpd" Dec 05 20:33:45 crc kubenswrapper[4904]: I1205 20:33:45.249161 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="81a8e33f-1179-4192-9d21-b0f520c41656" containerName="proxy-httpd" Dec 05 20:33:45 crc kubenswrapper[4904]: I1205 20:33:45.249189 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="81a8e33f-1179-4192-9d21-b0f520c41656" containerName="sg-core" Dec 05 20:33:45 crc kubenswrapper[4904]: I1205 20:33:45.249208 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="81a8e33f-1179-4192-9d21-b0f520c41656" containerName="ceilometer-notification-agent" Dec 05 20:33:45 crc kubenswrapper[4904]: I1205 20:33:45.251438 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 20:33:45 crc kubenswrapper[4904]: I1205 20:33:45.256803 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:33:45 crc kubenswrapper[4904]: I1205 20:33:45.257364 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 20:33:45 crc kubenswrapper[4904]: I1205 20:33:45.257733 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 20:33:45 crc kubenswrapper[4904]: I1205 20:33:45.263522 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e22555a8-bc28-4b25-bd10-4fc9525276c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e22555a8-bc28-4b25-bd10-4fc9525276c3\") " pod="openstack/ceilometer-0" Dec 05 20:33:45 crc kubenswrapper[4904]: I1205 20:33:45.263652 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e22555a8-bc28-4b25-bd10-4fc9525276c3-scripts\") pod \"ceilometer-0\" (UID: \"e22555a8-bc28-4b25-bd10-4fc9525276c3\") " pod="openstack/ceilometer-0" Dec 05 20:33:45 crc kubenswrapper[4904]: I1205 20:33:45.263687 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e22555a8-bc28-4b25-bd10-4fc9525276c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e22555a8-bc28-4b25-bd10-4fc9525276c3\") " pod="openstack/ceilometer-0" Dec 05 20:33:45 crc kubenswrapper[4904]: I1205 20:33:45.263707 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e22555a8-bc28-4b25-bd10-4fc9525276c3-run-httpd\") pod \"ceilometer-0\" (UID: \"e22555a8-bc28-4b25-bd10-4fc9525276c3\") " pod="openstack/ceilometer-0" Dec 05 20:33:45 crc kubenswrapper[4904]: I1205 20:33:45.263730 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e22555a8-bc28-4b25-bd10-4fc9525276c3-config-data\") pod \"ceilometer-0\" (UID: \"e22555a8-bc28-4b25-bd10-4fc9525276c3\") " pod="openstack/ceilometer-0" Dec 05 20:33:45 crc kubenswrapper[4904]: I1205 20:33:45.263776 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e22555a8-bc28-4b25-bd10-4fc9525276c3-log-httpd\") pod \"ceilometer-0\" (UID: \"e22555a8-bc28-4b25-bd10-4fc9525276c3\") " pod="openstack/ceilometer-0" Dec 05 20:33:45 crc kubenswrapper[4904]: I1205 20:33:45.263793 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l48hl\" (UniqueName: \"kubernetes.io/projected/e22555a8-bc28-4b25-bd10-4fc9525276c3-kube-api-access-l48hl\") pod \"ceilometer-0\" (UID: \"e22555a8-bc28-4b25-bd10-4fc9525276c3\") " pod="openstack/ceilometer-0" Dec 05 20:33:45 crc kubenswrapper[4904]: I1205 20:33:45.365629 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e22555a8-bc28-4b25-bd10-4fc9525276c3-config-data\") pod \"ceilometer-0\" (UID: \"e22555a8-bc28-4b25-bd10-4fc9525276c3\") " pod="openstack/ceilometer-0" Dec 05 20:33:45 crc kubenswrapper[4904]: I1205 20:33:45.365721 
4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e22555a8-bc28-4b25-bd10-4fc9525276c3-log-httpd\") pod \"ceilometer-0\" (UID: \"e22555a8-bc28-4b25-bd10-4fc9525276c3\") " pod="openstack/ceilometer-0" Dec 05 20:33:45 crc kubenswrapper[4904]: I1205 20:33:45.365753 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l48hl\" (UniqueName: \"kubernetes.io/projected/e22555a8-bc28-4b25-bd10-4fc9525276c3-kube-api-access-l48hl\") pod \"ceilometer-0\" (UID: \"e22555a8-bc28-4b25-bd10-4fc9525276c3\") " pod="openstack/ceilometer-0" Dec 05 20:33:45 crc kubenswrapper[4904]: I1205 20:33:45.365826 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e22555a8-bc28-4b25-bd10-4fc9525276c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e22555a8-bc28-4b25-bd10-4fc9525276c3\") " pod="openstack/ceilometer-0" Dec 05 20:33:45 crc kubenswrapper[4904]: I1205 20:33:45.365954 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e22555a8-bc28-4b25-bd10-4fc9525276c3-scripts\") pod \"ceilometer-0\" (UID: \"e22555a8-bc28-4b25-bd10-4fc9525276c3\") " pod="openstack/ceilometer-0" Dec 05 20:33:45 crc kubenswrapper[4904]: I1205 20:33:45.365997 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e22555a8-bc28-4b25-bd10-4fc9525276c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e22555a8-bc28-4b25-bd10-4fc9525276c3\") " pod="openstack/ceilometer-0" Dec 05 20:33:45 crc kubenswrapper[4904]: I1205 20:33:45.366028 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e22555a8-bc28-4b25-bd10-4fc9525276c3-run-httpd\") pod \"ceilometer-0\" (UID: \"e22555a8-bc28-4b25-bd10-4fc9525276c3\") " pod="openstack/ceilometer-0" Dec 05 20:33:45 crc kubenswrapper[4904]: I1205 20:33:45.366865 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e22555a8-bc28-4b25-bd10-4fc9525276c3-run-httpd\") pod \"ceilometer-0\" (UID: \"e22555a8-bc28-4b25-bd10-4fc9525276c3\") " pod="openstack/ceilometer-0" Dec 05 20:33:45 crc kubenswrapper[4904]: I1205 20:33:45.369801 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e22555a8-bc28-4b25-bd10-4fc9525276c3-log-httpd\") pod \"ceilometer-0\" (UID: \"e22555a8-bc28-4b25-bd10-4fc9525276c3\") " pod="openstack/ceilometer-0" Dec 05 20:33:45 crc kubenswrapper[4904]: I1205 20:33:45.382483 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e22555a8-bc28-4b25-bd10-4fc9525276c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e22555a8-bc28-4b25-bd10-4fc9525276c3\") " pod="openstack/ceilometer-0" Dec 05 20:33:45 crc kubenswrapper[4904]: I1205 20:33:45.382644 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e22555a8-bc28-4b25-bd10-4fc9525276c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e22555a8-bc28-4b25-bd10-4fc9525276c3\") " pod="openstack/ceilometer-0" Dec 05 20:33:45 crc kubenswrapper[4904]: I1205 20:33:45.385008 4904 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e22555a8-bc28-4b25-bd10-4fc9525276c3-config-data\") pod \"ceilometer-0\" (UID: \"e22555a8-bc28-4b25-bd10-4fc9525276c3\") " pod="openstack/ceilometer-0" Dec 05 20:33:45 crc kubenswrapper[4904]: I1205 20:33:45.393041 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l48hl\" (UniqueName: \"kubernetes.io/projected/e22555a8-bc28-4b25-bd10-4fc9525276c3-kube-api-access-l48hl\") pod \"ceilometer-0\" (UID: \"e22555a8-bc28-4b25-bd10-4fc9525276c3\") " pod="openstack/ceilometer-0" Dec 05 20:33:45 crc kubenswrapper[4904]: I1205 20:33:45.393621 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e22555a8-bc28-4b25-bd10-4fc9525276c3-scripts\") pod \"ceilometer-0\" (UID: \"e22555a8-bc28-4b25-bd10-4fc9525276c3\") " pod="openstack/ceilometer-0" Dec 05 20:33:45 crc kubenswrapper[4904]: I1205 20:33:45.592463 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 20:33:45 crc kubenswrapper[4904]: I1205 20:33:45.704502 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81a8e33f-1179-4192-9d21-b0f520c41656" path="/var/lib/kubelet/pods/81a8e33f-1179-4192-9d21-b0f520c41656/volumes" Dec 05 20:33:45 crc kubenswrapper[4904]: I1205 20:33:45.978018 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5c784f7484-wz74b" Dec 05 20:33:46 crc kubenswrapper[4904]: I1205 20:33:46.018841 4904 generic.go:334] "Generic (PLEG): container finished" podID="eabc2178-31e6-49e7-a78c-58dd769c751c" containerID="249c27d8712414588823e8bbf71d00a39ef7a2665c4aa1924a6d570149efd9ea" exitCode=0 Dec 05 20:33:46 crc kubenswrapper[4904]: I1205 20:33:46.018901 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c784f7484-wz74b" event={"ID":"eabc2178-31e6-49e7-a78c-58dd769c751c","Type":"ContainerDied","Data":"249c27d8712414588823e8bbf71d00a39ef7a2665c4aa1924a6d570149efd9ea"} Dec 05 20:33:46 crc kubenswrapper[4904]: I1205 20:33:46.018929 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c784f7484-wz74b" event={"ID":"eabc2178-31e6-49e7-a78c-58dd769c751c","Type":"ContainerDied","Data":"89264106edf16cb018b2c36cf1c825aba7bc4cbe6a25a2bdbb581a994d75b093"} Dec 05 20:33:46 crc kubenswrapper[4904]: I1205 20:33:46.018950 4904 scope.go:117] "RemoveContainer" containerID="b5be99277c88811d304bbae5405f89cf3c2044e5d84e95d8345e25557ca499b9" Dec 05 20:33:46 crc kubenswrapper[4904]: I1205 20:33:46.019075 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5c784f7484-wz74b" Dec 05 20:33:46 crc kubenswrapper[4904]: I1205 20:33:46.024366 4904 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 20:33:46 crc kubenswrapper[4904]: I1205 20:33:46.065530 4904 scope.go:117] "RemoveContainer" containerID="249c27d8712414588823e8bbf71d00a39ef7a2665c4aa1924a6d570149efd9ea" Dec 05 20:33:46 crc kubenswrapper[4904]: I1205 20:33:46.088712 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eabc2178-31e6-49e7-a78c-58dd769c751c-config\") pod \"eabc2178-31e6-49e7-a78c-58dd769c751c\" (UID: \"eabc2178-31e6-49e7-a78c-58dd769c751c\") " Dec 05 20:33:46 crc kubenswrapper[4904]: I1205 20:33:46.090027 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/eabc2178-31e6-49e7-a78c-58dd769c751c-httpd-config\") pod \"eabc2178-31e6-49e7-a78c-58dd769c751c\" (UID: \"eabc2178-31e6-49e7-a78c-58dd769c751c\") " Dec 05 20:33:46 crc kubenswrapper[4904]: I1205 20:33:46.090109 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eabc2178-31e6-49e7-a78c-58dd769c751c-ovndb-tls-certs\") pod \"eabc2178-31e6-49e7-a78c-58dd769c751c\" (UID: \"eabc2178-31e6-49e7-a78c-58dd769c751c\") " Dec 05 20:33:46 crc kubenswrapper[4904]: I1205 20:33:46.090145 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eabc2178-31e6-49e7-a78c-58dd769c751c-combined-ca-bundle\") pod \"eabc2178-31e6-49e7-a78c-58dd769c751c\" (UID: \"eabc2178-31e6-49e7-a78c-58dd769c751c\") " Dec 05 20:33:46 crc kubenswrapper[4904]: I1205 20:33:46.090167 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5f8jf\" (UniqueName: \"kubernetes.io/projected/eabc2178-31e6-49e7-a78c-58dd769c751c-kube-api-access-5f8jf\") pod \"eabc2178-31e6-49e7-a78c-58dd769c751c\" (UID: \"eabc2178-31e6-49e7-a78c-58dd769c751c\") " Dec 05 20:33:46 crc kubenswrapper[4904]: I1205 20:33:46.095261 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eabc2178-31e6-49e7-a78c-58dd769c751c-kube-api-access-5f8jf" (OuterVolumeSpecName: "kube-api-access-5f8jf") pod "eabc2178-31e6-49e7-a78c-58dd769c751c" (UID: "eabc2178-31e6-49e7-a78c-58dd769c751c"). InnerVolumeSpecName "kube-api-access-5f8jf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:33:46 crc kubenswrapper[4904]: I1205 20:33:46.106945 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eabc2178-31e6-49e7-a78c-58dd769c751c-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "eabc2178-31e6-49e7-a78c-58dd769c751c" (UID: "eabc2178-31e6-49e7-a78c-58dd769c751c"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:46 crc kubenswrapper[4904]: I1205 20:33:46.107343 4904 scope.go:117] "RemoveContainer" containerID="b5be99277c88811d304bbae5405f89cf3c2044e5d84e95d8345e25557ca499b9" Dec 05 20:33:46 crc kubenswrapper[4904]: E1205 20:33:46.108842 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5be99277c88811d304bbae5405f89cf3c2044e5d84e95d8345e25557ca499b9\": container with ID starting with b5be99277c88811d304bbae5405f89cf3c2044e5d84e95d8345e25557ca499b9 not found: ID does not exist" containerID="b5be99277c88811d304bbae5405f89cf3c2044e5d84e95d8345e25557ca499b9" Dec 05 20:33:46 crc kubenswrapper[4904]: I1205 20:33:46.108886 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5be99277c88811d304bbae5405f89cf3c2044e5d84e95d8345e25557ca499b9"} err="failed to get container status \"b5be99277c88811d304bbae5405f89cf3c2044e5d84e95d8345e25557ca499b9\": rpc error: code = NotFound desc = could not find container \"b5be99277c88811d304bbae5405f89cf3c2044e5d84e95d8345e25557ca499b9\": container with ID starting with b5be99277c88811d304bbae5405f89cf3c2044e5d84e95d8345e25557ca499b9 not found: ID does not exist" Dec 05 20:33:46 crc kubenswrapper[4904]: I1205 20:33:46.108913 4904 scope.go:117] "RemoveContainer" containerID="249c27d8712414588823e8bbf71d00a39ef7a2665c4aa1924a6d570149efd9ea" Dec 05 20:33:46 crc kubenswrapper[4904]: E1205 20:33:46.109258 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"249c27d8712414588823e8bbf71d00a39ef7a2665c4aa1924a6d570149efd9ea\": container with ID starting with 249c27d8712414588823e8bbf71d00a39ef7a2665c4aa1924a6d570149efd9ea not found: ID does not exist" containerID="249c27d8712414588823e8bbf71d00a39ef7a2665c4aa1924a6d570149efd9ea" Dec 05 20:33:46 crc kubenswrapper[4904]: I1205 20:33:46.109272 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"249c27d8712414588823e8bbf71d00a39ef7a2665c4aa1924a6d570149efd9ea"} err="failed to get container status \"249c27d8712414588823e8bbf71d00a39ef7a2665c4aa1924a6d570149efd9ea\": rpc error: code = NotFound desc = could not find container \"249c27d8712414588823e8bbf71d00a39ef7a2665c4aa1924a6d570149efd9ea\": container with ID starting with 249c27d8712414588823e8bbf71d00a39ef7a2665c4aa1924a6d570149efd9ea not found: ID does not exist" Dec 05 20:33:46 crc kubenswrapper[4904]: I1205 20:33:46.157578 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:33:46 crc kubenswrapper[4904]: I1205 20:33:46.192424 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5f8jf\" (UniqueName: \"kubernetes.io/projected/eabc2178-31e6-49e7-a78c-58dd769c751c-kube-api-access-5f8jf\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:46 crc kubenswrapper[4904]: I1205 20:33:46.192460 4904 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/eabc2178-31e6-49e7-a78c-58dd769c751c-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:46 crc kubenswrapper[4904]: I1205 20:33:46.206297 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eabc2178-31e6-49e7-a78c-58dd769c751c-config" (OuterVolumeSpecName: "config") pod "eabc2178-31e6-49e7-a78c-58dd769c751c" (UID: "eabc2178-31e6-49e7-a78c-58dd769c751c"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:46 crc kubenswrapper[4904]: I1205 20:33:46.212204 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eabc2178-31e6-49e7-a78c-58dd769c751c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eabc2178-31e6-49e7-a78c-58dd769c751c" (UID: "eabc2178-31e6-49e7-a78c-58dd769c751c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:46 crc kubenswrapper[4904]: I1205 20:33:46.224268 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eabc2178-31e6-49e7-a78c-58dd769c751c-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "eabc2178-31e6-49e7-a78c-58dd769c751c" (UID: "eabc2178-31e6-49e7-a78c-58dd769c751c"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:46 crc kubenswrapper[4904]: I1205 20:33:46.295324 4904 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eabc2178-31e6-49e7-a78c-58dd769c751c-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:46 crc kubenswrapper[4904]: I1205 20:33:46.295355 4904 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eabc2178-31e6-49e7-a78c-58dd769c751c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:46 crc kubenswrapper[4904]: I1205 20:33:46.295366 4904 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/eabc2178-31e6-49e7-a78c-58dd769c751c-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:46 crc kubenswrapper[4904]: I1205 20:33:46.376763 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5c784f7484-wz74b"] Dec 05 20:33:46 crc kubenswrapper[4904]: I1205 20:33:46.387925 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5c784f7484-wz74b"] Dec 05 20:33:47 crc kubenswrapper[4904]: I1205 20:33:47.048576 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e22555a8-bc28-4b25-bd10-4fc9525276c3","Type":"ContainerStarted","Data":"edc7efab9b36738c5e164ee9cd56f4caf4a6ef75058894228d9d958b84a26d14"} Dec 05 20:33:47 crc kubenswrapper[4904]: I1205 20:33:47.352895 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Dec 05 20:33:47 crc kubenswrapper[4904]: I1205 20:33:47.692038 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eabc2178-31e6-49e7-a78c-58dd769c751c" path="/var/lib/kubelet/pods/eabc2178-31e6-49e7-a78c-58dd769c751c/volumes" Dec 05 20:33:48 crc kubenswrapper[4904]: I1205 20:33:48.013383 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-597c5664fd-tbwsr" Dec 05 20:33:48 crc kubenswrapper[4904]: I1205 20:33:48.087002 4904 generic.go:334] "Generic (PLEG): container finished" podID="5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b" containerID="ba7b4df22e89a35b4c6557fd502623f1c46ac646b4218b8e0d233117e64ff38e" exitCode=0 Dec 05 20:33:48 crc kubenswrapper[4904]: I1205 20:33:48.087067 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-gtfpr" event={"ID":"5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b","Type":"ContainerDied","Data":"ba7b4df22e89a35b4c6557fd502623f1c46ac646b4218b8e0d233117e64ff38e"} Dec 05 20:33:49 
crc kubenswrapper[4904]: I1205 20:33:49.019660 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-658fd98679-9cr7m" Dec 05 20:33:49 crc kubenswrapper[4904]: I1205 20:33:49.095471 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fdc8d759-tm87z"] Dec 05 20:33:49 crc kubenswrapper[4904]: I1205 20:33:49.095711 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55fdc8d759-tm87z" podUID="ef666e4f-797c-4868-b3b4-2b834851d840" containerName="dnsmasq-dns" containerID="cri-o://d8298e8d4f535eb1c80fee20a54265920e6dfe2ab1bb870af5013230452f5244" gracePeriod=10 Dec 05 20:33:49 crc kubenswrapper[4904]: I1205 20:33:49.114195 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e22555a8-bc28-4b25-bd10-4fc9525276c3","Type":"ContainerStarted","Data":"d7b855b72762834437d85a3dc80bc80e204e313e62a09e8a31139301653917ca"} Dec 05 20:33:49 crc kubenswrapper[4904]: I1205 20:33:49.114444 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e22555a8-bc28-4b25-bd10-4fc9525276c3","Type":"ContainerStarted","Data":"2cc78c62fb8e1dcc2158b1556b61f95acf4de6f3de4b5a4d4c22cffeaba7f810"} Dec 05 20:33:49 crc kubenswrapper[4904]: I1205 20:33:49.294894 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Dec 05 20:33:49 crc kubenswrapper[4904]: I1205 20:33:49.309706 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Dec 05 20:33:49 crc kubenswrapper[4904]: I1205 20:33:49.708464 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 05 20:33:49 crc kubenswrapper[4904]: I1205 20:33:49.788476 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-gtfpr" Dec 05 20:33:49 crc kubenswrapper[4904]: I1205 20:33:49.794865 4904 util.go:48] "No ready sandbox for pod can be found. 
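Above, once the replacement dnsmasq pod reports ready, the superseded dnsmasq-dns-55fdc8d759-tm87z container is torn down with gracePeriod=10 (kuberuntime_container.go:808): the container is asked to stop, and is force-killed only if the deadline passes. A condensed sketch of that sequence; the stop and force-kill hooks are placeholders for the real CRI calls:

```go
// Shape of the "Killing container with a grace period" step: request a stop,
// wait up to the grace period, then escalate.
package main

import (
	"fmt"
	"time"
)

func killWithGracePeriod(stop func() <-chan struct{}, forceKill func(), grace time.Duration) {
	fmt.Printf("Killing container with a grace period gracePeriod=%v\n", grace)
	select {
	case <-stop(): // container exited on its own after the stop request
		fmt.Println("container exited within grace period")
	case <-time.After(grace):
		forceKill() // deadline passed; escalate
		fmt.Println("grace period expired; container force-killed")
	}
}

func main() {
	// Simulate a dnsmasq-dns container that shuts down in ~1s, well inside
	// the 10s grace period seen in the log.
	stop := func() <-chan struct{} {
		done := make(chan struct{})
		go func() { time.Sleep(time.Second); close(done) }()
		return done
	}
	killWithGracePeriod(stop, func() {}, 10*time.Second)
}
```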
Need to start a new one" pod="openstack/dnsmasq-dns-55fdc8d759-tm87z" Dec 05 20:33:49 crc kubenswrapper[4904]: I1205 20:33:49.889189 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sbvz\" (UniqueName: \"kubernetes.io/projected/ef666e4f-797c-4868-b3b4-2b834851d840-kube-api-access-9sbvz\") pod \"ef666e4f-797c-4868-b3b4-2b834851d840\" (UID: \"ef666e4f-797c-4868-b3b4-2b834851d840\") " Dec 05 20:33:49 crc kubenswrapper[4904]: I1205 20:33:49.889241 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef666e4f-797c-4868-b3b4-2b834851d840-config\") pod \"ef666e4f-797c-4868-b3b4-2b834851d840\" (UID: \"ef666e4f-797c-4868-b3b4-2b834851d840\") " Dec 05 20:33:49 crc kubenswrapper[4904]: I1205 20:33:49.889261 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdvjm\" (UniqueName: \"kubernetes.io/projected/5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b-kube-api-access-hdvjm\") pod \"5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b\" (UID: \"5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b\") " Dec 05 20:33:49 crc kubenswrapper[4904]: I1205 20:33:49.889298 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ef666e4f-797c-4868-b3b4-2b834851d840-dns-swift-storage-0\") pod \"ef666e4f-797c-4868-b3b4-2b834851d840\" (UID: \"ef666e4f-797c-4868-b3b4-2b834851d840\") " Dec 05 20:33:49 crc kubenswrapper[4904]: I1205 20:33:49.889326 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef666e4f-797c-4868-b3b4-2b834851d840-ovsdbserver-sb\") pod \"ef666e4f-797c-4868-b3b4-2b834851d840\" (UID: \"ef666e4f-797c-4868-b3b4-2b834851d840\") " Dec 05 20:33:49 crc kubenswrapper[4904]: I1205 20:33:49.889386 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b-db-sync-config-data\") pod \"5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b\" (UID: \"5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b\") " Dec 05 20:33:49 crc kubenswrapper[4904]: I1205 20:33:49.889406 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef666e4f-797c-4868-b3b4-2b834851d840-ovsdbserver-nb\") pod \"ef666e4f-797c-4868-b3b4-2b834851d840\" (UID: \"ef666e4f-797c-4868-b3b4-2b834851d840\") " Dec 05 20:33:49 crc kubenswrapper[4904]: I1205 20:33:49.889427 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef666e4f-797c-4868-b3b4-2b834851d840-dns-svc\") pod \"ef666e4f-797c-4868-b3b4-2b834851d840\" (UID: \"ef666e4f-797c-4868-b3b4-2b834851d840\") " Dec 05 20:33:49 crc kubenswrapper[4904]: I1205 20:33:49.889446 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b-combined-ca-bundle\") pod \"5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b\" (UID: \"5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b\") " Dec 05 20:33:49 crc kubenswrapper[4904]: I1205 20:33:49.889474 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b-config-data\") pod 
\"5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b\" (UID: \"5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b\") " Dec 05 20:33:49 crc kubenswrapper[4904]: I1205 20:33:49.914112 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef666e4f-797c-4868-b3b4-2b834851d840-kube-api-access-9sbvz" (OuterVolumeSpecName: "kube-api-access-9sbvz") pod "ef666e4f-797c-4868-b3b4-2b834851d840" (UID: "ef666e4f-797c-4868-b3b4-2b834851d840"). InnerVolumeSpecName "kube-api-access-9sbvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:33:49 crc kubenswrapper[4904]: I1205 20:33:49.914208 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b-kube-api-access-hdvjm" (OuterVolumeSpecName: "kube-api-access-hdvjm") pod "5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b" (UID: "5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b"). InnerVolumeSpecName "kube-api-access-hdvjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:33:49 crc kubenswrapper[4904]: I1205 20:33:49.915853 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 05 20:33:49 crc kubenswrapper[4904]: I1205 20:33:49.920229 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b" (UID: "5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:49 crc kubenswrapper[4904]: I1205 20:33:49.949787 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b" (UID: "5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:49 crc kubenswrapper[4904]: I1205 20:33:49.964239 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef666e4f-797c-4868-b3b4-2b834851d840-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ef666e4f-797c-4868-b3b4-2b834851d840" (UID: "ef666e4f-797c-4868-b3b4-2b834851d840"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:33:49 crc kubenswrapper[4904]: I1205 20:33:49.977654 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef666e4f-797c-4868-b3b4-2b834851d840-config" (OuterVolumeSpecName: "config") pod "ef666e4f-797c-4868-b3b4-2b834851d840" (UID: "ef666e4f-797c-4868-b3b4-2b834851d840"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:33:49 crc kubenswrapper[4904]: I1205 20:33:49.979720 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef666e4f-797c-4868-b3b4-2b834851d840-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ef666e4f-797c-4868-b3b4-2b834851d840" (UID: "ef666e4f-797c-4868-b3b4-2b834851d840"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:33:49 crc kubenswrapper[4904]: I1205 20:33:49.979792 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef666e4f-797c-4868-b3b4-2b834851d840-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ef666e4f-797c-4868-b3b4-2b834851d840" (UID: "ef666e4f-797c-4868-b3b4-2b834851d840"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:33:49 crc kubenswrapper[4904]: I1205 20:33:49.993114 4904 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:49 crc kubenswrapper[4904]: I1205 20:33:49.993153 4904 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef666e4f-797c-4868-b3b4-2b834851d840-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:49 crc kubenswrapper[4904]: I1205 20:33:49.993167 4904 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:49 crc kubenswrapper[4904]: I1205 20:33:49.993180 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sbvz\" (UniqueName: \"kubernetes.io/projected/ef666e4f-797c-4868-b3b4-2b834851d840-kube-api-access-9sbvz\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:49 crc kubenswrapper[4904]: I1205 20:33:49.993197 4904 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef666e4f-797c-4868-b3b4-2b834851d840-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:49 crc kubenswrapper[4904]: I1205 20:33:49.993210 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdvjm\" (UniqueName: \"kubernetes.io/projected/5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b-kube-api-access-hdvjm\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:49 crc kubenswrapper[4904]: I1205 20:33:49.993222 4904 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ef666e4f-797c-4868-b3b4-2b834851d840-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:49 crc kubenswrapper[4904]: I1205 20:33:49.993233 4904 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef666e4f-797c-4868-b3b4-2b834851d840-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:49 crc kubenswrapper[4904]: I1205 20:33:49.998959 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef666e4f-797c-4868-b3b4-2b834851d840-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ef666e4f-797c-4868-b3b4-2b834851d840" (UID: "ef666e4f-797c-4868-b3b4-2b834851d840"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.002207 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b-config-data" (OuterVolumeSpecName: "config-data") pod "5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b" (UID: "5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.094429 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.094464 4904 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef666e4f-797c-4868-b3b4-2b834851d840-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.124191 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-gtfpr" Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.124198 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-gtfpr" event={"ID":"5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b","Type":"ContainerDied","Data":"d9300f073ef334c06995edf5d98d302ca5d641357ac18a7c9a0fcf737d59bbb1"} Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.124277 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9300f073ef334c06995edf5d98d302ca5d641357ac18a7c9a0fcf737d59bbb1" Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.126708 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e22555a8-bc28-4b25-bd10-4fc9525276c3","Type":"ContainerStarted","Data":"3e2fe9a254041522da6aeaf973be20e4cb9764e395d06333b40697f5200a4dc4"} Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.128708 4904 generic.go:334] "Generic (PLEG): container finished" podID="ef666e4f-797c-4868-b3b4-2b834851d840" containerID="d8298e8d4f535eb1c80fee20a54265920e6dfe2ab1bb870af5013230452f5244" exitCode=0 Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.128755 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55fdc8d759-tm87z" Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.128757 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fdc8d759-tm87z" event={"ID":"ef666e4f-797c-4868-b3b4-2b834851d840","Type":"ContainerDied","Data":"d8298e8d4f535eb1c80fee20a54265920e6dfe2ab1bb870af5013230452f5244"} Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.128808 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fdc8d759-tm87z" event={"ID":"ef666e4f-797c-4868-b3b4-2b834851d840","Type":"ContainerDied","Data":"cdf349082d92a762d62ec1d3c8f023e4c8c7bd7acbc26c9b2c38e45b88d08111"} Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.128831 4904 scope.go:117] "RemoveContainer" containerID="d8298e8d4f535eb1c80fee20a54265920e6dfe2ab1bb870af5013230452f5244" Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.139643 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.157597 4904 scope.go:117] "RemoveContainer" containerID="2705918eb5aa450c2a5dbe27b4aa1860d764800bc30e280fbaa8ac9e0ab701d5" Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.210014 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fdc8d759-tm87z"] Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.211305 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55fdc8d759-tm87z"] Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.224766 4904 scope.go:117] "RemoveContainer" containerID="d8298e8d4f535eb1c80fee20a54265920e6dfe2ab1bb870af5013230452f5244" Dec 05 20:33:50 crc kubenswrapper[4904]: E1205 20:33:50.225302 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8298e8d4f535eb1c80fee20a54265920e6dfe2ab1bb870af5013230452f5244\": container with ID starting with d8298e8d4f535eb1c80fee20a54265920e6dfe2ab1bb870af5013230452f5244 not found: ID does not exist" containerID="d8298e8d4f535eb1c80fee20a54265920e6dfe2ab1bb870af5013230452f5244" Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.225355 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8298e8d4f535eb1c80fee20a54265920e6dfe2ab1bb870af5013230452f5244"} err="failed to get container status \"d8298e8d4f535eb1c80fee20a54265920e6dfe2ab1bb870af5013230452f5244\": rpc error: code = NotFound desc = could not find container \"d8298e8d4f535eb1c80fee20a54265920e6dfe2ab1bb870af5013230452f5244\": container with ID starting with d8298e8d4f535eb1c80fee20a54265920e6dfe2ab1bb870af5013230452f5244 not found: ID does not exist" Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.225383 4904 scope.go:117] "RemoveContainer" containerID="2705918eb5aa450c2a5dbe27b4aa1860d764800bc30e280fbaa8ac9e0ab701d5" Dec 05 20:33:50 crc kubenswrapper[4904]: E1205 20:33:50.225729 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2705918eb5aa450c2a5dbe27b4aa1860d764800bc30e280fbaa8ac9e0ab701d5\": container with ID starting with 2705918eb5aa450c2a5dbe27b4aa1860d764800bc30e280fbaa8ac9e0ab701d5 not found: ID does not exist" containerID="2705918eb5aa450c2a5dbe27b4aa1860d764800bc30e280fbaa8ac9e0ab701d5" Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.225797 4904 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"2705918eb5aa450c2a5dbe27b4aa1860d764800bc30e280fbaa8ac9e0ab701d5"} err="failed to get container status \"2705918eb5aa450c2a5dbe27b4aa1860d764800bc30e280fbaa8ac9e0ab701d5\": rpc error: code = NotFound desc = could not find container \"2705918eb5aa450c2a5dbe27b4aa1860d764800bc30e280fbaa8ac9e0ab701d5\": container with ID starting with 2705918eb5aa450c2a5dbe27b4aa1860d764800bc30e280fbaa8ac9e0ab701d5 not found: ID does not exist" Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.227382 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-597c5664fd-tbwsr" Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.609036 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5cdd789cc5-m8v2q"] Dec 05 20:33:50 crc kubenswrapper[4904]: E1205 20:33:50.611116 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eabc2178-31e6-49e7-a78c-58dd769c751c" containerName="neutron-api" Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.611138 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="eabc2178-31e6-49e7-a78c-58dd769c751c" containerName="neutron-api" Dec 05 20:33:50 crc kubenswrapper[4904]: E1205 20:33:50.611155 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef666e4f-797c-4868-b3b4-2b834851d840" containerName="init" Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.611163 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef666e4f-797c-4868-b3b4-2b834851d840" containerName="init" Dec 05 20:33:50 crc kubenswrapper[4904]: E1205 20:33:50.611177 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef666e4f-797c-4868-b3b4-2b834851d840" containerName="dnsmasq-dns" Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.611185 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef666e4f-797c-4868-b3b4-2b834851d840" containerName="dnsmasq-dns" Dec 05 20:33:50 crc kubenswrapper[4904]: E1205 20:33:50.611221 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eabc2178-31e6-49e7-a78c-58dd769c751c" containerName="neutron-httpd" Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.611228 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="eabc2178-31e6-49e7-a78c-58dd769c751c" containerName="neutron-httpd" Dec 05 20:33:50 crc kubenswrapper[4904]: E1205 20:33:50.611268 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b" containerName="glance-db-sync" Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.611274 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b" containerName="glance-db-sync" Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.611723 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b" containerName="glance-db-sync" Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.611768 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef666e4f-797c-4868-b3b4-2b834851d840" containerName="dnsmasq-dns" Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.611803 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="eabc2178-31e6-49e7-a78c-58dd769c751c" containerName="neutron-api" Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.611834 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="eabc2178-31e6-49e7-a78c-58dd769c751c" 
containerName="neutron-httpd" Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.613609 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cdd789cc5-m8v2q" Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.688322 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cdd789cc5-m8v2q"] Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.724479 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/baa0cf87-8601-4807-8453-0809354b472e-dns-svc\") pod \"dnsmasq-dns-5cdd789cc5-m8v2q\" (UID: \"baa0cf87-8601-4807-8453-0809354b472e\") " pod="openstack/dnsmasq-dns-5cdd789cc5-m8v2q" Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.724545 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/baa0cf87-8601-4807-8453-0809354b472e-ovsdbserver-nb\") pod \"dnsmasq-dns-5cdd789cc5-m8v2q\" (UID: \"baa0cf87-8601-4807-8453-0809354b472e\") " pod="openstack/dnsmasq-dns-5cdd789cc5-m8v2q" Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.724613 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/baa0cf87-8601-4807-8453-0809354b472e-ovsdbserver-sb\") pod \"dnsmasq-dns-5cdd789cc5-m8v2q\" (UID: \"baa0cf87-8601-4807-8453-0809354b472e\") " pod="openstack/dnsmasq-dns-5cdd789cc5-m8v2q" Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.724638 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/baa0cf87-8601-4807-8453-0809354b472e-config\") pod \"dnsmasq-dns-5cdd789cc5-m8v2q\" (UID: \"baa0cf87-8601-4807-8453-0809354b472e\") " pod="openstack/dnsmasq-dns-5cdd789cc5-m8v2q" Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.724661 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dns8\" (UniqueName: \"kubernetes.io/projected/baa0cf87-8601-4807-8453-0809354b472e-kube-api-access-5dns8\") pod \"dnsmasq-dns-5cdd789cc5-m8v2q\" (UID: \"baa0cf87-8601-4807-8453-0809354b472e\") " pod="openstack/dnsmasq-dns-5cdd789cc5-m8v2q" Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.724688 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/baa0cf87-8601-4807-8453-0809354b472e-dns-swift-storage-0\") pod \"dnsmasq-dns-5cdd789cc5-m8v2q\" (UID: \"baa0cf87-8601-4807-8453-0809354b472e\") " pod="openstack/dnsmasq-dns-5cdd789cc5-m8v2q" Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.728186 4904 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6867ddbddb-4lg6w" podUID="499a09a1-c3aa-4fa2-95a8-0d2896a1d978" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.159:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.159:8443: connect: connection refused" Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.826501 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/baa0cf87-8601-4807-8453-0809354b472e-ovsdbserver-sb\") pod \"dnsmasq-dns-5cdd789cc5-m8v2q\" (UID: \"baa0cf87-8601-4807-8453-0809354b472e\") 
" pod="openstack/dnsmasq-dns-5cdd789cc5-m8v2q" Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.826556 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/baa0cf87-8601-4807-8453-0809354b472e-config\") pod \"dnsmasq-dns-5cdd789cc5-m8v2q\" (UID: \"baa0cf87-8601-4807-8453-0809354b472e\") " pod="openstack/dnsmasq-dns-5cdd789cc5-m8v2q" Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.826581 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dns8\" (UniqueName: \"kubernetes.io/projected/baa0cf87-8601-4807-8453-0809354b472e-kube-api-access-5dns8\") pod \"dnsmasq-dns-5cdd789cc5-m8v2q\" (UID: \"baa0cf87-8601-4807-8453-0809354b472e\") " pod="openstack/dnsmasq-dns-5cdd789cc5-m8v2q" Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.826609 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/baa0cf87-8601-4807-8453-0809354b472e-dns-swift-storage-0\") pod \"dnsmasq-dns-5cdd789cc5-m8v2q\" (UID: \"baa0cf87-8601-4807-8453-0809354b472e\") " pod="openstack/dnsmasq-dns-5cdd789cc5-m8v2q" Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.826675 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/baa0cf87-8601-4807-8453-0809354b472e-dns-svc\") pod \"dnsmasq-dns-5cdd789cc5-m8v2q\" (UID: \"baa0cf87-8601-4807-8453-0809354b472e\") " pod="openstack/dnsmasq-dns-5cdd789cc5-m8v2q" Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.826705 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/baa0cf87-8601-4807-8453-0809354b472e-ovsdbserver-nb\") pod \"dnsmasq-dns-5cdd789cc5-m8v2q\" (UID: \"baa0cf87-8601-4807-8453-0809354b472e\") " pod="openstack/dnsmasq-dns-5cdd789cc5-m8v2q" Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.828765 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/baa0cf87-8601-4807-8453-0809354b472e-dns-swift-storage-0\") pod \"dnsmasq-dns-5cdd789cc5-m8v2q\" (UID: \"baa0cf87-8601-4807-8453-0809354b472e\") " pod="openstack/dnsmasq-dns-5cdd789cc5-m8v2q" Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.828895 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/baa0cf87-8601-4807-8453-0809354b472e-config\") pod \"dnsmasq-dns-5cdd789cc5-m8v2q\" (UID: \"baa0cf87-8601-4807-8453-0809354b472e\") " pod="openstack/dnsmasq-dns-5cdd789cc5-m8v2q" Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.829041 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/baa0cf87-8601-4807-8453-0809354b472e-ovsdbserver-sb\") pod \"dnsmasq-dns-5cdd789cc5-m8v2q\" (UID: \"baa0cf87-8601-4807-8453-0809354b472e\") " pod="openstack/dnsmasq-dns-5cdd789cc5-m8v2q" Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.829695 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/baa0cf87-8601-4807-8453-0809354b472e-dns-svc\") pod \"dnsmasq-dns-5cdd789cc5-m8v2q\" (UID: \"baa0cf87-8601-4807-8453-0809354b472e\") " pod="openstack/dnsmasq-dns-5cdd789cc5-m8v2q" Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.829707 
Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.829707 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/baa0cf87-8601-4807-8453-0809354b472e-ovsdbserver-nb\") pod \"dnsmasq-dns-5cdd789cc5-m8v2q\" (UID: \"baa0cf87-8601-4807-8453-0809354b472e\") " pod="openstack/dnsmasq-dns-5cdd789cc5-m8v2q"
Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.852605 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dns8\" (UniqueName: \"kubernetes.io/projected/baa0cf87-8601-4807-8453-0809354b472e-kube-api-access-5dns8\") pod \"dnsmasq-dns-5cdd789cc5-m8v2q\" (UID: \"baa0cf87-8601-4807-8453-0809354b472e\") " pod="openstack/dnsmasq-dns-5cdd789cc5-m8v2q"
Dec 05 20:33:50 crc kubenswrapper[4904]: I1205 20:33:50.954507 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cdd789cc5-m8v2q"
Dec 05 20:33:51 crc kubenswrapper[4904]: I1205 20:33:51.486449 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 05 20:33:51 crc kubenswrapper[4904]: I1205 20:33:51.488388 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 05 20:33:51 crc kubenswrapper[4904]: I1205 20:33:51.494588 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-rwqht"
Dec 05 20:33:51 crc kubenswrapper[4904]: I1205 20:33:51.494861 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Dec 05 20:33:51 crc kubenswrapper[4904]: I1205 20:33:51.495082 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Dec 05 20:33:51 crc kubenswrapper[4904]: I1205 20:33:51.518673 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 05 20:33:51 crc kubenswrapper[4904]: I1205 20:33:51.589097 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cdd789cc5-m8v2q"]
Dec 05 20:33:51 crc kubenswrapper[4904]: I1205 20:33:51.645091 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"861d7b17-4130-4c11-9652-63a366cddfe2\") " pod="openstack/glance-default-external-api-0"
Dec 05 20:33:51 crc kubenswrapper[4904]: I1205 20:33:51.645130 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh4qm\" (UniqueName: \"kubernetes.io/projected/861d7b17-4130-4c11-9652-63a366cddfe2-kube-api-access-mh4qm\") pod \"glance-default-external-api-0\" (UID: \"861d7b17-4130-4c11-9652-63a366cddfe2\") " pod="openstack/glance-default-external-api-0"
Dec 05 20:33:51 crc kubenswrapper[4904]: I1205 20:33:51.645159 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/861d7b17-4130-4c11-9652-63a366cddfe2-scripts\") pod \"glance-default-external-api-0\" (UID: \"861d7b17-4130-4c11-9652-63a366cddfe2\") " pod="openstack/glance-default-external-api-0"
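The attach-and-mount sequence above is the kubelet volume reconciler's normal three-step flow for a new pod: VerifyControllerAttachedVolume for every volume in the spec, then MountVolume started, then MountVolume.SetUp succeeded once the configmap/secret/projected content is materialized under the pod's volumes directory. A minimal sketch of that ordering, with illustrative types rather than kubelet's real ones:

    package main

    import "fmt"

    // volume mirrors what the log lines expose: the plugin path inside
    // the UniqueName plus the owning pod UID. Illustrative only.
    type volume struct {
        UniqueName string
        PodUID     string
    }

    // reconcile walks each desired volume through the same three stages
    // the log shows: verify attach, start the mount, then SetUp.
    func reconcile(vols []volume) {
        for _, v := range vols {
            fmt.Printf("VerifyControllerAttachedVolume started for volume %q\n", v.UniqueName)
        }
        for _, v := range vols {
            fmt.Printf("MountVolume started for volume %q\n", v.UniqueName)
            // SetUp materializes the content under
            // /var/lib/kubelet/pods/<PodUID>/volumes/...
            fmt.Printf("MountVolume.SetUp succeeded for volume %q\n", v.UniqueName)
        }
    }

    func main() {
        reconcile([]volume{
            {"kubernetes.io/configmap/baa0cf87-8601-4807-8453-0809354b472e-dns-svc", "baa0cf87-8601-4807-8453-0809354b472e"},
            {"kubernetes.io/projected/baa0cf87-8601-4807-8453-0809354b472e-kube-api-access-5dns8", "baa0cf87-8601-4807-8453-0809354b472e"},
        })
    }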
\"kubernetes.io/secret/861d7b17-4130-4c11-9652-63a366cddfe2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"861d7b17-4130-4c11-9652-63a366cddfe2\") " pod="openstack/glance-default-external-api-0" Dec 05 20:33:51 crc kubenswrapper[4904]: I1205 20:33:51.645238 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/861d7b17-4130-4c11-9652-63a366cddfe2-logs\") pod \"glance-default-external-api-0\" (UID: \"861d7b17-4130-4c11-9652-63a366cddfe2\") " pod="openstack/glance-default-external-api-0" Dec 05 20:33:51 crc kubenswrapper[4904]: I1205 20:33:51.645253 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/861d7b17-4130-4c11-9652-63a366cddfe2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"861d7b17-4130-4c11-9652-63a366cddfe2\") " pod="openstack/glance-default-external-api-0" Dec 05 20:33:51 crc kubenswrapper[4904]: I1205 20:33:51.645324 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/861d7b17-4130-4c11-9652-63a366cddfe2-config-data\") pod \"glance-default-external-api-0\" (UID: \"861d7b17-4130-4c11-9652-63a366cddfe2\") " pod="openstack/glance-default-external-api-0" Dec 05 20:33:51 crc kubenswrapper[4904]: I1205 20:33:51.719523 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef666e4f-797c-4868-b3b4-2b834851d840" path="/var/lib/kubelet/pods/ef666e4f-797c-4868-b3b4-2b834851d840/volumes" Dec 05 20:33:51 crc kubenswrapper[4904]: I1205 20:33:51.749921 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"861d7b17-4130-4c11-9652-63a366cddfe2\") " pod="openstack/glance-default-external-api-0" Dec 05 20:33:51 crc kubenswrapper[4904]: I1205 20:33:51.749988 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mh4qm\" (UniqueName: \"kubernetes.io/projected/861d7b17-4130-4c11-9652-63a366cddfe2-kube-api-access-mh4qm\") pod \"glance-default-external-api-0\" (UID: \"861d7b17-4130-4c11-9652-63a366cddfe2\") " pod="openstack/glance-default-external-api-0" Dec 05 20:33:51 crc kubenswrapper[4904]: I1205 20:33:51.750024 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/861d7b17-4130-4c11-9652-63a366cddfe2-scripts\") pod \"glance-default-external-api-0\" (UID: \"861d7b17-4130-4c11-9652-63a366cddfe2\") " pod="openstack/glance-default-external-api-0" Dec 05 20:33:51 crc kubenswrapper[4904]: I1205 20:33:51.750129 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/861d7b17-4130-4c11-9652-63a366cddfe2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"861d7b17-4130-4c11-9652-63a366cddfe2\") " pod="openstack/glance-default-external-api-0" Dec 05 20:33:51 crc kubenswrapper[4904]: I1205 20:33:51.750287 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/861d7b17-4130-4c11-9652-63a366cddfe2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"861d7b17-4130-4c11-9652-63a366cddfe2\") " 
pod="openstack/glance-default-external-api-0" Dec 05 20:33:51 crc kubenswrapper[4904]: I1205 20:33:51.750497 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/861d7b17-4130-4c11-9652-63a366cddfe2-logs\") pod \"glance-default-external-api-0\" (UID: \"861d7b17-4130-4c11-9652-63a366cddfe2\") " pod="openstack/glance-default-external-api-0" Dec 05 20:33:51 crc kubenswrapper[4904]: I1205 20:33:51.750638 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/861d7b17-4130-4c11-9652-63a366cddfe2-config-data\") pod \"glance-default-external-api-0\" (UID: \"861d7b17-4130-4c11-9652-63a366cddfe2\") " pod="openstack/glance-default-external-api-0" Dec 05 20:33:51 crc kubenswrapper[4904]: I1205 20:33:51.752258 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/861d7b17-4130-4c11-9652-63a366cddfe2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"861d7b17-4130-4c11-9652-63a366cddfe2\") " pod="openstack/glance-default-external-api-0" Dec 05 20:33:51 crc kubenswrapper[4904]: I1205 20:33:51.752324 4904 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"861d7b17-4130-4c11-9652-63a366cddfe2\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Dec 05 20:33:51 crc kubenswrapper[4904]: I1205 20:33:51.753623 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/861d7b17-4130-4c11-9652-63a366cddfe2-logs\") pod \"glance-default-external-api-0\" (UID: \"861d7b17-4130-4c11-9652-63a366cddfe2\") " pod="openstack/glance-default-external-api-0" Dec 05 20:33:51 crc kubenswrapper[4904]: I1205 20:33:51.760700 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/861d7b17-4130-4c11-9652-63a366cddfe2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"861d7b17-4130-4c11-9652-63a366cddfe2\") " pod="openstack/glance-default-external-api-0" Dec 05 20:33:51 crc kubenswrapper[4904]: I1205 20:33:51.770698 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/861d7b17-4130-4c11-9652-63a366cddfe2-scripts\") pod \"glance-default-external-api-0\" (UID: \"861d7b17-4130-4c11-9652-63a366cddfe2\") " pod="openstack/glance-default-external-api-0" Dec 05 20:33:51 crc kubenswrapper[4904]: I1205 20:33:51.771199 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/861d7b17-4130-4c11-9652-63a366cddfe2-config-data\") pod \"glance-default-external-api-0\" (UID: \"861d7b17-4130-4c11-9652-63a366cddfe2\") " pod="openstack/glance-default-external-api-0" Dec 05 20:33:51 crc kubenswrapper[4904]: I1205 20:33:51.772875 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh4qm\" (UniqueName: \"kubernetes.io/projected/861d7b17-4130-4c11-9652-63a366cddfe2-kube-api-access-mh4qm\") pod \"glance-default-external-api-0\" (UID: \"861d7b17-4130-4c11-9652-63a366cddfe2\") " pod="openstack/glance-default-external-api-0" Dec 05 20:33:51 crc kubenswrapper[4904]: I1205 20:33:51.808941 4904 kubelet.go:2542] 
"SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0" Dec 05 20:33:51 crc kubenswrapper[4904]: I1205 20:33:51.809702 4904 scope.go:117] "RemoveContainer" containerID="a00259f9a634df7782c63e78e4add8b555e20a36adcc54a889e7bf30dc526538" Dec 05 20:33:51 crc kubenswrapper[4904]: I1205 20:33:51.810131 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Dec 05 20:33:51 crc kubenswrapper[4904]: I1205 20:33:51.825346 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"861d7b17-4130-4c11-9652-63a366cddfe2\") " pod="openstack/glance-default-external-api-0" Dec 05 20:33:51 crc kubenswrapper[4904]: I1205 20:33:51.877874 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 20:33:51 crc kubenswrapper[4904]: I1205 20:33:51.879686 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 20:33:51 crc kubenswrapper[4904]: I1205 20:33:51.893331 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 05 20:33:51 crc kubenswrapper[4904]: I1205 20:33:51.934121 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 20:33:52 crc kubenswrapper[4904]: I1205 20:33:52.057126 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e63d9d54-2803-4816-a026-0cc8e063070a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e63d9d54-2803-4816-a026-0cc8e063070a\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:33:52 crc kubenswrapper[4904]: I1205 20:33:52.057174 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e63d9d54-2803-4816-a026-0cc8e063070a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e63d9d54-2803-4816-a026-0cc8e063070a\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:33:52 crc kubenswrapper[4904]: I1205 20:33:52.057237 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vbnp\" (UniqueName: \"kubernetes.io/projected/e63d9d54-2803-4816-a026-0cc8e063070a-kube-api-access-9vbnp\") pod \"glance-default-internal-api-0\" (UID: \"e63d9d54-2803-4816-a026-0cc8e063070a\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:33:52 crc kubenswrapper[4904]: I1205 20:33:52.057257 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e63d9d54-2803-4816-a026-0cc8e063070a-logs\") pod \"glance-default-internal-api-0\" (UID: \"e63d9d54-2803-4816-a026-0cc8e063070a\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:33:52 crc kubenswrapper[4904]: I1205 20:33:52.057286 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e63d9d54-2803-4816-a026-0cc8e063070a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e63d9d54-2803-4816-a026-0cc8e063070a\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:33:52 crc kubenswrapper[4904]: 
Dec 05 20:33:52 crc kubenswrapper[4904]: I1205 20:33:52.057336 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"e63d9d54-2803-4816-a026-0cc8e063070a\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:33:52 crc kubenswrapper[4904]: I1205 20:33:52.057354 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e63d9d54-2803-4816-a026-0cc8e063070a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e63d9d54-2803-4816-a026-0cc8e063070a\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:33:52 crc kubenswrapper[4904]: I1205 20:33:52.129594 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 05 20:33:52 crc kubenswrapper[4904]: I1205 20:33:52.160994 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e63d9d54-2803-4816-a026-0cc8e063070a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e63d9d54-2803-4816-a026-0cc8e063070a\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:33:52 crc kubenswrapper[4904]: I1205 20:33:52.161034 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e63d9d54-2803-4816-a026-0cc8e063070a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e63d9d54-2803-4816-a026-0cc8e063070a\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:33:52 crc kubenswrapper[4904]: I1205 20:33:52.161117 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vbnp\" (UniqueName: \"kubernetes.io/projected/e63d9d54-2803-4816-a026-0cc8e063070a-kube-api-access-9vbnp\") pod \"glance-default-internal-api-0\" (UID: \"e63d9d54-2803-4816-a026-0cc8e063070a\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:33:52 crc kubenswrapper[4904]: I1205 20:33:52.161137 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e63d9d54-2803-4816-a026-0cc8e063070a-logs\") pod \"glance-default-internal-api-0\" (UID: \"e63d9d54-2803-4816-a026-0cc8e063070a\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:33:52 crc kubenswrapper[4904]: I1205 20:33:52.161166 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e63d9d54-2803-4816-a026-0cc8e063070a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e63d9d54-2803-4816-a026-0cc8e063070a\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:33:52 crc kubenswrapper[4904]: I1205 20:33:52.161219 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"e63d9d54-2803-4816-a026-0cc8e063070a\") " pod="openstack/glance-default-internal-api-0"
\"e63d9d54-2803-4816-a026-0cc8e063070a\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:33:52 crc kubenswrapper[4904]: I1205 20:33:52.161637 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e63d9d54-2803-4816-a026-0cc8e063070a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e63d9d54-2803-4816-a026-0cc8e063070a\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:33:52 crc kubenswrapper[4904]: I1205 20:33:52.163382 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e63d9d54-2803-4816-a026-0cc8e063070a-logs\") pod \"glance-default-internal-api-0\" (UID: \"e63d9d54-2803-4816-a026-0cc8e063070a\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:33:52 crc kubenswrapper[4904]: I1205 20:33:52.164484 4904 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"e63d9d54-2803-4816-a026-0cc8e063070a\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Dec 05 20:33:52 crc kubenswrapper[4904]: I1205 20:33:52.165139 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e63d9d54-2803-4816-a026-0cc8e063070a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e63d9d54-2803-4816-a026-0cc8e063070a\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:33:52 crc kubenswrapper[4904]: I1205 20:33:52.174535 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e63d9d54-2803-4816-a026-0cc8e063070a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e63d9d54-2803-4816-a026-0cc8e063070a\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:33:52 crc kubenswrapper[4904]: I1205 20:33:52.182875 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e63d9d54-2803-4816-a026-0cc8e063070a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e63d9d54-2803-4816-a026-0cc8e063070a\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:33:52 crc kubenswrapper[4904]: I1205 20:33:52.185697 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vbnp\" (UniqueName: \"kubernetes.io/projected/e63d9d54-2803-4816-a026-0cc8e063070a-kube-api-access-9vbnp\") pod \"glance-default-internal-api-0\" (UID: \"e63d9d54-2803-4816-a026-0cc8e063070a\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:33:52 crc kubenswrapper[4904]: I1205 20:33:52.213831 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cdd789cc5-m8v2q" event={"ID":"baa0cf87-8601-4807-8453-0809354b472e","Type":"ContainerStarted","Data":"a32e393240a4619023145d5ded3324ab74b02bab14aa78754acd6cd290fc7d9b"} Dec 05 20:33:52 crc kubenswrapper[4904]: I1205 20:33:52.227454 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"e63d9d54-2803-4816-a026-0cc8e063070a\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:33:52 crc kubenswrapper[4904]: I1205 20:33:52.518562 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 20:33:52 crc kubenswrapper[4904]: I1205 20:33:52.793554 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 20:33:52 crc kubenswrapper[4904]: I1205 20:33:52.926474 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-api-0" podUID="dadf2169-1b54-4e50-adff-71504b526259" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.172:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 20:33:53 crc kubenswrapper[4904]: I1205 20:33:53.211107 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 20:33:53 crc kubenswrapper[4904]: I1205 20:33:53.241666 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"28e509d6-d15b-44e6-9afa-05a347c2a7a5","Type":"ContainerStarted","Data":"a90f5442d2ed408c7bb656016094af84b53e976e6775ede9ff9cca02283e3e04"} Dec 05 20:33:53 crc kubenswrapper[4904]: I1205 20:33:53.250101 4904 generic.go:334] "Generic (PLEG): container finished" podID="baa0cf87-8601-4807-8453-0809354b472e" containerID="f0900cc2be95ce3522e2d02bb399413d38cc119ba1ff1ae61acd36cd48e059d5" exitCode=0 Dec 05 20:33:53 crc kubenswrapper[4904]: I1205 20:33:53.250191 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cdd789cc5-m8v2q" event={"ID":"baa0cf87-8601-4807-8453-0809354b472e","Type":"ContainerDied","Data":"f0900cc2be95ce3522e2d02bb399413d38cc119ba1ff1ae61acd36cd48e059d5"} Dec 05 20:33:53 crc kubenswrapper[4904]: I1205 20:33:53.268264 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e22555a8-bc28-4b25-bd10-4fc9525276c3","Type":"ContainerStarted","Data":"472ef7e583de871c1a0f30ae6093887d3a92927a3167df16e72b3fb9c3cd5997"} Dec 05 20:33:53 crc kubenswrapper[4904]: I1205 20:33:53.270116 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 20:33:53 crc kubenswrapper[4904]: I1205 20:33:53.277232 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"861d7b17-4130-4c11-9652-63a366cddfe2","Type":"ContainerStarted","Data":"dde6380ba39ffb0a0023b79e3039369ef38cd67bb7d37d98c6bc9e947d464365"} Dec 05 20:33:53 crc kubenswrapper[4904]: I1205 20:33:53.324945 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.075763654 podStartE2EDuration="8.324924566s" podCreationTimestamp="2025-12-05 20:33:45 +0000 UTC" firstStartedPulling="2025-12-05 20:33:46.163127291 +0000 UTC m=+1324.974343410" lastFinishedPulling="2025-12-05 20:33:51.412288213 +0000 UTC m=+1330.223504322" observedRunningTime="2025-12-05 20:33:53.318426764 +0000 UTC m=+1332.129642883" watchObservedRunningTime="2025-12-05 20:33:53.324924566 +0000 UTC m=+1332.136140665" Dec 05 20:33:54 crc kubenswrapper[4904]: I1205 20:33:54.307498 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cdd789cc5-m8v2q" event={"ID":"baa0cf87-8601-4807-8453-0809354b472e","Type":"ContainerStarted","Data":"aa8165a73b967e8674568805fc2e398dd9ddc0cd390c0b7611b50b0b2b7d41b7"} Dec 05 20:33:54 crc kubenswrapper[4904]: I1205 20:33:54.307909 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5cdd789cc5-m8v2q" Dec 05 20:33:54 crc 
Dec 05 20:33:54 crc kubenswrapper[4904]: I1205 20:33:54.309634 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e63d9d54-2803-4816-a026-0cc8e063070a","Type":"ContainerStarted","Data":"90d499e62ae9f2f13feb1d0b09553481bbc8d9090d700fe4990f707c0a172471"}
Dec 05 20:33:54 crc kubenswrapper[4904]: I1205 20:33:54.312271 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"861d7b17-4130-4c11-9652-63a366cddfe2","Type":"ContainerStarted","Data":"42e316276bc97ab81af6f6b077da1a50fb3e679733882dfb5ea4776a328a2762"}
Dec 05 20:33:54 crc kubenswrapper[4904]: I1205 20:33:54.357812 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5cdd789cc5-m8v2q" podStartSLOduration=4.357792717 podStartE2EDuration="4.357792717s" podCreationTimestamp="2025-12-05 20:33:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:33:54.340703947 +0000 UTC m=+1333.151920066" watchObservedRunningTime="2025-12-05 20:33:54.357792717 +0000 UTC m=+1333.169008826"
Dec 05 20:33:54 crc kubenswrapper[4904]: I1205 20:33:54.538083 4904 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-55fdc8d759-tm87z" podUID="ef666e4f-797c-4868-b3b4-2b834851d840" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.170:5353: i/o timeout"
Dec 05 20:33:54 crc kubenswrapper[4904]: I1205 20:33:54.930284 4904 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="dadf2169-1b54-4e50-adff-71504b526259" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.172:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 05 20:33:55 crc kubenswrapper[4904]: I1205 20:33:55.269256 4904 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-597c5664fd-tbwsr" podUID="7fa79fda-f8c4-4866-8a82-5e2bc4cdb317" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.176:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 05 20:33:55 crc kubenswrapper[4904]: I1205 20:33:55.336516 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e63d9d54-2803-4816-a026-0cc8e063070a","Type":"ContainerStarted","Data":"8088a2ec4e4cf6a598b0460f0cf41e7b3d69bcf81ba4d96598981afeb852e1e6"}
Dec 05 20:33:55 crc kubenswrapper[4904]: I1205 20:33:55.340955 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"861d7b17-4130-4c11-9652-63a366cddfe2","Type":"ContainerStarted","Data":"b08e5522b00e3844978590986ee5622070d9f8956926a217bb369de8385b8438"}
Dec 05 20:33:55 crc kubenswrapper[4904]: I1205 20:33:55.380085 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.38005172 podStartE2EDuration="5.38005172s" podCreationTimestamp="2025-12-05 20:33:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:33:55.366282707 +0000 UTC m=+1334.177498836" watchObservedRunningTime="2025-12-05 20:33:55.38005172 +0000 UTC m=+1334.191267819"
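The pod_startup_latency_tracker lines report two figures: podStartE2EDuration (pod creation to first observed running) and podStartSLOduration, which excludes image-pull time. When nothing had to be pulled, firstStartedPulling/lastFinishedPulling stay at the zero time and the two durations coincide, as for dnsmasq-dns and glance above; ceilometer earlier shows the split (8.32s E2E, 3.08s SLO, roughly 5.25s pulling). A sketch of that arithmetic, reproducing the ceilometer numbers from the log:

    package main

    import (
        "fmt"
        "time"
    )

    // sloDuration mirrors the relationship visible in the log lines:
    // E2E duration minus the image-pull window, or E2E unchanged when
    // no pull happened (zero-valued pull timestamps).
    func sloDuration(created, firstPull, lastPull, running time.Time) time.Duration {
        e2e := running.Sub(created)
        if firstPull.IsZero() || lastPull.IsZero() {
            return e2e
        }
        return e2e - lastPull.Sub(firstPull)
    }

    func main() {
        // ceilometer-0's timestamps from the log, to the millisecond
        created := time.Date(2025, 12, 5, 20, 33, 45, 0, time.UTC)
        firstPull := created.Add(1163 * time.Millisecond)
        lastPull := created.Add(6412 * time.Millisecond)
        running := created.Add(8325 * time.Millisecond)
        fmt.Println(sloDuration(created, firstPull, lastPull, running)) // ~3.076s
    }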
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-66d5cfdcdd-d59d6" Dec 05 20:33:56 crc kubenswrapper[4904]: I1205 20:33:56.058366 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-c6d7bc49b-2f688" Dec 05 20:33:56 crc kubenswrapper[4904]: I1205 20:33:56.224026 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-c6d7bc49b-2f688" Dec 05 20:33:56 crc kubenswrapper[4904]: I1205 20:33:56.397811 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e63d9d54-2803-4816-a026-0cc8e063070a","Type":"ContainerStarted","Data":"816805cfd340c1cf8b878cd78040343e7777948ae5d5b9440c0531ebc87fd51a"} Dec 05 20:33:56 crc kubenswrapper[4904]: I1205 20:33:56.458628 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.458608214 podStartE2EDuration="6.458608214s" podCreationTimestamp="2025-12-05 20:33:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:33:56.444157664 +0000 UTC m=+1335.255373793" watchObservedRunningTime="2025-12-05 20:33:56.458608214 +0000 UTC m=+1335.269824323" Dec 05 20:33:57 crc kubenswrapper[4904]: I1205 20:33:57.060275 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 20:33:57 crc kubenswrapper[4904]: I1205 20:33:57.119261 4904 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-66d5cfdcdd-d59d6" podUID="f786b88c-ea37-4a02-bdd1-1f9feca9993a" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.179:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 20:33:57 crc kubenswrapper[4904]: I1205 20:33:57.151300 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 20:33:57 crc kubenswrapper[4904]: I1205 20:33:57.406452 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="861d7b17-4130-4c11-9652-63a366cddfe2" containerName="glance-log" containerID="cri-o://42e316276bc97ab81af6f6b077da1a50fb3e679733882dfb5ea4776a328a2762" gracePeriod=30 Dec 05 20:33:57 crc kubenswrapper[4904]: I1205 20:33:57.407594 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="861d7b17-4130-4c11-9652-63a366cddfe2" containerName="glance-httpd" containerID="cri-o://b08e5522b00e3844978590986ee5622070d9f8956926a217bb369de8385b8438" gracePeriod=30 Dec 05 20:33:57 crc kubenswrapper[4904]: I1205 20:33:57.663459 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Dec 05 20:33:57 crc kubenswrapper[4904]: I1205 20:33:57.664017 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="3420fce6-9733-45e2-8167-a91c39e372af" containerName="watcher-api-log" containerID="cri-o://c4d13fcb4fee0443ab536165c27f431bedb1abd70ba2af7e23bb12d9cc88dea9" gracePeriod=30 Dec 05 20:33:57 crc kubenswrapper[4904]: I1205 20:33:57.664383 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="3420fce6-9733-45e2-8167-a91c39e372af" containerName="watcher-api" 
containerID="cri-o://cd0939fb41199c51ecd8b69fd3b98d8721c30967fb755ec989c72bb7b6bb72d7" gracePeriod=30 Dec 05 20:33:57 crc kubenswrapper[4904]: I1205 20:33:57.910433 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-64fdbdb744-gdc8l" Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.282988 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.419387 4904 generic.go:334] "Generic (PLEG): container finished" podID="3420fce6-9733-45e2-8167-a91c39e372af" containerID="c4d13fcb4fee0443ab536165c27f431bedb1abd70ba2af7e23bb12d9cc88dea9" exitCode=143 Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.419676 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"3420fce6-9733-45e2-8167-a91c39e372af","Type":"ContainerDied","Data":"c4d13fcb4fee0443ab536165c27f431bedb1abd70ba2af7e23bb12d9cc88dea9"} Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.422277 4904 generic.go:334] "Generic (PLEG): container finished" podID="861d7b17-4130-4c11-9652-63a366cddfe2" containerID="b08e5522b00e3844978590986ee5622070d9f8956926a217bb369de8385b8438" exitCode=0 Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.422300 4904 generic.go:334] "Generic (PLEG): container finished" podID="861d7b17-4130-4c11-9652-63a366cddfe2" containerID="42e316276bc97ab81af6f6b077da1a50fb3e679733882dfb5ea4776a328a2762" exitCode=143 Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.422321 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"861d7b17-4130-4c11-9652-63a366cddfe2","Type":"ContainerDied","Data":"b08e5522b00e3844978590986ee5622070d9f8956926a217bb369de8385b8438"} Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.422361 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.422380 4904 scope.go:117] "RemoveContainer" containerID="b08e5522b00e3844978590986ee5622070d9f8956926a217bb369de8385b8438" Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.422365 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"861d7b17-4130-4c11-9652-63a366cddfe2","Type":"ContainerDied","Data":"42e316276bc97ab81af6f6b077da1a50fb3e679733882dfb5ea4776a328a2762"} Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.422415 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"861d7b17-4130-4c11-9652-63a366cddfe2","Type":"ContainerDied","Data":"dde6380ba39ffb0a0023b79e3039369ef38cd67bb7d37d98c6bc9e947d464365"} Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.422476 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e63d9d54-2803-4816-a026-0cc8e063070a" containerName="glance-log" containerID="cri-o://8088a2ec4e4cf6a598b0460f0cf41e7b3d69bcf81ba4d96598981afeb852e1e6" gracePeriod=30 Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.422550 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e63d9d54-2803-4816-a026-0cc8e063070a" containerName="glance-httpd" containerID="cri-o://816805cfd340c1cf8b878cd78040343e7777948ae5d5b9440c0531ebc87fd51a" gracePeriod=30 Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.445828 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/861d7b17-4130-4c11-9652-63a366cddfe2-combined-ca-bundle\") pod \"861d7b17-4130-4c11-9652-63a366cddfe2\" (UID: \"861d7b17-4130-4c11-9652-63a366cddfe2\") " Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.445893 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/861d7b17-4130-4c11-9652-63a366cddfe2-httpd-run\") pod \"861d7b17-4130-4c11-9652-63a366cddfe2\" (UID: \"861d7b17-4130-4c11-9652-63a366cddfe2\") " Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.446091 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/861d7b17-4130-4c11-9652-63a366cddfe2-scripts\") pod \"861d7b17-4130-4c11-9652-63a366cddfe2\" (UID: \"861d7b17-4130-4c11-9652-63a366cddfe2\") " Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.446143 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/861d7b17-4130-4c11-9652-63a366cddfe2-logs\") pod \"861d7b17-4130-4c11-9652-63a366cddfe2\" (UID: \"861d7b17-4130-4c11-9652-63a366cddfe2\") " Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.446193 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mh4qm\" (UniqueName: \"kubernetes.io/projected/861d7b17-4130-4c11-9652-63a366cddfe2-kube-api-access-mh4qm\") pod \"861d7b17-4130-4c11-9652-63a366cddfe2\" (UID: \"861d7b17-4130-4c11-9652-63a366cddfe2\") " Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.446276 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/861d7b17-4130-4c11-9652-63a366cddfe2-config-data\") pod \"861d7b17-4130-4c11-9652-63a366cddfe2\" (UID: \"861d7b17-4130-4c11-9652-63a366cddfe2\") " Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.446297 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"861d7b17-4130-4c11-9652-63a366cddfe2\" (UID: \"861d7b17-4130-4c11-9652-63a366cddfe2\") " Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.446734 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/861d7b17-4130-4c11-9652-63a366cddfe2-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "861d7b17-4130-4c11-9652-63a366cddfe2" (UID: "861d7b17-4130-4c11-9652-63a366cddfe2"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.446755 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/861d7b17-4130-4c11-9652-63a366cddfe2-logs" (OuterVolumeSpecName: "logs") pod "861d7b17-4130-4c11-9652-63a366cddfe2" (UID: "861d7b17-4130-4c11-9652-63a366cddfe2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.447013 4904 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/861d7b17-4130-4c11-9652-63a366cddfe2-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.447026 4904 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/861d7b17-4130-4c11-9652-63a366cddfe2-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.459253 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "861d7b17-4130-4c11-9652-63a366cddfe2" (UID: "861d7b17-4130-4c11-9652-63a366cddfe2"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.459336 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/861d7b17-4130-4c11-9652-63a366cddfe2-scripts" (OuterVolumeSpecName: "scripts") pod "861d7b17-4130-4c11-9652-63a366cddfe2" (UID: "861d7b17-4130-4c11-9652-63a366cddfe2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.461273 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/861d7b17-4130-4c11-9652-63a366cddfe2-kube-api-access-mh4qm" (OuterVolumeSpecName: "kube-api-access-mh4qm") pod "861d7b17-4130-4c11-9652-63a366cddfe2" (UID: "861d7b17-4130-4c11-9652-63a366cddfe2"). InnerVolumeSpecName "kube-api-access-mh4qm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.470222 4904 scope.go:117] "RemoveContainer" containerID="42e316276bc97ab81af6f6b077da1a50fb3e679733882dfb5ea4776a328a2762" Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.489040 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/861d7b17-4130-4c11-9652-63a366cddfe2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "861d7b17-4130-4c11-9652-63a366cddfe2" (UID: "861d7b17-4130-4c11-9652-63a366cddfe2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.502917 4904 scope.go:117] "RemoveContainer" containerID="b08e5522b00e3844978590986ee5622070d9f8956926a217bb369de8385b8438" Dec 05 20:33:58 crc kubenswrapper[4904]: E1205 20:33:58.504129 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b08e5522b00e3844978590986ee5622070d9f8956926a217bb369de8385b8438\": container with ID starting with b08e5522b00e3844978590986ee5622070d9f8956926a217bb369de8385b8438 not found: ID does not exist" containerID="b08e5522b00e3844978590986ee5622070d9f8956926a217bb369de8385b8438" Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.504187 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b08e5522b00e3844978590986ee5622070d9f8956926a217bb369de8385b8438"} err="failed to get container status \"b08e5522b00e3844978590986ee5622070d9f8956926a217bb369de8385b8438\": rpc error: code = NotFound desc = could not find container \"b08e5522b00e3844978590986ee5622070d9f8956926a217bb369de8385b8438\": container with ID starting with b08e5522b00e3844978590986ee5622070d9f8956926a217bb369de8385b8438 not found: ID does not exist" Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.504215 4904 scope.go:117] "RemoveContainer" containerID="42e316276bc97ab81af6f6b077da1a50fb3e679733882dfb5ea4776a328a2762" Dec 05 20:33:58 crc kubenswrapper[4904]: E1205 20:33:58.504679 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42e316276bc97ab81af6f6b077da1a50fb3e679733882dfb5ea4776a328a2762\": container with ID starting with 42e316276bc97ab81af6f6b077da1a50fb3e679733882dfb5ea4776a328a2762 not found: ID does not exist" containerID="42e316276bc97ab81af6f6b077da1a50fb3e679733882dfb5ea4776a328a2762" Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.504700 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42e316276bc97ab81af6f6b077da1a50fb3e679733882dfb5ea4776a328a2762"} err="failed to get container status \"42e316276bc97ab81af6f6b077da1a50fb3e679733882dfb5ea4776a328a2762\": rpc error: code = NotFound desc = could not find container \"42e316276bc97ab81af6f6b077da1a50fb3e679733882dfb5ea4776a328a2762\": container with ID starting with 42e316276bc97ab81af6f6b077da1a50fb3e679733882dfb5ea4776a328a2762 not found: ID does not exist" Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.504741 4904 scope.go:117] "RemoveContainer" containerID="b08e5522b00e3844978590986ee5622070d9f8956926a217bb369de8385b8438" Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.505048 4904 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b08e5522b00e3844978590986ee5622070d9f8956926a217bb369de8385b8438"} err="failed to get container status \"b08e5522b00e3844978590986ee5622070d9f8956926a217bb369de8385b8438\": rpc error: code = NotFound desc = could not find container \"b08e5522b00e3844978590986ee5622070d9f8956926a217bb369de8385b8438\": container with ID starting with b08e5522b00e3844978590986ee5622070d9f8956926a217bb369de8385b8438 not found: ID does not exist" Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.505087 4904 scope.go:117] "RemoveContainer" containerID="42e316276bc97ab81af6f6b077da1a50fb3e679733882dfb5ea4776a328a2762" Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.505288 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42e316276bc97ab81af6f6b077da1a50fb3e679733882dfb5ea4776a328a2762"} err="failed to get container status \"42e316276bc97ab81af6f6b077da1a50fb3e679733882dfb5ea4776a328a2762\": rpc error: code = NotFound desc = could not find container \"42e316276bc97ab81af6f6b077da1a50fb3e679733882dfb5ea4776a328a2762\": container with ID starting with 42e316276bc97ab81af6f6b077da1a50fb3e679733882dfb5ea4776a328a2762 not found: ID does not exist" Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.516917 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/861d7b17-4130-4c11-9652-63a366cddfe2-config-data" (OuterVolumeSpecName: "config-data") pod "861d7b17-4130-4c11-9652-63a366cddfe2" (UID: "861d7b17-4130-4c11-9652-63a366cddfe2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.549420 4904 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/861d7b17-4130-4c11-9652-63a366cddfe2-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.549460 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mh4qm\" (UniqueName: \"kubernetes.io/projected/861d7b17-4130-4c11-9652-63a366cddfe2-kube-api-access-mh4qm\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.549474 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/861d7b17-4130-4c11-9652-63a366cddfe2-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.549505 4904 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.549518 4904 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/861d7b17-4130-4c11-9652-63a366cddfe2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.575118 4904 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.651028 4904 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.770435 4904 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.811848 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.827023 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 20:33:58 crc kubenswrapper[4904]: E1205 20:33:58.827464 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="861d7b17-4130-4c11-9652-63a366cddfe2" containerName="glance-httpd" Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.827494 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="861d7b17-4130-4c11-9652-63a366cddfe2" containerName="glance-httpd" Dec 05 20:33:58 crc kubenswrapper[4904]: E1205 20:33:58.827530 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="861d7b17-4130-4c11-9652-63a366cddfe2" containerName="glance-log" Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.827540 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="861d7b17-4130-4c11-9652-63a366cddfe2" containerName="glance-log" Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.827761 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="861d7b17-4130-4c11-9652-63a366cddfe2" containerName="glance-httpd" Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.827802 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="861d7b17-4130-4c11-9652-63a366cddfe2" containerName="glance-log" Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.829141 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.831999 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.832209 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.836891 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.931548 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.990567 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef0ab8b0-7102-4c25-9f0f-3796ff32d88f-logs\") pod \"glance-default-external-api-0\" (UID: \"ef0ab8b0-7102-4c25-9f0f-3796ff32d88f\") " pod="openstack/glance-default-external-api-0" Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.990617 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ef0ab8b0-7102-4c25-9f0f-3796ff32d88f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ef0ab8b0-7102-4c25-9f0f-3796ff32d88f\") " pod="openstack/glance-default-external-api-0" Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.990749 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef0ab8b0-7102-4c25-9f0f-3796ff32d88f-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"ef0ab8b0-7102-4c25-9f0f-3796ff32d88f\") " pod="openstack/glance-default-external-api-0" Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.990788 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctp7k\" (UniqueName: \"kubernetes.io/projected/ef0ab8b0-7102-4c25-9f0f-3796ff32d88f-kube-api-access-ctp7k\") pod \"glance-default-external-api-0\" (UID: \"ef0ab8b0-7102-4c25-9f0f-3796ff32d88f\") " pod="openstack/glance-default-external-api-0" Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.990835 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef0ab8b0-7102-4c25-9f0f-3796ff32d88f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ef0ab8b0-7102-4c25-9f0f-3796ff32d88f\") " pod="openstack/glance-default-external-api-0" Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.990923 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef0ab8b0-7102-4c25-9f0f-3796ff32d88f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ef0ab8b0-7102-4c25-9f0f-3796ff32d88f\") " pod="openstack/glance-default-external-api-0" Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.990991 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"ef0ab8b0-7102-4c25-9f0f-3796ff32d88f\") " pod="openstack/glance-default-external-api-0" Dec 05 20:33:58 crc kubenswrapper[4904]: I1205 20:33:58.991028 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef0ab8b0-7102-4c25-9f0f-3796ff32d88f-scripts\") pod \"glance-default-external-api-0\" (UID: \"ef0ab8b0-7102-4c25-9f0f-3796ff32d88f\") " pod="openstack/glance-default-external-api-0" Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.094359 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef0ab8b0-7102-4c25-9f0f-3796ff32d88f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ef0ab8b0-7102-4c25-9f0f-3796ff32d88f\") " pod="openstack/glance-default-external-api-0" Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.094744 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"ef0ab8b0-7102-4c25-9f0f-3796ff32d88f\") " pod="openstack/glance-default-external-api-0" Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.094800 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef0ab8b0-7102-4c25-9f0f-3796ff32d88f-scripts\") pod \"glance-default-external-api-0\" (UID: \"ef0ab8b0-7102-4c25-9f0f-3796ff32d88f\") " pod="openstack/glance-default-external-api-0" Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.094835 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef0ab8b0-7102-4c25-9f0f-3796ff32d88f-logs\") pod \"glance-default-external-api-0\" (UID: 
\"ef0ab8b0-7102-4c25-9f0f-3796ff32d88f\") " pod="openstack/glance-default-external-api-0" Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.094852 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ef0ab8b0-7102-4c25-9f0f-3796ff32d88f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ef0ab8b0-7102-4c25-9f0f-3796ff32d88f\") " pod="openstack/glance-default-external-api-0" Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.094970 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef0ab8b0-7102-4c25-9f0f-3796ff32d88f-config-data\") pod \"glance-default-external-api-0\" (UID: \"ef0ab8b0-7102-4c25-9f0f-3796ff32d88f\") " pod="openstack/glance-default-external-api-0" Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.094999 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctp7k\" (UniqueName: \"kubernetes.io/projected/ef0ab8b0-7102-4c25-9f0f-3796ff32d88f-kube-api-access-ctp7k\") pod \"glance-default-external-api-0\" (UID: \"ef0ab8b0-7102-4c25-9f0f-3796ff32d88f\") " pod="openstack/glance-default-external-api-0" Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.095043 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef0ab8b0-7102-4c25-9f0f-3796ff32d88f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ef0ab8b0-7102-4c25-9f0f-3796ff32d88f\") " pod="openstack/glance-default-external-api-0" Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.095567 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ef0ab8b0-7102-4c25-9f0f-3796ff32d88f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ef0ab8b0-7102-4c25-9f0f-3796ff32d88f\") " pod="openstack/glance-default-external-api-0" Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.096033 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef0ab8b0-7102-4c25-9f0f-3796ff32d88f-logs\") pod \"glance-default-external-api-0\" (UID: \"ef0ab8b0-7102-4c25-9f0f-3796ff32d88f\") " pod="openstack/glance-default-external-api-0" Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.096234 4904 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"ef0ab8b0-7102-4c25-9f0f-3796ff32d88f\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.104360 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef0ab8b0-7102-4c25-9f0f-3796ff32d88f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ef0ab8b0-7102-4c25-9f0f-3796ff32d88f\") " pod="openstack/glance-default-external-api-0" Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.105204 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef0ab8b0-7102-4c25-9f0f-3796ff32d88f-config-data\") pod \"glance-default-external-api-0\" (UID: \"ef0ab8b0-7102-4c25-9f0f-3796ff32d88f\") " pod="openstack/glance-default-external-api-0" Dec 
05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.105635 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef0ab8b0-7102-4c25-9f0f-3796ff32d88f-scripts\") pod \"glance-default-external-api-0\" (UID: \"ef0ab8b0-7102-4c25-9f0f-3796ff32d88f\") " pod="openstack/glance-default-external-api-0" Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.109143 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef0ab8b0-7102-4c25-9f0f-3796ff32d88f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ef0ab8b0-7102-4c25-9f0f-3796ff32d88f\") " pod="openstack/glance-default-external-api-0" Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.119879 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctp7k\" (UniqueName: \"kubernetes.io/projected/ef0ab8b0-7102-4c25-9f0f-3796ff32d88f-kube-api-access-ctp7k\") pod \"glance-default-external-api-0\" (UID: \"ef0ab8b0-7102-4c25-9f0f-3796ff32d88f\") " pod="openstack/glance-default-external-api-0" Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.154562 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"ef0ab8b0-7102-4c25-9f0f-3796ff32d88f\") " pod="openstack/glance-default-external-api-0" Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.458718 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.473019 4904 generic.go:334] "Generic (PLEG): container finished" podID="3420fce6-9733-45e2-8167-a91c39e372af" containerID="cd0939fb41199c51ecd8b69fd3b98d8721c30967fb755ec989c72bb7b6bb72d7" exitCode=0 Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.473119 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"3420fce6-9733-45e2-8167-a91c39e372af","Type":"ContainerDied","Data":"cd0939fb41199c51ecd8b69fd3b98d8721c30967fb755ec989c72bb7b6bb72d7"} Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.509778 4904 generic.go:334] "Generic (PLEG): container finished" podID="e63d9d54-2803-4816-a026-0cc8e063070a" containerID="816805cfd340c1cf8b878cd78040343e7777948ae5d5b9440c0531ebc87fd51a" exitCode=0 Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.509816 4904 generic.go:334] "Generic (PLEG): container finished" podID="e63d9d54-2803-4816-a026-0cc8e063070a" containerID="8088a2ec4e4cf6a598b0460f0cf41e7b3d69bcf81ba4d96598981afeb852e1e6" exitCode=143 Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.509878 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e63d9d54-2803-4816-a026-0cc8e063070a","Type":"ContainerDied","Data":"816805cfd340c1cf8b878cd78040343e7777948ae5d5b9440c0531ebc87fd51a"} Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.509908 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e63d9d54-2803-4816-a026-0cc8e063070a","Type":"ContainerDied","Data":"8088a2ec4e4cf6a598b0460f0cf41e7b3d69bcf81ba4d96598981afeb852e1e6"} Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.718882 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="861d7b17-4130-4c11-9652-63a366cddfe2" path="/var/lib/kubelet/pods/861d7b17-4130-4c11-9652-63a366cddfe2/volumes" Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.769181 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.822341 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.869751 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-66d5cfdcdd-d59d6" Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.917125 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e63d9d54-2803-4816-a026-0cc8e063070a-httpd-run\") pod \"e63d9d54-2803-4816-a026-0cc8e063070a\" (UID: \"e63d9d54-2803-4816-a026-0cc8e063070a\") " Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.917197 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3420fce6-9733-45e2-8167-a91c39e372af-combined-ca-bundle\") pod \"3420fce6-9733-45e2-8167-a91c39e372af\" (UID: \"3420fce6-9733-45e2-8167-a91c39e372af\") " Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.917239 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vbnp\" (UniqueName: \"kubernetes.io/projected/e63d9d54-2803-4816-a026-0cc8e063070a-kube-api-access-9vbnp\") pod \"e63d9d54-2803-4816-a026-0cc8e063070a\" (UID: \"e63d9d54-2803-4816-a026-0cc8e063070a\") " Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.917262 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3420fce6-9733-45e2-8167-a91c39e372af-config-data\") pod \"3420fce6-9733-45e2-8167-a91c39e372af\" (UID: \"3420fce6-9733-45e2-8167-a91c39e372af\") " Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.917338 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e63d9d54-2803-4816-a026-0cc8e063070a-scripts\") pod \"e63d9d54-2803-4816-a026-0cc8e063070a\" (UID: \"e63d9d54-2803-4816-a026-0cc8e063070a\") " Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.917414 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e63d9d54-2803-4816-a026-0cc8e063070a-logs\") pod \"e63d9d54-2803-4816-a026-0cc8e063070a\" (UID: \"e63d9d54-2803-4816-a026-0cc8e063070a\") " Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.917464 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e63d9d54-2803-4816-a026-0cc8e063070a-combined-ca-bundle\") pod \"e63d9d54-2803-4816-a026-0cc8e063070a\" (UID: \"e63d9d54-2803-4816-a026-0cc8e063070a\") " Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.917492 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e63d9d54-2803-4816-a026-0cc8e063070a-config-data\") pod \"e63d9d54-2803-4816-a026-0cc8e063070a\" (UID: \"e63d9d54-2803-4816-a026-0cc8e063070a\") " Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.917550 4904 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3420fce6-9733-45e2-8167-a91c39e372af-custom-prometheus-ca\") pod \"3420fce6-9733-45e2-8167-a91c39e372af\" (UID: \"3420fce6-9733-45e2-8167-a91c39e372af\") " Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.917582 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3420fce6-9733-45e2-8167-a91c39e372af-logs\") pod \"3420fce6-9733-45e2-8167-a91c39e372af\" (UID: \"3420fce6-9733-45e2-8167-a91c39e372af\") " Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.917757 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"e63d9d54-2803-4816-a026-0cc8e063070a\" (UID: \"e63d9d54-2803-4816-a026-0cc8e063070a\") " Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.917819 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmdb6\" (UniqueName: \"kubernetes.io/projected/3420fce6-9733-45e2-8167-a91c39e372af-kube-api-access-kmdb6\") pod \"3420fce6-9733-45e2-8167-a91c39e372af\" (UID: \"3420fce6-9733-45e2-8167-a91c39e372af\") " Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.918955 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e63d9d54-2803-4816-a026-0cc8e063070a-logs" (OuterVolumeSpecName: "logs") pod "e63d9d54-2803-4816-a026-0cc8e063070a" (UID: "e63d9d54-2803-4816-a026-0cc8e063070a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.919252 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e63d9d54-2803-4816-a026-0cc8e063070a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e63d9d54-2803-4816-a026-0cc8e063070a" (UID: "e63d9d54-2803-4816-a026-0cc8e063070a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.934810 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3420fce6-9733-45e2-8167-a91c39e372af-logs" (OuterVolumeSpecName: "logs") pod "3420fce6-9733-45e2-8167-a91c39e372af" (UID: "3420fce6-9733-45e2-8167-a91c39e372af"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.949636 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3420fce6-9733-45e2-8167-a91c39e372af-kube-api-access-kmdb6" (OuterVolumeSpecName: "kube-api-access-kmdb6") pod "3420fce6-9733-45e2-8167-a91c39e372af" (UID: "3420fce6-9733-45e2-8167-a91c39e372af"). InnerVolumeSpecName "kube-api-access-kmdb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.950130 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e63d9d54-2803-4816-a026-0cc8e063070a-scripts" (OuterVolumeSpecName: "scripts") pod "e63d9d54-2803-4816-a026-0cc8e063070a" (UID: "e63d9d54-2803-4816-a026-0cc8e063070a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.953000 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "e63d9d54-2803-4816-a026-0cc8e063070a" (UID: "e63d9d54-2803-4816-a026-0cc8e063070a"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.959113 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-597c5664fd-tbwsr"] Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.959405 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-597c5664fd-tbwsr" podUID="7fa79fda-f8c4-4866-8a82-5e2bc4cdb317" containerName="barbican-api-log" containerID="cri-o://550efeb669df02f7db7cd10983dc0bcf775f15ed4def68be1b3dbd72322cab36" gracePeriod=30 Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.959686 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-597c5664fd-tbwsr" podUID="7fa79fda-f8c4-4866-8a82-5e2bc4cdb317" containerName="barbican-api" containerID="cri-o://212d16e294e79b973fc8cd25119302971c0695123c34cad8dcb555eef0b2d043" gracePeriod=30 Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.966562 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e63d9d54-2803-4816-a026-0cc8e063070a-kube-api-access-9vbnp" (OuterVolumeSpecName: "kube-api-access-9vbnp") pod "e63d9d54-2803-4816-a026-0cc8e063070a" (UID: "e63d9d54-2803-4816-a026-0cc8e063070a"). InnerVolumeSpecName "kube-api-access-9vbnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:33:59 crc kubenswrapper[4904]: I1205 20:33:59.981203 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3420fce6-9733-45e2-8167-a91c39e372af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3420fce6-9733-45e2-8167-a91c39e372af" (UID: "3420fce6-9733-45e2-8167-a91c39e372af"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.004869 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3420fce6-9733-45e2-8167-a91c39e372af-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "3420fce6-9733-45e2-8167-a91c39e372af" (UID: "3420fce6-9733-45e2-8167-a91c39e372af"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.035913 4904 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e63d9d54-2803-4816-a026-0cc8e063070a-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.036078 4904 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3420fce6-9733-45e2-8167-a91c39e372af-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.036183 4904 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3420fce6-9733-45e2-8167-a91c39e372af-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.036286 4904 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.036358 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmdb6\" (UniqueName: \"kubernetes.io/projected/3420fce6-9733-45e2-8167-a91c39e372af-kube-api-access-kmdb6\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.036501 4904 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e63d9d54-2803-4816-a026-0cc8e063070a-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.036628 4904 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3420fce6-9733-45e2-8167-a91c39e372af-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.036697 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vbnp\" (UniqueName: \"kubernetes.io/projected/e63d9d54-2803-4816-a026-0cc8e063070a-kube-api-access-9vbnp\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.036761 4904 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e63d9d54-2803-4816-a026-0cc8e063070a-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.038223 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e63d9d54-2803-4816-a026-0cc8e063070a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e63d9d54-2803-4816-a026-0cc8e063070a" (UID: "e63d9d54-2803-4816-a026-0cc8e063070a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.059973 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e63d9d54-2803-4816-a026-0cc8e063070a-config-data" (OuterVolumeSpecName: "config-data") pod "e63d9d54-2803-4816-a026-0cc8e063070a" (UID: "e63d9d54-2803-4816-a026-0cc8e063070a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.074685 4904 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.082227 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3420fce6-9733-45e2-8167-a91c39e372af-config-data" (OuterVolumeSpecName: "config-data") pod "3420fce6-9733-45e2-8167-a91c39e372af" (UID: "3420fce6-9733-45e2-8167-a91c39e372af"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.138976 4904 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.139009 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3420fce6-9733-45e2-8167-a91c39e372af-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.139020 4904 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e63d9d54-2803-4816-a026-0cc8e063070a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.139030 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e63d9d54-2803-4816-a026-0cc8e063070a-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.231513 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.671386 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.671557 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e63d9d54-2803-4816-a026-0cc8e063070a","Type":"ContainerDied","Data":"90d499e62ae9f2f13feb1d0b09553481bbc8d9090d700fe4990f707c0a172471"} Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.671784 4904 scope.go:117] "RemoveContainer" containerID="816805cfd340c1cf8b878cd78040343e7777948ae5d5b9440c0531ebc87fd51a" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.688669 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"3420fce6-9733-45e2-8167-a91c39e372af","Type":"ContainerDied","Data":"93fc59475144f4440d0c606bb53dc644487806f3100cf2e603e498113ab787be"} Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.688757 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.718977 4904 generic.go:334] "Generic (PLEG): container finished" podID="7fa79fda-f8c4-4866-8a82-5e2bc4cdb317" containerID="550efeb669df02f7db7cd10983dc0bcf775f15ed4def68be1b3dbd72322cab36" exitCode=143 Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.719044 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-597c5664fd-tbwsr" event={"ID":"7fa79fda-f8c4-4866-8a82-5e2bc4cdb317","Type":"ContainerDied","Data":"550efeb669df02f7db7cd10983dc0bcf775f15ed4def68be1b3dbd72322cab36"} Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.721243 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.730308 4904 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6867ddbddb-4lg6w" podUID="499a09a1-c3aa-4fa2-95a8-0d2896a1d978" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.159:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.159:8443: connect: connection refused" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.730411 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6867ddbddb-4lg6w" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.730811 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ef0ab8b0-7102-4c25-9f0f-3796ff32d88f","Type":"ContainerStarted","Data":"a96b12c6c6ddc703d50443da345bebdbb2a133c9409c0e8766a927915e8e5665"} Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.738293 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.743570 4904 scope.go:117] "RemoveContainer" containerID="8088a2ec4e4cf6a598b0460f0cf41e7b3d69bcf81ba4d96598981afeb852e1e6" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.754413 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.792866 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.819435 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 20:34:00 crc kubenswrapper[4904]: E1205 20:34:00.819834 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e63d9d54-2803-4816-a026-0cc8e063070a" containerName="glance-httpd" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.819845 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="e63d9d54-2803-4816-a026-0cc8e063070a" containerName="glance-httpd" Dec 05 20:34:00 crc kubenswrapper[4904]: E1205 20:34:00.819859 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e63d9d54-2803-4816-a026-0cc8e063070a" containerName="glance-log" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.819866 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="e63d9d54-2803-4816-a026-0cc8e063070a" containerName="glance-log" Dec 05 20:34:00 crc kubenswrapper[4904]: E1205 20:34:00.819893 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3420fce6-9733-45e2-8167-a91c39e372af" containerName="watcher-api-log" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.819899 4904 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="3420fce6-9733-45e2-8167-a91c39e372af" containerName="watcher-api-log" Dec 05 20:34:00 crc kubenswrapper[4904]: E1205 20:34:00.819920 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3420fce6-9733-45e2-8167-a91c39e372af" containerName="watcher-api" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.819926 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="3420fce6-9733-45e2-8167-a91c39e372af" containerName="watcher-api" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.820098 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="3420fce6-9733-45e2-8167-a91c39e372af" containerName="watcher-api" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.820109 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="3420fce6-9733-45e2-8167-a91c39e372af" containerName="watcher-api-log" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.820128 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="e63d9d54-2803-4816-a026-0cc8e063070a" containerName="glance-httpd" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.820139 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="e63d9d54-2803-4816-a026-0cc8e063070a" containerName="glance-log" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.821109 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.837291 4904 scope.go:117] "RemoveContainer" containerID="cd0939fb41199c51ecd8b69fd3b98d8721c30967fb755ec989c72bb7b6bb72d7" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.837492 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.837700 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.899660 4904 scope.go:117] "RemoveContainer" containerID="c4d13fcb4fee0443ab536165c27f431bedb1abd70ba2af7e23bb12d9cc88dea9" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.904218 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.907839 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.912860 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-internal-svc" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.913152 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-public-svc" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.913345 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.958498 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5cdd789cc5-m8v2q" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.966033 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fae0189-1658-4a53-b223-22d70555b03d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1fae0189-1658-4a53-b223-22d70555b03d\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.966128 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"1fae0189-1658-4a53-b223-22d70555b03d\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.966179 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1fae0189-1658-4a53-b223-22d70555b03d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1fae0189-1658-4a53-b223-22d70555b03d\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.966214 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnfcc\" (UniqueName: \"kubernetes.io/projected/1fae0189-1658-4a53-b223-22d70555b03d-kube-api-access-rnfcc\") pod \"glance-default-internal-api-0\" (UID: \"1fae0189-1658-4a53-b223-22d70555b03d\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.966305 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fae0189-1658-4a53-b223-22d70555b03d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1fae0189-1658-4a53-b223-22d70555b03d\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.966354 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fae0189-1658-4a53-b223-22d70555b03d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1fae0189-1658-4a53-b223-22d70555b03d\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.966371 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fae0189-1658-4a53-b223-22d70555b03d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1fae0189-1658-4a53-b223-22d70555b03d\") " 
pod="openstack/glance-default-internal-api-0" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.966420 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fae0189-1658-4a53-b223-22d70555b03d-logs\") pod \"glance-default-internal-api-0\" (UID: \"1fae0189-1658-4a53-b223-22d70555b03d\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:34:00 crc kubenswrapper[4904]: I1205 20:34:00.970873 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.002133 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.067576 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-658fd98679-9cr7m"] Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.067872 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-658fd98679-9cr7m" podUID="fab0d2a9-0213-464a-b0a9-3e7338d7f40d" containerName="dnsmasq-dns" containerID="cri-o://3e1426e49827abe7d0fa40fcc9bf0c9463a37dedfc122c3c212bb7ab9b77088b" gracePeriod=10 Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.069473 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7be275a-3638-4279-a90b-8ad43e931ee6-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"f7be275a-3638-4279-a90b-8ad43e931ee6\") " pod="openstack/watcher-api-0" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.069514 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"1fae0189-1658-4a53-b223-22d70555b03d\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.069546 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1fae0189-1658-4a53-b223-22d70555b03d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1fae0189-1658-4a53-b223-22d70555b03d\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.069580 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnfcc\" (UniqueName: \"kubernetes.io/projected/1fae0189-1658-4a53-b223-22d70555b03d-kube-api-access-rnfcc\") pod \"glance-default-internal-api-0\" (UID: \"1fae0189-1658-4a53-b223-22d70555b03d\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.069622 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7be275a-3638-4279-a90b-8ad43e931ee6-public-tls-certs\") pod \"watcher-api-0\" (UID: \"f7be275a-3638-4279-a90b-8ad43e931ee6\") " pod="openstack/watcher-api-0" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.069647 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc5br\" (UniqueName: \"kubernetes.io/projected/f7be275a-3638-4279-a90b-8ad43e931ee6-kube-api-access-tc5br\") pod \"watcher-api-0\" (UID: \"f7be275a-3638-4279-a90b-8ad43e931ee6\") " 
pod="openstack/watcher-api-0" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.069700 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7be275a-3638-4279-a90b-8ad43e931ee6-logs\") pod \"watcher-api-0\" (UID: \"f7be275a-3638-4279-a90b-8ad43e931ee6\") " pod="openstack/watcher-api-0" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.069735 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fae0189-1658-4a53-b223-22d70555b03d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1fae0189-1658-4a53-b223-22d70555b03d\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.069777 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fae0189-1658-4a53-b223-22d70555b03d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1fae0189-1658-4a53-b223-22d70555b03d\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.069802 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fae0189-1658-4a53-b223-22d70555b03d-logs\") pod \"glance-default-internal-api-0\" (UID: \"1fae0189-1658-4a53-b223-22d70555b03d\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.069825 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fae0189-1658-4a53-b223-22d70555b03d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1fae0189-1658-4a53-b223-22d70555b03d\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.069929 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7be275a-3638-4279-a90b-8ad43e931ee6-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"f7be275a-3638-4279-a90b-8ad43e931ee6\") " pod="openstack/watcher-api-0" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.070001 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f7be275a-3638-4279-a90b-8ad43e931ee6-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"f7be275a-3638-4279-a90b-8ad43e931ee6\") " pod="openstack/watcher-api-0" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.070026 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fae0189-1658-4a53-b223-22d70555b03d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1fae0189-1658-4a53-b223-22d70555b03d\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.070078 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7be275a-3638-4279-a90b-8ad43e931ee6-config-data\") pod \"watcher-api-0\" (UID: \"f7be275a-3638-4279-a90b-8ad43e931ee6\") " pod="openstack/watcher-api-0" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.071581 4904 operation_generator.go:580] 
"MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"1fae0189-1658-4a53-b223-22d70555b03d\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.075657 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fae0189-1658-4a53-b223-22d70555b03d-logs\") pod \"glance-default-internal-api-0\" (UID: \"1fae0189-1658-4a53-b223-22d70555b03d\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.076071 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1fae0189-1658-4a53-b223-22d70555b03d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1fae0189-1658-4a53-b223-22d70555b03d\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.089123 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.092557 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fae0189-1658-4a53-b223-22d70555b03d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1fae0189-1658-4a53-b223-22d70555b03d\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.093558 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fae0189-1658-4a53-b223-22d70555b03d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1fae0189-1658-4a53-b223-22d70555b03d\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.109035 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnfcc\" (UniqueName: \"kubernetes.io/projected/1fae0189-1658-4a53-b223-22d70555b03d-kube-api-access-rnfcc\") pod \"glance-default-internal-api-0\" (UID: \"1fae0189-1658-4a53-b223-22d70555b03d\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.112837 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fae0189-1658-4a53-b223-22d70555b03d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1fae0189-1658-4a53-b223-22d70555b03d\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.114091 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fae0189-1658-4a53-b223-22d70555b03d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1fae0189-1658-4a53-b223-22d70555b03d\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.121599 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.121706 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.131451 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-v8kk6" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.131589 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.131659 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.172696 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7be275a-3638-4279-a90b-8ad43e931ee6-config-data\") pod \"watcher-api-0\" (UID: \"f7be275a-3638-4279-a90b-8ad43e931ee6\") " pod="openstack/watcher-api-0" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.172745 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7be275a-3638-4279-a90b-8ad43e931ee6-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"f7be275a-3638-4279-a90b-8ad43e931ee6\") " pod="openstack/watcher-api-0" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.172901 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7be275a-3638-4279-a90b-8ad43e931ee6-public-tls-certs\") pod \"watcher-api-0\" (UID: \"f7be275a-3638-4279-a90b-8ad43e931ee6\") " pod="openstack/watcher-api-0" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.172921 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tc5br\" (UniqueName: \"kubernetes.io/projected/f7be275a-3638-4279-a90b-8ad43e931ee6-kube-api-access-tc5br\") pod \"watcher-api-0\" (UID: \"f7be275a-3638-4279-a90b-8ad43e931ee6\") " pod="openstack/watcher-api-0" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.173096 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7be275a-3638-4279-a90b-8ad43e931ee6-logs\") pod \"watcher-api-0\" (UID: \"f7be275a-3638-4279-a90b-8ad43e931ee6\") " pod="openstack/watcher-api-0" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.173171 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7be275a-3638-4279-a90b-8ad43e931ee6-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"f7be275a-3638-4279-a90b-8ad43e931ee6\") " pod="openstack/watcher-api-0" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.173715 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f7be275a-3638-4279-a90b-8ad43e931ee6-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"f7be275a-3638-4279-a90b-8ad43e931ee6\") " pod="openstack/watcher-api-0" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.175639 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7be275a-3638-4279-a90b-8ad43e931ee6-logs\") pod \"watcher-api-0\" (UID: \"f7be275a-3638-4279-a90b-8ad43e931ee6\") " pod="openstack/watcher-api-0" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.176278 4904 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"1fae0189-1658-4a53-b223-22d70555b03d\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.182944 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f7be275a-3638-4279-a90b-8ad43e931ee6-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"f7be275a-3638-4279-a90b-8ad43e931ee6\") " pod="openstack/watcher-api-0" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.193413 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7be275a-3638-4279-a90b-8ad43e931ee6-public-tls-certs\") pod \"watcher-api-0\" (UID: \"f7be275a-3638-4279-a90b-8ad43e931ee6\") " pod="openstack/watcher-api-0" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.197739 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7be275a-3638-4279-a90b-8ad43e931ee6-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"f7be275a-3638-4279-a90b-8ad43e931ee6\") " pod="openstack/watcher-api-0" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.203735 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7be275a-3638-4279-a90b-8ad43e931ee6-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"f7be275a-3638-4279-a90b-8ad43e931ee6\") " pod="openstack/watcher-api-0" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.207259 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7be275a-3638-4279-a90b-8ad43e931ee6-config-data\") pod \"watcher-api-0\" (UID: \"f7be275a-3638-4279-a90b-8ad43e931ee6\") " pod="openstack/watcher-api-0" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.209845 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc5br\" (UniqueName: \"kubernetes.io/projected/f7be275a-3638-4279-a90b-8ad43e931ee6-kube-api-access-tc5br\") pod \"watcher-api-0\" (UID: \"f7be275a-3638-4279-a90b-8ad43e931ee6\") " pod="openstack/watcher-api-0" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.231882 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.278277 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8a50daf-6f3c-405d-b047-12a11ac0b56b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c8a50daf-6f3c-405d-b047-12a11ac0b56b\") " pod="openstack/openstackclient" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.278383 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c8a50daf-6f3c-405d-b047-12a11ac0b56b-openstack-config-secret\") pod \"openstackclient\" (UID: \"c8a50daf-6f3c-405d-b047-12a11ac0b56b\") " pod="openstack/openstackclient" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.278426 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nctzs\" (UniqueName: \"kubernetes.io/projected/c8a50daf-6f3c-405d-b047-12a11ac0b56b-kube-api-access-nctzs\") pod \"openstackclient\" (UID: \"c8a50daf-6f3c-405d-b047-12a11ac0b56b\") " pod="openstack/openstackclient" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.278469 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c8a50daf-6f3c-405d-b047-12a11ac0b56b-openstack-config\") pod \"openstackclient\" (UID: \"c8a50daf-6f3c-405d-b047-12a11ac0b56b\") " pod="openstack/openstackclient" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.380311 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8a50daf-6f3c-405d-b047-12a11ac0b56b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c8a50daf-6f3c-405d-b047-12a11ac0b56b\") " pod="openstack/openstackclient" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.380383 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c8a50daf-6f3c-405d-b047-12a11ac0b56b-openstack-config-secret\") pod \"openstackclient\" (UID: \"c8a50daf-6f3c-405d-b047-12a11ac0b56b\") " pod="openstack/openstackclient" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.380416 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nctzs\" (UniqueName: \"kubernetes.io/projected/c8a50daf-6f3c-405d-b047-12a11ac0b56b-kube-api-access-nctzs\") pod \"openstackclient\" (UID: \"c8a50daf-6f3c-405d-b047-12a11ac0b56b\") " pod="openstack/openstackclient" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.380453 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c8a50daf-6f3c-405d-b047-12a11ac0b56b-openstack-config\") pod \"openstackclient\" (UID: \"c8a50daf-6f3c-405d-b047-12a11ac0b56b\") " pod="openstack/openstackclient" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.382369 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c8a50daf-6f3c-405d-b047-12a11ac0b56b-openstack-config\") pod \"openstackclient\" (UID: \"c8a50daf-6f3c-405d-b047-12a11ac0b56b\") " pod="openstack/openstackclient" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.385937 4904 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c8a50daf-6f3c-405d-b047-12a11ac0b56b-openstack-config-secret\") pod \"openstackclient\" (UID: \"c8a50daf-6f3c-405d-b047-12a11ac0b56b\") " pod="openstack/openstackclient" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.389397 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8a50daf-6f3c-405d-b047-12a11ac0b56b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c8a50daf-6f3c-405d-b047-12a11ac0b56b\") " pod="openstack/openstackclient" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.410571 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nctzs\" (UniqueName: \"kubernetes.io/projected/c8a50daf-6f3c-405d-b047-12a11ac0b56b-kube-api-access-nctzs\") pod \"openstackclient\" (UID: \"c8a50daf-6f3c-405d-b047-12a11ac0b56b\") " pod="openstack/openstackclient" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.459168 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.685122 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.740544 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-658fd98679-9cr7m" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.746923 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3420fce6-9733-45e2-8167-a91c39e372af" path="/var/lib/kubelet/pods/3420fce6-9733-45e2-8167-a91c39e372af/volumes" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.747826 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e63d9d54-2803-4816-a026-0cc8e063070a" path="/var/lib/kubelet/pods/e63d9d54-2803-4816-a026-0cc8e063070a/volumes" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.811475 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Dec 05 20:34:01 crc kubenswrapper[4904]: E1205 20:34:01.824616 4904 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a90f5442d2ed408c7bb656016094af84b53e976e6775ede9ff9cca02283e3e04 is running failed: container process not found" containerID="a90f5442d2ed408c7bb656016094af84b53e976e6775ede9ff9cca02283e3e04" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Dec 05 20:34:01 crc kubenswrapper[4904]: E1205 20:34:01.825188 4904 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a90f5442d2ed408c7bb656016094af84b53e976e6775ede9ff9cca02283e3e04 is running failed: container process not found" containerID="a90f5442d2ed408c7bb656016094af84b53e976e6775ede9ff9cca02283e3e04" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.837987 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Dec 05 20:34:01 crc kubenswrapper[4904]: E1205 20:34:01.853411 4904 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: 
checking if PID of a90f5442d2ed408c7bb656016094af84b53e976e6775ede9ff9cca02283e3e04 is running failed: container process not found" containerID="a90f5442d2ed408c7bb656016094af84b53e976e6775ede9ff9cca02283e3e04" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Dec 05 20:34:01 crc kubenswrapper[4904]: E1205 20:34:01.853488 4904 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a90f5442d2ed408c7bb656016094af84b53e976e6775ede9ff9cca02283e3e04 is running failed: container process not found" probeType="Startup" pod="openstack/watcher-decision-engine-0" podUID="28e509d6-d15b-44e6-9afa-05a347c2a7a5" containerName="watcher-decision-engine" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.884895 4904 generic.go:334] "Generic (PLEG): container finished" podID="fab0d2a9-0213-464a-b0a9-3e7338d7f40d" containerID="3e1426e49827abe7d0fa40fcc9bf0c9463a37dedfc122c3c212bb7ab9b77088b" exitCode=0 Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.884957 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658fd98679-9cr7m" event={"ID":"fab0d2a9-0213-464a-b0a9-3e7338d7f40d","Type":"ContainerDied","Data":"3e1426e49827abe7d0fa40fcc9bf0c9463a37dedfc122c3c212bb7ab9b77088b"} Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.884990 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658fd98679-9cr7m" event={"ID":"fab0d2a9-0213-464a-b0a9-3e7338d7f40d","Type":"ContainerDied","Data":"30c4e5830a294a7190d0a7efcb49f441a09ec0249d089a42981bf09c66ba8d29"} Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.885014 4904 scope.go:117] "RemoveContainer" containerID="3e1426e49827abe7d0fa40fcc9bf0c9463a37dedfc122c3c212bb7ab9b77088b" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.885133 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-658fd98679-9cr7m" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.902116 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fab0d2a9-0213-464a-b0a9-3e7338d7f40d-ovsdbserver-nb\") pod \"fab0d2a9-0213-464a-b0a9-3e7338d7f40d\" (UID: \"fab0d2a9-0213-464a-b0a9-3e7338d7f40d\") " Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.902218 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fab0d2a9-0213-464a-b0a9-3e7338d7f40d-ovsdbserver-sb\") pod \"fab0d2a9-0213-464a-b0a9-3e7338d7f40d\" (UID: \"fab0d2a9-0213-464a-b0a9-3e7338d7f40d\") " Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.902250 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wld77\" (UniqueName: \"kubernetes.io/projected/fab0d2a9-0213-464a-b0a9-3e7338d7f40d-kube-api-access-wld77\") pod \"fab0d2a9-0213-464a-b0a9-3e7338d7f40d\" (UID: \"fab0d2a9-0213-464a-b0a9-3e7338d7f40d\") " Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.902341 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fab0d2a9-0213-464a-b0a9-3e7338d7f40d-dns-svc\") pod \"fab0d2a9-0213-464a-b0a9-3e7338d7f40d\" (UID: \"fab0d2a9-0213-464a-b0a9-3e7338d7f40d\") " Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.902426 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fab0d2a9-0213-464a-b0a9-3e7338d7f40d-config\") pod \"fab0d2a9-0213-464a-b0a9-3e7338d7f40d\" (UID: \"fab0d2a9-0213-464a-b0a9-3e7338d7f40d\") " Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.902454 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fab0d2a9-0213-464a-b0a9-3e7338d7f40d-dns-swift-storage-0\") pod \"fab0d2a9-0213-464a-b0a9-3e7338d7f40d\" (UID: \"fab0d2a9-0213-464a-b0a9-3e7338d7f40d\") " Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.909482 4904 generic.go:334] "Generic (PLEG): container finished" podID="28e509d6-d15b-44e6-9afa-05a347c2a7a5" containerID="a90f5442d2ed408c7bb656016094af84b53e976e6775ede9ff9cca02283e3e04" exitCode=1 Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.909577 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"28e509d6-d15b-44e6-9afa-05a347c2a7a5","Type":"ContainerDied","Data":"a90f5442d2ed408c7bb656016094af84b53e976e6775ede9ff9cca02283e3e04"} Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.910566 4904 scope.go:117] "RemoveContainer" containerID="a90f5442d2ed408c7bb656016094af84b53e976e6775ede9ff9cca02283e3e04" Dec 05 20:34:01 crc kubenswrapper[4904]: E1205 20:34:01.910889 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(28e509d6-d15b-44e6-9afa-05a347c2a7a5)\"" pod="openstack/watcher-decision-engine-0" podUID="28e509d6-d15b-44e6-9afa-05a347c2a7a5" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.919260 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/fab0d2a9-0213-464a-b0a9-3e7338d7f40d-kube-api-access-wld77" (OuterVolumeSpecName: "kube-api-access-wld77") pod "fab0d2a9-0213-464a-b0a9-3e7338d7f40d" (UID: "fab0d2a9-0213-464a-b0a9-3e7338d7f40d"). InnerVolumeSpecName "kube-api-access-wld77". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.947461 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ef0ab8b0-7102-4c25-9f0f-3796ff32d88f","Type":"ContainerStarted","Data":"7110bc0044761f07c1e167671039cc00b558f91170b014d1d3bf9cff6053ce44"} Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.982692 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fab0d2a9-0213-464a-b0a9-3e7338d7f40d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fab0d2a9-0213-464a-b0a9-3e7338d7f40d" (UID: "fab0d2a9-0213-464a-b0a9-3e7338d7f40d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.986510 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fab0d2a9-0213-464a-b0a9-3e7338d7f40d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fab0d2a9-0213-464a-b0a9-3e7338d7f40d" (UID: "fab0d2a9-0213-464a-b0a9-3e7338d7f40d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.994853 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fab0d2a9-0213-464a-b0a9-3e7338d7f40d-config" (OuterVolumeSpecName: "config") pod "fab0d2a9-0213-464a-b0a9-3e7338d7f40d" (UID: "fab0d2a9-0213-464a-b0a9-3e7338d7f40d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:34:01 crc kubenswrapper[4904]: I1205 20:34:01.996129 4904 scope.go:117] "RemoveContainer" containerID="5b956ca2d2422268ca9a210f5efdadc3fa55be15dccc8c6b3995d66f340fe3d5" Dec 05 20:34:02 crc kubenswrapper[4904]: I1205 20:34:02.006004 4904 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fab0d2a9-0213-464a-b0a9-3e7338d7f40d-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:02 crc kubenswrapper[4904]: I1205 20:34:02.006029 4904 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fab0d2a9-0213-464a-b0a9-3e7338d7f40d-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:02 crc kubenswrapper[4904]: I1205 20:34:02.006039 4904 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fab0d2a9-0213-464a-b0a9-3e7338d7f40d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:02 crc kubenswrapper[4904]: I1205 20:34:02.006049 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wld77\" (UniqueName: \"kubernetes.io/projected/fab0d2a9-0213-464a-b0a9-3e7338d7f40d-kube-api-access-wld77\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:02 crc kubenswrapper[4904]: I1205 20:34:02.028417 4904 scope.go:117] "RemoveContainer" containerID="3e1426e49827abe7d0fa40fcc9bf0c9463a37dedfc122c3c212bb7ab9b77088b" Dec 05 20:34:02 crc kubenswrapper[4904]: E1205 20:34:02.029442 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e1426e49827abe7d0fa40fcc9bf0c9463a37dedfc122c3c212bb7ab9b77088b\": container with ID starting with 3e1426e49827abe7d0fa40fcc9bf0c9463a37dedfc122c3c212bb7ab9b77088b not found: ID does not exist" containerID="3e1426e49827abe7d0fa40fcc9bf0c9463a37dedfc122c3c212bb7ab9b77088b" Dec 05 20:34:02 crc kubenswrapper[4904]: I1205 20:34:02.029482 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e1426e49827abe7d0fa40fcc9bf0c9463a37dedfc122c3c212bb7ab9b77088b"} err="failed to get container status \"3e1426e49827abe7d0fa40fcc9bf0c9463a37dedfc122c3c212bb7ab9b77088b\": rpc error: code = NotFound desc = could not find container \"3e1426e49827abe7d0fa40fcc9bf0c9463a37dedfc122c3c212bb7ab9b77088b\": container with ID starting with 3e1426e49827abe7d0fa40fcc9bf0c9463a37dedfc122c3c212bb7ab9b77088b not found: ID does not exist" Dec 05 20:34:02 crc kubenswrapper[4904]: I1205 20:34:02.029510 4904 scope.go:117] "RemoveContainer" containerID="5b956ca2d2422268ca9a210f5efdadc3fa55be15dccc8c6b3995d66f340fe3d5" Dec 05 20:34:02 crc kubenswrapper[4904]: E1205 20:34:02.029786 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b956ca2d2422268ca9a210f5efdadc3fa55be15dccc8c6b3995d66f340fe3d5\": container with ID starting with 5b956ca2d2422268ca9a210f5efdadc3fa55be15dccc8c6b3995d66f340fe3d5 not found: ID does not exist" containerID="5b956ca2d2422268ca9a210f5efdadc3fa55be15dccc8c6b3995d66f340fe3d5" Dec 05 20:34:02 crc kubenswrapper[4904]: I1205 20:34:02.029816 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b956ca2d2422268ca9a210f5efdadc3fa55be15dccc8c6b3995d66f340fe3d5"} err="failed to get container status \"5b956ca2d2422268ca9a210f5efdadc3fa55be15dccc8c6b3995d66f340fe3d5\": rpc error: code = NotFound desc = 
could not find container \"5b956ca2d2422268ca9a210f5efdadc3fa55be15dccc8c6b3995d66f340fe3d5\": container with ID starting with 5b956ca2d2422268ca9a210f5efdadc3fa55be15dccc8c6b3995d66f340fe3d5 not found: ID does not exist" Dec 05 20:34:02 crc kubenswrapper[4904]: I1205 20:34:02.029835 4904 scope.go:117] "RemoveContainer" containerID="a00259f9a634df7782c63e78e4add8b555e20a36adcc54a889e7bf30dc526538" Dec 05 20:34:02 crc kubenswrapper[4904]: I1205 20:34:02.031468 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fab0d2a9-0213-464a-b0a9-3e7338d7f40d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fab0d2a9-0213-464a-b0a9-3e7338d7f40d" (UID: "fab0d2a9-0213-464a-b0a9-3e7338d7f40d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:34:02 crc kubenswrapper[4904]: I1205 20:34:02.032587 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fab0d2a9-0213-464a-b0a9-3e7338d7f40d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fab0d2a9-0213-464a-b0a9-3e7338d7f40d" (UID: "fab0d2a9-0213-464a-b0a9-3e7338d7f40d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:34:02 crc kubenswrapper[4904]: I1205 20:34:02.108345 4904 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fab0d2a9-0213-464a-b0a9-3e7338d7f40d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:02 crc kubenswrapper[4904]: I1205 20:34:02.108378 4904 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fab0d2a9-0213-464a-b0a9-3e7338d7f40d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:02 crc kubenswrapper[4904]: I1205 20:34:02.212882 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 20:34:02 crc kubenswrapper[4904]: I1205 20:34:02.265025 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-658fd98679-9cr7m"] Dec 05 20:34:02 crc kubenswrapper[4904]: I1205 20:34:02.308672 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-658fd98679-9cr7m"] Dec 05 20:34:02 crc kubenswrapper[4904]: I1205 20:34:02.441042 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 05 20:34:02 crc kubenswrapper[4904]: I1205 20:34:02.484877 4904 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 20:34:02 crc kubenswrapper[4904]: I1205 20:34:02.733611 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-597c5664fd-tbwsr" Dec 05 20:34:02 crc kubenswrapper[4904]: I1205 20:34:02.856745 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfvqt\" (UniqueName: \"kubernetes.io/projected/7fa79fda-f8c4-4866-8a82-5e2bc4cdb317-kube-api-access-dfvqt\") pod \"7fa79fda-f8c4-4866-8a82-5e2bc4cdb317\" (UID: \"7fa79fda-f8c4-4866-8a82-5e2bc4cdb317\") " Dec 05 20:34:02 crc kubenswrapper[4904]: I1205 20:34:02.856849 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fa79fda-f8c4-4866-8a82-5e2bc4cdb317-logs\") pod \"7fa79fda-f8c4-4866-8a82-5e2bc4cdb317\" (UID: \"7fa79fda-f8c4-4866-8a82-5e2bc4cdb317\") " Dec 05 20:34:02 crc kubenswrapper[4904]: I1205 20:34:02.856928 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fa79fda-f8c4-4866-8a82-5e2bc4cdb317-combined-ca-bundle\") pod \"7fa79fda-f8c4-4866-8a82-5e2bc4cdb317\" (UID: \"7fa79fda-f8c4-4866-8a82-5e2bc4cdb317\") " Dec 05 20:34:02 crc kubenswrapper[4904]: I1205 20:34:02.856961 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fa79fda-f8c4-4866-8a82-5e2bc4cdb317-config-data\") pod \"7fa79fda-f8c4-4866-8a82-5e2bc4cdb317\" (UID: \"7fa79fda-f8c4-4866-8a82-5e2bc4cdb317\") " Dec 05 20:34:02 crc kubenswrapper[4904]: I1205 20:34:02.857010 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7fa79fda-f8c4-4866-8a82-5e2bc4cdb317-config-data-custom\") pod \"7fa79fda-f8c4-4866-8a82-5e2bc4cdb317\" (UID: \"7fa79fda-f8c4-4866-8a82-5e2bc4cdb317\") " Dec 05 20:34:02 crc kubenswrapper[4904]: I1205 20:34:02.858338 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fa79fda-f8c4-4866-8a82-5e2bc4cdb317-logs" (OuterVolumeSpecName: "logs") pod "7fa79fda-f8c4-4866-8a82-5e2bc4cdb317" (UID: "7fa79fda-f8c4-4866-8a82-5e2bc4cdb317"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:34:02 crc kubenswrapper[4904]: I1205 20:34:02.865224 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fa79fda-f8c4-4866-8a82-5e2bc4cdb317-kube-api-access-dfvqt" (OuterVolumeSpecName: "kube-api-access-dfvqt") pod "7fa79fda-f8c4-4866-8a82-5e2bc4cdb317" (UID: "7fa79fda-f8c4-4866-8a82-5e2bc4cdb317"). InnerVolumeSpecName "kube-api-access-dfvqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:34:02 crc kubenswrapper[4904]: I1205 20:34:02.866534 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fa79fda-f8c4-4866-8a82-5e2bc4cdb317-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7fa79fda-f8c4-4866-8a82-5e2bc4cdb317" (UID: "7fa79fda-f8c4-4866-8a82-5e2bc4cdb317"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:02 crc kubenswrapper[4904]: I1205 20:34:02.914535 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fa79fda-f8c4-4866-8a82-5e2bc4cdb317-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7fa79fda-f8c4-4866-8a82-5e2bc4cdb317" (UID: "7fa79fda-f8c4-4866-8a82-5e2bc4cdb317"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:02 crc kubenswrapper[4904]: I1205 20:34:02.958958 4904 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fa79fda-f8c4-4866-8a82-5e2bc4cdb317-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:02 crc kubenswrapper[4904]: I1205 20:34:02.958995 4904 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fa79fda-f8c4-4866-8a82-5e2bc4cdb317-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:02 crc kubenswrapper[4904]: I1205 20:34:02.959005 4904 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7fa79fda-f8c4-4866-8a82-5e2bc4cdb317-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:02 crc kubenswrapper[4904]: I1205 20:34:02.959014 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfvqt\" (UniqueName: \"kubernetes.io/projected/7fa79fda-f8c4-4866-8a82-5e2bc4cdb317-kube-api-access-dfvqt\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:02 crc kubenswrapper[4904]: I1205 20:34:02.984251 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fa79fda-f8c4-4866-8a82-5e2bc4cdb317-config-data" (OuterVolumeSpecName: "config-data") pod "7fa79fda-f8c4-4866-8a82-5e2bc4cdb317" (UID: "7fa79fda-f8c4-4866-8a82-5e2bc4cdb317"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:02 crc kubenswrapper[4904]: I1205 20:34:02.988354 4904 generic.go:334] "Generic (PLEG): container finished" podID="7fa79fda-f8c4-4866-8a82-5e2bc4cdb317" containerID="212d16e294e79b973fc8cd25119302971c0695123c34cad8dcb555eef0b2d043" exitCode=0 Dec 05 20:34:02 crc kubenswrapper[4904]: I1205 20:34:02.988430 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-597c5664fd-tbwsr" event={"ID":"7fa79fda-f8c4-4866-8a82-5e2bc4cdb317","Type":"ContainerDied","Data":"212d16e294e79b973fc8cd25119302971c0695123c34cad8dcb555eef0b2d043"} Dec 05 20:34:02 crc kubenswrapper[4904]: I1205 20:34:02.988462 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-597c5664fd-tbwsr" event={"ID":"7fa79fda-f8c4-4866-8a82-5e2bc4cdb317","Type":"ContainerDied","Data":"87a3ad10ba7e7138a50f0020c99c1d0a819946a80561bcf8040af52ecf079ead"} Dec 05 20:34:02 crc kubenswrapper[4904]: I1205 20:34:02.988479 4904 scope.go:117] "RemoveContainer" containerID="212d16e294e79b973fc8cd25119302971c0695123c34cad8dcb555eef0b2d043" Dec 05 20:34:02 crc kubenswrapper[4904]: I1205 20:34:02.988585 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-597c5664fd-tbwsr" Dec 05 20:34:03 crc kubenswrapper[4904]: I1205 20:34:03.022451 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ef0ab8b0-7102-4c25-9f0f-3796ff32d88f","Type":"ContainerStarted","Data":"30bfd9e811f59d4ab51bae052f4be94586035087eb09c057dd12686c5517ef4b"} Dec 05 20:34:03 crc kubenswrapper[4904]: I1205 20:34:03.030352 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"f7be275a-3638-4279-a90b-8ad43e931ee6","Type":"ContainerStarted","Data":"602ca58511d24cf461df1fd84d1ef3eea418cc31153a50cc3ac7dabb9032197e"} Dec 05 20:34:03 crc kubenswrapper[4904]: I1205 20:34:03.030397 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"f7be275a-3638-4279-a90b-8ad43e931ee6","Type":"ContainerStarted","Data":"b954d02c9b3209d1258c156c8d1fa361df7ff53f3f320fa3ec7dc3647d57fd1b"} Dec 05 20:34:03 crc kubenswrapper[4904]: I1205 20:34:03.030409 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"f7be275a-3638-4279-a90b-8ad43e931ee6","Type":"ContainerStarted","Data":"81351d28d25657ff07c5e006e5c03c82f0d6a2e904a94584ae693930c402b0b6"} Dec 05 20:34:03 crc kubenswrapper[4904]: I1205 20:34:03.031492 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Dec 05 20:34:03 crc kubenswrapper[4904]: I1205 20:34:03.037683 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1fae0189-1658-4a53-b223-22d70555b03d","Type":"ContainerStarted","Data":"b0ba2cce827bdf96a9ffc160ea65a297617f3a099e772ac84ef8a94f67b9875b"} Dec 05 20:34:03 crc kubenswrapper[4904]: I1205 20:34:03.042398 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"c8a50daf-6f3c-405d-b047-12a11ac0b56b","Type":"ContainerStarted","Data":"aa944f7c893a42992e23884e74b57bdc4da7c58501a7925a5a2f33eb50eeec76"} Dec 05 20:34:03 crc kubenswrapper[4904]: I1205 20:34:03.045401 4904 scope.go:117] "RemoveContainer" containerID="550efeb669df02f7db7cd10983dc0bcf775f15ed4def68be1b3dbd72322cab36" Dec 05 20:34:03 crc kubenswrapper[4904]: I1205 20:34:03.047426 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.047408064 podStartE2EDuration="5.047408064s" podCreationTimestamp="2025-12-05 20:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:34:03.045677259 +0000 UTC m=+1341.856893378" watchObservedRunningTime="2025-12-05 20:34:03.047408064 +0000 UTC m=+1341.858624183" Dec 05 20:34:03 crc kubenswrapper[4904]: I1205 20:34:03.061011 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fa79fda-f8c4-4866-8a82-5e2bc4cdb317-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:03 crc kubenswrapper[4904]: I1205 20:34:03.077176 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=3.077156307 podStartE2EDuration="3.077156307s" podCreationTimestamp="2025-12-05 20:34:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:34:03.069146016 +0000 UTC m=+1341.880362135" 
watchObservedRunningTime="2025-12-05 20:34:03.077156307 +0000 UTC m=+1341.888372416" Dec 05 20:34:03 crc kubenswrapper[4904]: I1205 20:34:03.105935 4904 scope.go:117] "RemoveContainer" containerID="212d16e294e79b973fc8cd25119302971c0695123c34cad8dcb555eef0b2d043" Dec 05 20:34:03 crc kubenswrapper[4904]: E1205 20:34:03.106462 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"212d16e294e79b973fc8cd25119302971c0695123c34cad8dcb555eef0b2d043\": container with ID starting with 212d16e294e79b973fc8cd25119302971c0695123c34cad8dcb555eef0b2d043 not found: ID does not exist" containerID="212d16e294e79b973fc8cd25119302971c0695123c34cad8dcb555eef0b2d043" Dec 05 20:34:03 crc kubenswrapper[4904]: I1205 20:34:03.106528 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"212d16e294e79b973fc8cd25119302971c0695123c34cad8dcb555eef0b2d043"} err="failed to get container status \"212d16e294e79b973fc8cd25119302971c0695123c34cad8dcb555eef0b2d043\": rpc error: code = NotFound desc = could not find container \"212d16e294e79b973fc8cd25119302971c0695123c34cad8dcb555eef0b2d043\": container with ID starting with 212d16e294e79b973fc8cd25119302971c0695123c34cad8dcb555eef0b2d043 not found: ID does not exist" Dec 05 20:34:03 crc kubenswrapper[4904]: I1205 20:34:03.106560 4904 scope.go:117] "RemoveContainer" containerID="550efeb669df02f7db7cd10983dc0bcf775f15ed4def68be1b3dbd72322cab36" Dec 05 20:34:03 crc kubenswrapper[4904]: E1205 20:34:03.108090 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"550efeb669df02f7db7cd10983dc0bcf775f15ed4def68be1b3dbd72322cab36\": container with ID starting with 550efeb669df02f7db7cd10983dc0bcf775f15ed4def68be1b3dbd72322cab36 not found: ID does not exist" containerID="550efeb669df02f7db7cd10983dc0bcf775f15ed4def68be1b3dbd72322cab36" Dec 05 20:34:03 crc kubenswrapper[4904]: I1205 20:34:03.108116 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"550efeb669df02f7db7cd10983dc0bcf775f15ed4def68be1b3dbd72322cab36"} err="failed to get container status \"550efeb669df02f7db7cd10983dc0bcf775f15ed4def68be1b3dbd72322cab36\": rpc error: code = NotFound desc = could not find container \"550efeb669df02f7db7cd10983dc0bcf775f15ed4def68be1b3dbd72322cab36\": container with ID starting with 550efeb669df02f7db7cd10983dc0bcf775f15ed4def68be1b3dbd72322cab36 not found: ID does not exist" Dec 05 20:34:03 crc kubenswrapper[4904]: I1205 20:34:03.110149 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-597c5664fd-tbwsr"] Dec 05 20:34:03 crc kubenswrapper[4904]: I1205 20:34:03.118429 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-597c5664fd-tbwsr"] Dec 05 20:34:03 crc kubenswrapper[4904]: I1205 20:34:03.694037 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fa79fda-f8c4-4866-8a82-5e2bc4cdb317" path="/var/lib/kubelet/pods/7fa79fda-f8c4-4866-8a82-5e2bc4cdb317/volumes" Dec 05 20:34:03 crc kubenswrapper[4904]: I1205 20:34:03.695082 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fab0d2a9-0213-464a-b0a9-3e7338d7f40d" path="/var/lib/kubelet/pods/fab0d2a9-0213-464a-b0a9-3e7338d7f40d/volumes" Dec 05 20:34:04 crc kubenswrapper[4904]: I1205 20:34:04.083691 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"1fae0189-1658-4a53-b223-22d70555b03d","Type":"ContainerStarted","Data":"a359a51682bb749e3d6325a7c7fcc8f67d2298823c165d3a1925ae9b0ab4c6f3"} Dec 05 20:34:04 crc kubenswrapper[4904]: I1205 20:34:04.083750 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1fae0189-1658-4a53-b223-22d70555b03d","Type":"ContainerStarted","Data":"f897710093cc220d9271b40416c5886d5c7ac2b62ceb1bc144595f0eea4c5a94"} Dec 05 20:34:04 crc kubenswrapper[4904]: I1205 20:34:04.130650 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.130626111 podStartE2EDuration="4.130626111s" podCreationTimestamp="2025-12-05 20:34:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:34:04.121934473 +0000 UTC m=+1342.933150592" watchObservedRunningTime="2025-12-05 20:34:04.130626111 +0000 UTC m=+1342.941842220" Dec 05 20:34:04 crc kubenswrapper[4904]: I1205 20:34:04.295344 4904 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="3420fce6-9733-45e2-8167-a91c39e372af" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.177:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 20:34:04 crc kubenswrapper[4904]: I1205 20:34:04.295985 4904 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="3420fce6-9733-45e2-8167-a91c39e372af" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.177:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 20:34:05 crc kubenswrapper[4904]: I1205 20:34:05.099306 4904 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 20:34:05 crc kubenswrapper[4904]: I1205 20:34:05.563451 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Dec 05 20:34:06 crc kubenswrapper[4904]: I1205 20:34:06.232556 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Dec 05 20:34:06 crc kubenswrapper[4904]: I1205 20:34:06.801364 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6867ddbddb-4lg6w" Dec 05 20:34:06 crc kubenswrapper[4904]: I1205 20:34:06.866005 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/499a09a1-c3aa-4fa2-95a8-0d2896a1d978-horizon-secret-key\") pod \"499a09a1-c3aa-4fa2-95a8-0d2896a1d978\" (UID: \"499a09a1-c3aa-4fa2-95a8-0d2896a1d978\") " Dec 05 20:34:06 crc kubenswrapper[4904]: I1205 20:34:06.866090 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/499a09a1-c3aa-4fa2-95a8-0d2896a1d978-combined-ca-bundle\") pod \"499a09a1-c3aa-4fa2-95a8-0d2896a1d978\" (UID: \"499a09a1-c3aa-4fa2-95a8-0d2896a1d978\") " Dec 05 20:34:06 crc kubenswrapper[4904]: I1205 20:34:06.866183 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqjr7\" (UniqueName: \"kubernetes.io/projected/499a09a1-c3aa-4fa2-95a8-0d2896a1d978-kube-api-access-zqjr7\") pod \"499a09a1-c3aa-4fa2-95a8-0d2896a1d978\" (UID: \"499a09a1-c3aa-4fa2-95a8-0d2896a1d978\") " Dec 05 20:34:06 crc kubenswrapper[4904]: I1205 20:34:06.866232 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/499a09a1-c3aa-4fa2-95a8-0d2896a1d978-scripts\") pod \"499a09a1-c3aa-4fa2-95a8-0d2896a1d978\" (UID: \"499a09a1-c3aa-4fa2-95a8-0d2896a1d978\") " Dec 05 20:34:06 crc kubenswrapper[4904]: I1205 20:34:06.866254 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/499a09a1-c3aa-4fa2-95a8-0d2896a1d978-logs\") pod \"499a09a1-c3aa-4fa2-95a8-0d2896a1d978\" (UID: \"499a09a1-c3aa-4fa2-95a8-0d2896a1d978\") " Dec 05 20:34:06 crc kubenswrapper[4904]: I1205 20:34:06.866380 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/499a09a1-c3aa-4fa2-95a8-0d2896a1d978-config-data\") pod \"499a09a1-c3aa-4fa2-95a8-0d2896a1d978\" (UID: \"499a09a1-c3aa-4fa2-95a8-0d2896a1d978\") " Dec 05 20:34:06 crc kubenswrapper[4904]: I1205 20:34:06.866410 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/499a09a1-c3aa-4fa2-95a8-0d2896a1d978-horizon-tls-certs\") pod \"499a09a1-c3aa-4fa2-95a8-0d2896a1d978\" (UID: \"499a09a1-c3aa-4fa2-95a8-0d2896a1d978\") " Dec 05 20:34:06 crc kubenswrapper[4904]: I1205 20:34:06.871943 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/499a09a1-c3aa-4fa2-95a8-0d2896a1d978-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "499a09a1-c3aa-4fa2-95a8-0d2896a1d978" (UID: "499a09a1-c3aa-4fa2-95a8-0d2896a1d978"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:06 crc kubenswrapper[4904]: I1205 20:34:06.873319 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/499a09a1-c3aa-4fa2-95a8-0d2896a1d978-logs" (OuterVolumeSpecName: "logs") pod "499a09a1-c3aa-4fa2-95a8-0d2896a1d978" (UID: "499a09a1-c3aa-4fa2-95a8-0d2896a1d978"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:34:06 crc kubenswrapper[4904]: I1205 20:34:06.895766 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/499a09a1-c3aa-4fa2-95a8-0d2896a1d978-kube-api-access-zqjr7" (OuterVolumeSpecName: "kube-api-access-zqjr7") pod "499a09a1-c3aa-4fa2-95a8-0d2896a1d978" (UID: "499a09a1-c3aa-4fa2-95a8-0d2896a1d978"). InnerVolumeSpecName "kube-api-access-zqjr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:34:06 crc kubenswrapper[4904]: I1205 20:34:06.924313 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/499a09a1-c3aa-4fa2-95a8-0d2896a1d978-scripts" (OuterVolumeSpecName: "scripts") pod "499a09a1-c3aa-4fa2-95a8-0d2896a1d978" (UID: "499a09a1-c3aa-4fa2-95a8-0d2896a1d978"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:34:06 crc kubenswrapper[4904]: I1205 20:34:06.925427 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/499a09a1-c3aa-4fa2-95a8-0d2896a1d978-config-data" (OuterVolumeSpecName: "config-data") pod "499a09a1-c3aa-4fa2-95a8-0d2896a1d978" (UID: "499a09a1-c3aa-4fa2-95a8-0d2896a1d978"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:34:06 crc kubenswrapper[4904]: I1205 20:34:06.933559 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/499a09a1-c3aa-4fa2-95a8-0d2896a1d978-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "499a09a1-c3aa-4fa2-95a8-0d2896a1d978" (UID: "499a09a1-c3aa-4fa2-95a8-0d2896a1d978"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:06 crc kubenswrapper[4904]: I1205 20:34:06.956475 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/499a09a1-c3aa-4fa2-95a8-0d2896a1d978-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "499a09a1-c3aa-4fa2-95a8-0d2896a1d978" (UID: "499a09a1-c3aa-4fa2-95a8-0d2896a1d978"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:06 crc kubenswrapper[4904]: I1205 20:34:06.970543 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/499a09a1-c3aa-4fa2-95a8-0d2896a1d978-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:06 crc kubenswrapper[4904]: I1205 20:34:06.970576 4904 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/499a09a1-c3aa-4fa2-95a8-0d2896a1d978-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:06 crc kubenswrapper[4904]: I1205 20:34:06.970584 4904 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/499a09a1-c3aa-4fa2-95a8-0d2896a1d978-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:06 crc kubenswrapper[4904]: I1205 20:34:06.970594 4904 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/499a09a1-c3aa-4fa2-95a8-0d2896a1d978-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:06 crc kubenswrapper[4904]: I1205 20:34:06.970604 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqjr7\" (UniqueName: \"kubernetes.io/projected/499a09a1-c3aa-4fa2-95a8-0d2896a1d978-kube-api-access-zqjr7\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:06 crc kubenswrapper[4904]: I1205 20:34:06.970612 4904 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/499a09a1-c3aa-4fa2-95a8-0d2896a1d978-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:06 crc kubenswrapper[4904]: I1205 20:34:06.970620 4904 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/499a09a1-c3aa-4fa2-95a8-0d2896a1d978-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:07 crc kubenswrapper[4904]: I1205 20:34:07.125608 4904 generic.go:334] "Generic (PLEG): container finished" podID="499a09a1-c3aa-4fa2-95a8-0d2896a1d978" containerID="0916c660d49c3a8297e43c89f40cbc411d8c61a91596fba1bdc6342cd2bb8323" exitCode=137 Dec 05 20:34:07 crc kubenswrapper[4904]: I1205 20:34:07.126853 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6867ddbddb-4lg6w" Dec 05 20:34:07 crc kubenswrapper[4904]: I1205 20:34:07.128159 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6867ddbddb-4lg6w" event={"ID":"499a09a1-c3aa-4fa2-95a8-0d2896a1d978","Type":"ContainerDied","Data":"0916c660d49c3a8297e43c89f40cbc411d8c61a91596fba1bdc6342cd2bb8323"} Dec 05 20:34:07 crc kubenswrapper[4904]: I1205 20:34:07.128191 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6867ddbddb-4lg6w" event={"ID":"499a09a1-c3aa-4fa2-95a8-0d2896a1d978","Type":"ContainerDied","Data":"fda7bee0b7586a880bbf4c0b492e3a65af9c7f0568ea729b26f3fa2c9af5d5f4"} Dec 05 20:34:07 crc kubenswrapper[4904]: I1205 20:34:07.128216 4904 scope.go:117] "RemoveContainer" containerID="a9ac4d704d56fa9e65fdcd098de894d40f8d4c621808f3eac80c3507dbabdb36" Dec 05 20:34:07 crc kubenswrapper[4904]: I1205 20:34:07.171972 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6867ddbddb-4lg6w"] Dec 05 20:34:07 crc kubenswrapper[4904]: I1205 20:34:07.181066 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6867ddbddb-4lg6w"] Dec 05 20:34:07 crc kubenswrapper[4904]: I1205 20:34:07.347923 4904 scope.go:117] "RemoveContainer" containerID="0916c660d49c3a8297e43c89f40cbc411d8c61a91596fba1bdc6342cd2bb8323" Dec 05 20:34:07 crc kubenswrapper[4904]: I1205 20:34:07.694481 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="499a09a1-c3aa-4fa2-95a8-0d2896a1d978" path="/var/lib/kubelet/pods/499a09a1-c3aa-4fa2-95a8-0d2896a1d978/volumes" Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.018141 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.018452 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e22555a8-bc28-4b25-bd10-4fc9525276c3" containerName="ceilometer-central-agent" containerID="cri-o://2cc78c62fb8e1dcc2158b1556b61f95acf4de6f3de4b5a4d4c22cffeaba7f810" gracePeriod=30 Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.018518 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e22555a8-bc28-4b25-bd10-4fc9525276c3" containerName="sg-core" containerID="cri-o://3e2fe9a254041522da6aeaf973be20e4cb9764e395d06333b40697f5200a4dc4" gracePeriod=30 Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.018596 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e22555a8-bc28-4b25-bd10-4fc9525276c3" containerName="proxy-httpd" containerID="cri-o://472ef7e583de871c1a0f30ae6093887d3a92927a3167df16e72b3fb9c3cd5997" gracePeriod=30 Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.018580 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e22555a8-bc28-4b25-bd10-4fc9525276c3" containerName="ceilometer-notification-agent" containerID="cri-o://d7b855b72762834437d85a3dc80bc80e204e313e62a09e8a31139301653917ca" gracePeriod=30 Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.032579 4904 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="e22555a8-bc28-4b25-bd10-4fc9525276c3" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.180:3000/\": EOF" Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.459520 4904 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.459571 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.503127 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.503479 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.671278 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-76bc85d8c5-xr857"] Dec 05 20:34:09 crc kubenswrapper[4904]: E1205 20:34:09.671775 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fab0d2a9-0213-464a-b0a9-3e7338d7f40d" containerName="init" Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.671799 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="fab0d2a9-0213-464a-b0a9-3e7338d7f40d" containerName="init" Dec 05 20:34:09 crc kubenswrapper[4904]: E1205 20:34:09.671827 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fab0d2a9-0213-464a-b0a9-3e7338d7f40d" containerName="dnsmasq-dns" Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.671841 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="fab0d2a9-0213-464a-b0a9-3e7338d7f40d" containerName="dnsmasq-dns" Dec 05 20:34:09 crc kubenswrapper[4904]: E1205 20:34:09.671871 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fa79fda-f8c4-4866-8a82-5e2bc4cdb317" containerName="barbican-api" Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.671904 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fa79fda-f8c4-4866-8a82-5e2bc4cdb317" containerName="barbican-api" Dec 05 20:34:09 crc kubenswrapper[4904]: E1205 20:34:09.671940 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="499a09a1-c3aa-4fa2-95a8-0d2896a1d978" containerName="horizon-log" Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.671954 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="499a09a1-c3aa-4fa2-95a8-0d2896a1d978" containerName="horizon-log" Dec 05 20:34:09 crc kubenswrapper[4904]: E1205 20:34:09.671971 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fa79fda-f8c4-4866-8a82-5e2bc4cdb317" containerName="barbican-api-log" Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.671984 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fa79fda-f8c4-4866-8a82-5e2bc4cdb317" containerName="barbican-api-log" Dec 05 20:34:09 crc kubenswrapper[4904]: E1205 20:34:09.672007 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="499a09a1-c3aa-4fa2-95a8-0d2896a1d978" containerName="horizon" Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.672017 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="499a09a1-c3aa-4fa2-95a8-0d2896a1d978" containerName="horizon" Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.672308 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fa79fda-f8c4-4866-8a82-5e2bc4cdb317" containerName="barbican-api" Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.672336 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fa79fda-f8c4-4866-8a82-5e2bc4cdb317" containerName="barbican-api-log" Dec 05 20:34:09 crc 
kubenswrapper[4904]: I1205 20:34:09.672352 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="fab0d2a9-0213-464a-b0a9-3e7338d7f40d" containerName="dnsmasq-dns" Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.672376 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="499a09a1-c3aa-4fa2-95a8-0d2896a1d978" containerName="horizon" Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.672393 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="499a09a1-c3aa-4fa2-95a8-0d2896a1d978" containerName="horizon-log" Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.673773 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-76bc85d8c5-xr857" Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.689313 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.689482 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.689481 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.712821 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-76bc85d8c5-xr857"] Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.723130 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/26484d4c-3765-4214-81e6-af49ebfde502-public-tls-certs\") pod \"swift-proxy-76bc85d8c5-xr857\" (UID: \"26484d4c-3765-4214-81e6-af49ebfde502\") " pod="openstack/swift-proxy-76bc85d8c5-xr857" Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.723176 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xq77r\" (UniqueName: \"kubernetes.io/projected/26484d4c-3765-4214-81e6-af49ebfde502-kube-api-access-xq77r\") pod \"swift-proxy-76bc85d8c5-xr857\" (UID: \"26484d4c-3765-4214-81e6-af49ebfde502\") " pod="openstack/swift-proxy-76bc85d8c5-xr857" Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.723298 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/26484d4c-3765-4214-81e6-af49ebfde502-etc-swift\") pod \"swift-proxy-76bc85d8c5-xr857\" (UID: \"26484d4c-3765-4214-81e6-af49ebfde502\") " pod="openstack/swift-proxy-76bc85d8c5-xr857" Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.723352 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26484d4c-3765-4214-81e6-af49ebfde502-combined-ca-bundle\") pod \"swift-proxy-76bc85d8c5-xr857\" (UID: \"26484d4c-3765-4214-81e6-af49ebfde502\") " pod="openstack/swift-proxy-76bc85d8c5-xr857" Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.723383 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26484d4c-3765-4214-81e6-af49ebfde502-run-httpd\") pod \"swift-proxy-76bc85d8c5-xr857\" (UID: \"26484d4c-3765-4214-81e6-af49ebfde502\") " pod="openstack/swift-proxy-76bc85d8c5-xr857" Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.723404 4904 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/26484d4c-3765-4214-81e6-af49ebfde502-internal-tls-certs\") pod \"swift-proxy-76bc85d8c5-xr857\" (UID: \"26484d4c-3765-4214-81e6-af49ebfde502\") " pod="openstack/swift-proxy-76bc85d8c5-xr857" Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.723438 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26484d4c-3765-4214-81e6-af49ebfde502-config-data\") pod \"swift-proxy-76bc85d8c5-xr857\" (UID: \"26484d4c-3765-4214-81e6-af49ebfde502\") " pod="openstack/swift-proxy-76bc85d8c5-xr857" Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.723487 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26484d4c-3765-4214-81e6-af49ebfde502-log-httpd\") pod \"swift-proxy-76bc85d8c5-xr857\" (UID: \"26484d4c-3765-4214-81e6-af49ebfde502\") " pod="openstack/swift-proxy-76bc85d8c5-xr857" Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.828074 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26484d4c-3765-4214-81e6-af49ebfde502-config-data\") pod \"swift-proxy-76bc85d8c5-xr857\" (UID: \"26484d4c-3765-4214-81e6-af49ebfde502\") " pod="openstack/swift-proxy-76bc85d8c5-xr857" Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.828175 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26484d4c-3765-4214-81e6-af49ebfde502-log-httpd\") pod \"swift-proxy-76bc85d8c5-xr857\" (UID: \"26484d4c-3765-4214-81e6-af49ebfde502\") " pod="openstack/swift-proxy-76bc85d8c5-xr857" Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.828218 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/26484d4c-3765-4214-81e6-af49ebfde502-public-tls-certs\") pod \"swift-proxy-76bc85d8c5-xr857\" (UID: \"26484d4c-3765-4214-81e6-af49ebfde502\") " pod="openstack/swift-proxy-76bc85d8c5-xr857" Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.828239 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xq77r\" (UniqueName: \"kubernetes.io/projected/26484d4c-3765-4214-81e6-af49ebfde502-kube-api-access-xq77r\") pod \"swift-proxy-76bc85d8c5-xr857\" (UID: \"26484d4c-3765-4214-81e6-af49ebfde502\") " pod="openstack/swift-proxy-76bc85d8c5-xr857" Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.828386 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/26484d4c-3765-4214-81e6-af49ebfde502-etc-swift\") pod \"swift-proxy-76bc85d8c5-xr857\" (UID: \"26484d4c-3765-4214-81e6-af49ebfde502\") " pod="openstack/swift-proxy-76bc85d8c5-xr857" Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.828419 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26484d4c-3765-4214-81e6-af49ebfde502-combined-ca-bundle\") pod \"swift-proxy-76bc85d8c5-xr857\" (UID: \"26484d4c-3765-4214-81e6-af49ebfde502\") " pod="openstack/swift-proxy-76bc85d8c5-xr857" Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.828439 4904 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26484d4c-3765-4214-81e6-af49ebfde502-run-httpd\") pod \"swift-proxy-76bc85d8c5-xr857\" (UID: \"26484d4c-3765-4214-81e6-af49ebfde502\") " pod="openstack/swift-proxy-76bc85d8c5-xr857" Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.828456 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/26484d4c-3765-4214-81e6-af49ebfde502-internal-tls-certs\") pod \"swift-proxy-76bc85d8c5-xr857\" (UID: \"26484d4c-3765-4214-81e6-af49ebfde502\") " pod="openstack/swift-proxy-76bc85d8c5-xr857" Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.829515 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26484d4c-3765-4214-81e6-af49ebfde502-log-httpd\") pod \"swift-proxy-76bc85d8c5-xr857\" (UID: \"26484d4c-3765-4214-81e6-af49ebfde502\") " pod="openstack/swift-proxy-76bc85d8c5-xr857" Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.833327 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26484d4c-3765-4214-81e6-af49ebfde502-run-httpd\") pod \"swift-proxy-76bc85d8c5-xr857\" (UID: \"26484d4c-3765-4214-81e6-af49ebfde502\") " pod="openstack/swift-proxy-76bc85d8c5-xr857" Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.836540 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/26484d4c-3765-4214-81e6-af49ebfde502-internal-tls-certs\") pod \"swift-proxy-76bc85d8c5-xr857\" (UID: \"26484d4c-3765-4214-81e6-af49ebfde502\") " pod="openstack/swift-proxy-76bc85d8c5-xr857" Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.837270 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/26484d4c-3765-4214-81e6-af49ebfde502-etc-swift\") pod \"swift-proxy-76bc85d8c5-xr857\" (UID: \"26484d4c-3765-4214-81e6-af49ebfde502\") " pod="openstack/swift-proxy-76bc85d8c5-xr857" Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.838806 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26484d4c-3765-4214-81e6-af49ebfde502-config-data\") pod \"swift-proxy-76bc85d8c5-xr857\" (UID: \"26484d4c-3765-4214-81e6-af49ebfde502\") " pod="openstack/swift-proxy-76bc85d8c5-xr857" Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.839887 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26484d4c-3765-4214-81e6-af49ebfde502-combined-ca-bundle\") pod \"swift-proxy-76bc85d8c5-xr857\" (UID: \"26484d4c-3765-4214-81e6-af49ebfde502\") " pod="openstack/swift-proxy-76bc85d8c5-xr857" Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.843159 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/26484d4c-3765-4214-81e6-af49ebfde502-public-tls-certs\") pod \"swift-proxy-76bc85d8c5-xr857\" (UID: \"26484d4c-3765-4214-81e6-af49ebfde502\") " pod="openstack/swift-proxy-76bc85d8c5-xr857" Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.849514 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xq77r\" (UniqueName: 
\"kubernetes.io/projected/26484d4c-3765-4214-81e6-af49ebfde502-kube-api-access-xq77r\") pod \"swift-proxy-76bc85d8c5-xr857\" (UID: \"26484d4c-3765-4214-81e6-af49ebfde502\") " pod="openstack/swift-proxy-76bc85d8c5-xr857" Dec 05 20:34:09 crc kubenswrapper[4904]: I1205 20:34:09.944980 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 20:34:10 crc kubenswrapper[4904]: I1205 20:34:10.042723 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-76bc85d8c5-xr857" Dec 05 20:34:10 crc kubenswrapper[4904]: I1205 20:34:10.178499 4904 generic.go:334] "Generic (PLEG): container finished" podID="e22555a8-bc28-4b25-bd10-4fc9525276c3" containerID="472ef7e583de871c1a0f30ae6093887d3a92927a3167df16e72b3fb9c3cd5997" exitCode=0 Dec 05 20:34:10 crc kubenswrapper[4904]: I1205 20:34:10.178536 4904 generic.go:334] "Generic (PLEG): container finished" podID="e22555a8-bc28-4b25-bd10-4fc9525276c3" containerID="3e2fe9a254041522da6aeaf973be20e4cb9764e395d06333b40697f5200a4dc4" exitCode=2 Dec 05 20:34:10 crc kubenswrapper[4904]: I1205 20:34:10.178546 4904 generic.go:334] "Generic (PLEG): container finished" podID="e22555a8-bc28-4b25-bd10-4fc9525276c3" containerID="d7b855b72762834437d85a3dc80bc80e204e313e62a09e8a31139301653917ca" exitCode=0 Dec 05 20:34:10 crc kubenswrapper[4904]: I1205 20:34:10.178557 4904 generic.go:334] "Generic (PLEG): container finished" podID="e22555a8-bc28-4b25-bd10-4fc9525276c3" containerID="2cc78c62fb8e1dcc2158b1556b61f95acf4de6f3de4b5a4d4c22cffeaba7f810" exitCode=0 Dec 05 20:34:10 crc kubenswrapper[4904]: I1205 20:34:10.178576 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e22555a8-bc28-4b25-bd10-4fc9525276c3","Type":"ContainerDied","Data":"472ef7e583de871c1a0f30ae6093887d3a92927a3167df16e72b3fb9c3cd5997"} Dec 05 20:34:10 crc kubenswrapper[4904]: I1205 20:34:10.178635 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e22555a8-bc28-4b25-bd10-4fc9525276c3","Type":"ContainerDied","Data":"3e2fe9a254041522da6aeaf973be20e4cb9764e395d06333b40697f5200a4dc4"} Dec 05 20:34:10 crc kubenswrapper[4904]: I1205 20:34:10.178654 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e22555a8-bc28-4b25-bd10-4fc9525276c3","Type":"ContainerDied","Data":"d7b855b72762834437d85a3dc80bc80e204e313e62a09e8a31139301653917ca"} Dec 05 20:34:10 crc kubenswrapper[4904]: I1205 20:34:10.178668 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e22555a8-bc28-4b25-bd10-4fc9525276c3","Type":"ContainerDied","Data":"2cc78c62fb8e1dcc2158b1556b61f95acf4de6f3de4b5a4d4c22cffeaba7f810"} Dec 05 20:34:10 crc kubenswrapper[4904]: I1205 20:34:10.179353 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 05 20:34:10 crc kubenswrapper[4904]: I1205 20:34:10.179379 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 05 20:34:11 crc kubenswrapper[4904]: I1205 20:34:11.194015 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ef0ab8b0-7102-4c25-9f0f-3796ff32d88f" containerName="glance-log" containerID="cri-o://7110bc0044761f07c1e167671039cc00b558f91170b014d1d3bf9cff6053ce44" gracePeriod=30 Dec 05 20:34:11 crc kubenswrapper[4904]: I1205 20:34:11.194073 
4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ef0ab8b0-7102-4c25-9f0f-3796ff32d88f" containerName="glance-httpd" containerID="cri-o://30bfd9e811f59d4ab51bae052f4be94586035087eb09c057dd12686c5517ef4b" gracePeriod=30 Dec 05 20:34:11 crc kubenswrapper[4904]: I1205 20:34:11.199967 4904 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="ef0ab8b0-7102-4c25-9f0f-3796ff32d88f" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.184:9292/healthcheck\": EOF" Dec 05 20:34:11 crc kubenswrapper[4904]: I1205 20:34:11.213545 4904 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="ef0ab8b0-7102-4c25-9f0f-3796ff32d88f" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.184:9292/healthcheck\": EOF" Dec 05 20:34:11 crc kubenswrapper[4904]: I1205 20:34:11.233536 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Dec 05 20:34:11 crc kubenswrapper[4904]: I1205 20:34:11.244086 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Dec 05 20:34:11 crc kubenswrapper[4904]: I1205 20:34:11.467573 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 05 20:34:11 crc kubenswrapper[4904]: I1205 20:34:11.467623 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 05 20:34:11 crc kubenswrapper[4904]: I1205 20:34:11.468787 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 20:34:11 crc kubenswrapper[4904]: I1205 20:34:11.516293 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 05 20:34:11 crc kubenswrapper[4904]: I1205 20:34:11.516369 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 05 20:34:11 crc kubenswrapper[4904]: I1205 20:34:11.808812 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Dec 05 20:34:11 crc kubenswrapper[4904]: I1205 20:34:11.809594 4904 scope.go:117] "RemoveContainer" containerID="a90f5442d2ed408c7bb656016094af84b53e976e6775ede9ff9cca02283e3e04" Dec 05 20:34:11 crc kubenswrapper[4904]: E1205 20:34:11.809816 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(28e509d6-d15b-44e6-9afa-05a347c2a7a5)\"" pod="openstack/watcher-decision-engine-0" podUID="28e509d6-d15b-44e6-9afa-05a347c2a7a5" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.215125 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-z45b2"] Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.217381 4904 util.go:30] "No sandbox for pod can be found. 
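The repeated "back-off 20s restarting failed container" errors come from the kubelet's exponential restart backoff for crash-looping containers: the delay doubles after each failed restart up to a cap. The 10s initial delay and 5m cap used below are the commonly documented kubelet defaults, assumed here rather than read from this log, but they are consistent with a 20s back-off after the second failure:

```go
package main

import (
	"fmt"
	"time"
)

// restartDelay sketches the CrashLoopBackOff schedule: an initial delay that
// doubles on each failed restart, capped at a maximum.
func restartDelay(failures int) time.Duration {
	const (
		initialDelay = 10 * time.Second // assumed kubelet default
		maxDelay     = 5 * time.Minute  // assumed kubelet default
	)
	d := initialDelay
	for i := 1; i < failures; i++ {
		d *= 2
		if d >= maxDelay {
			return maxDelay
		}
	}
	return d
}

func main() {
	for n := 1; n <= 7; n++ {
		// failure 2 prints "back-off 20s", matching the log line above.
		fmt.Printf("failure %d: back-off %v\n", n, restartDelay(n))
	}
}
```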
Need to start a new one" pod="openstack/nova-api-db-create-z45b2" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.233336 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-z45b2"] Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.247364 4904 generic.go:334] "Generic (PLEG): container finished" podID="ef0ab8b0-7102-4c25-9f0f-3796ff32d88f" containerID="7110bc0044761f07c1e167671039cc00b558f91170b014d1d3bf9cff6053ce44" exitCode=143 Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.247566 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ef0ab8b0-7102-4c25-9f0f-3796ff32d88f","Type":"ContainerDied","Data":"7110bc0044761f07c1e167671039cc00b558f91170b014d1d3bf9cff6053ce44"} Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.248988 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.252671 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="1fae0189-1658-4a53-b223-22d70555b03d" containerName="glance-log" containerID="cri-o://f897710093cc220d9271b40416c5886d5c7ac2b62ceb1bc144595f0eea4c5a94" gracePeriod=30 Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.252857 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="1fae0189-1658-4a53-b223-22d70555b03d" containerName="glance-httpd" containerID="cri-o://a359a51682bb749e3d6325a7c7fcc8f67d2298823c165d3a1925ae9b0ab4c6f3" gracePeriod=30 Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.287654 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.287985 4904 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="1fae0189-1658-4a53-b223-22d70555b03d" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.185:9292/healthcheck\": read tcp 10.217.0.2:45794->10.217.0.185:9292: read: connection reset by peer" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.311547 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-mq95x"] Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.311817 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/605b48e0-13b6-47bb-9486-7c29c947e915-operator-scripts\") pod \"nova-api-db-create-z45b2\" (UID: \"605b48e0-13b6-47bb-9486-7c29c947e915\") " pod="openstack/nova-api-db-create-z45b2" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.312120 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kxk5\" (UniqueName: \"kubernetes.io/projected/605b48e0-13b6-47bb-9486-7c29c947e915-kube-api-access-4kxk5\") pod \"nova-api-db-create-z45b2\" (UID: \"605b48e0-13b6-47bb-9486-7c29c947e915\") " pod="openstack/nova-api-db-create-z45b2" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.324145 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-mq95x" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.325480 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-8dd6-account-create-update-t2wb6"] Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.329978 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.330107 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-8dd6-account-create-update-t2wb6" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.340724 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.344559 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-mq95x"] Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.354135 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-8dd6-account-create-update-t2wb6"] Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.413604 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kxk5\" (UniqueName: \"kubernetes.io/projected/605b48e0-13b6-47bb-9486-7c29c947e915-kube-api-access-4kxk5\") pod \"nova-api-db-create-z45b2\" (UID: \"605b48e0-13b6-47bb-9486-7c29c947e915\") " pod="openstack/nova-api-db-create-z45b2" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.413877 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/605b48e0-13b6-47bb-9486-7c29c947e915-operator-scripts\") pod \"nova-api-db-create-z45b2\" (UID: \"605b48e0-13b6-47bb-9486-7c29c947e915\") " pod="openstack/nova-api-db-create-z45b2" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.414075 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf88911a-3e40-4dbe-9cab-11ac4077b33b-operator-scripts\") pod \"nova-cell0-db-create-mq95x\" (UID: \"bf88911a-3e40-4dbe-9cab-11ac4077b33b\") " pod="openstack/nova-cell0-db-create-mq95x" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.414188 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw2sh\" (UniqueName: \"kubernetes.io/projected/bf88911a-3e40-4dbe-9cab-11ac4077b33b-kube-api-access-kw2sh\") pod \"nova-cell0-db-create-mq95x\" (UID: \"bf88911a-3e40-4dbe-9cab-11ac4077b33b\") " pod="openstack/nova-cell0-db-create-mq95x" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.414278 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a21e14cd-7505-4d92-8de4-a983351c6d9a-operator-scripts\") pod \"nova-api-8dd6-account-create-update-t2wb6\" (UID: \"a21e14cd-7505-4d92-8de4-a983351c6d9a\") " pod="openstack/nova-api-8dd6-account-create-update-t2wb6" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.414345 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdgbb\" (UniqueName: \"kubernetes.io/projected/a21e14cd-7505-4d92-8de4-a983351c6d9a-kube-api-access-fdgbb\") pod \"nova-api-8dd6-account-create-update-t2wb6\" (UID: \"a21e14cd-7505-4d92-8de4-a983351c6d9a\") " 
pod="openstack/nova-api-8dd6-account-create-update-t2wb6" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.414843 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/605b48e0-13b6-47bb-9486-7c29c947e915-operator-scripts\") pod \"nova-api-db-create-z45b2\" (UID: \"605b48e0-13b6-47bb-9486-7c29c947e915\") " pod="openstack/nova-api-db-create-z45b2" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.417837 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-m5z4d"] Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.419342 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-m5z4d" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.454300 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kxk5\" (UniqueName: \"kubernetes.io/projected/605b48e0-13b6-47bb-9486-7c29c947e915-kube-api-access-4kxk5\") pod \"nova-api-db-create-z45b2\" (UID: \"605b48e0-13b6-47bb-9486-7c29c947e915\") " pod="openstack/nova-api-db-create-z45b2" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.459194 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-m5z4d"] Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.521314 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf88911a-3e40-4dbe-9cab-11ac4077b33b-operator-scripts\") pod \"nova-cell0-db-create-mq95x\" (UID: \"bf88911a-3e40-4dbe-9cab-11ac4077b33b\") " pod="openstack/nova-cell0-db-create-mq95x" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.521755 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kw2sh\" (UniqueName: \"kubernetes.io/projected/bf88911a-3e40-4dbe-9cab-11ac4077b33b-kube-api-access-kw2sh\") pod \"nova-cell0-db-create-mq95x\" (UID: \"bf88911a-3e40-4dbe-9cab-11ac4077b33b\") " pod="openstack/nova-cell0-db-create-mq95x" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.521819 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a21e14cd-7505-4d92-8de4-a983351c6d9a-operator-scripts\") pod \"nova-api-8dd6-account-create-update-t2wb6\" (UID: \"a21e14cd-7505-4d92-8de4-a983351c6d9a\") " pod="openstack/nova-api-8dd6-account-create-update-t2wb6" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.521847 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdgbb\" (UniqueName: \"kubernetes.io/projected/a21e14cd-7505-4d92-8de4-a983351c6d9a-kube-api-access-fdgbb\") pod \"nova-api-8dd6-account-create-update-t2wb6\" (UID: \"a21e14cd-7505-4d92-8de4-a983351c6d9a\") " pod="openstack/nova-api-8dd6-account-create-update-t2wb6" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.522745 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a21e14cd-7505-4d92-8de4-a983351c6d9a-operator-scripts\") pod \"nova-api-8dd6-account-create-update-t2wb6\" (UID: \"a21e14cd-7505-4d92-8de4-a983351c6d9a\") " pod="openstack/nova-api-8dd6-account-create-update-t2wb6" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.523912 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/bf88911a-3e40-4dbe-9cab-11ac4077b33b-operator-scripts\") pod \"nova-cell0-db-create-mq95x\" (UID: \"bf88911a-3e40-4dbe-9cab-11ac4077b33b\") " pod="openstack/nova-cell0-db-create-mq95x" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.542405 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-z45b2" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.549713 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdgbb\" (UniqueName: \"kubernetes.io/projected/a21e14cd-7505-4d92-8de4-a983351c6d9a-kube-api-access-fdgbb\") pod \"nova-api-8dd6-account-create-update-t2wb6\" (UID: \"a21e14cd-7505-4d92-8de4-a983351c6d9a\") " pod="openstack/nova-api-8dd6-account-create-update-t2wb6" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.549917 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kw2sh\" (UniqueName: \"kubernetes.io/projected/bf88911a-3e40-4dbe-9cab-11ac4077b33b-kube-api-access-kw2sh\") pod \"nova-cell0-db-create-mq95x\" (UID: \"bf88911a-3e40-4dbe-9cab-11ac4077b33b\") " pod="openstack/nova-cell0-db-create-mq95x" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.554328 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-8ea0-account-create-update-rvd2d"] Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.555710 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8ea0-account-create-update-rvd2d" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.564098 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.578571 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-8ea0-account-create-update-rvd2d"] Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.624834 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k86f7\" (UniqueName: \"kubernetes.io/projected/be427ec6-7dd9-4285-b52c-3a797793ca88-kube-api-access-k86f7\") pod \"nova-cell0-8ea0-account-create-update-rvd2d\" (UID: \"be427ec6-7dd9-4285-b52c-3a797793ca88\") " pod="openstack/nova-cell0-8ea0-account-create-update-rvd2d" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.624951 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/097e7395-bc3a-4034-bc78-bc3d86757a70-operator-scripts\") pod \"nova-cell1-db-create-m5z4d\" (UID: \"097e7395-bc3a-4034-bc78-bc3d86757a70\") " pod="openstack/nova-cell1-db-create-m5z4d" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.624981 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz7tc\" (UniqueName: \"kubernetes.io/projected/097e7395-bc3a-4034-bc78-bc3d86757a70-kube-api-access-lz7tc\") pod \"nova-cell1-db-create-m5z4d\" (UID: \"097e7395-bc3a-4034-bc78-bc3d86757a70\") " pod="openstack/nova-cell1-db-create-m5z4d" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.625107 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be427ec6-7dd9-4285-b52c-3a797793ca88-operator-scripts\") pod \"nova-cell0-8ea0-account-create-update-rvd2d\" (UID: 
\"be427ec6-7dd9-4285-b52c-3a797793ca88\") " pod="openstack/nova-cell0-8ea0-account-create-update-rvd2d" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.633896 4904 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="1fae0189-1658-4a53-b223-22d70555b03d" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.185:9292/healthcheck\": read tcp 10.217.0.2:45808->10.217.0.185:9292: read: connection reset by peer" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.671371 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-mq95x" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.689512 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-8dd6-account-create-update-t2wb6" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.726768 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k86f7\" (UniqueName: \"kubernetes.io/projected/be427ec6-7dd9-4285-b52c-3a797793ca88-kube-api-access-k86f7\") pod \"nova-cell0-8ea0-account-create-update-rvd2d\" (UID: \"be427ec6-7dd9-4285-b52c-3a797793ca88\") " pod="openstack/nova-cell0-8ea0-account-create-update-rvd2d" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.726847 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/097e7395-bc3a-4034-bc78-bc3d86757a70-operator-scripts\") pod \"nova-cell1-db-create-m5z4d\" (UID: \"097e7395-bc3a-4034-bc78-bc3d86757a70\") " pod="openstack/nova-cell1-db-create-m5z4d" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.726872 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz7tc\" (UniqueName: \"kubernetes.io/projected/097e7395-bc3a-4034-bc78-bc3d86757a70-kube-api-access-lz7tc\") pod \"nova-cell1-db-create-m5z4d\" (UID: \"097e7395-bc3a-4034-bc78-bc3d86757a70\") " pod="openstack/nova-cell1-db-create-m5z4d" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.726920 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be427ec6-7dd9-4285-b52c-3a797793ca88-operator-scripts\") pod \"nova-cell0-8ea0-account-create-update-rvd2d\" (UID: \"be427ec6-7dd9-4285-b52c-3a797793ca88\") " pod="openstack/nova-cell0-8ea0-account-create-update-rvd2d" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.727670 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be427ec6-7dd9-4285-b52c-3a797793ca88-operator-scripts\") pod \"nova-cell0-8ea0-account-create-update-rvd2d\" (UID: \"be427ec6-7dd9-4285-b52c-3a797793ca88\") " pod="openstack/nova-cell0-8ea0-account-create-update-rvd2d" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.728792 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/097e7395-bc3a-4034-bc78-bc3d86757a70-operator-scripts\") pod \"nova-cell1-db-create-m5z4d\" (UID: \"097e7395-bc3a-4034-bc78-bc3d86757a70\") " pod="openstack/nova-cell1-db-create-m5z4d" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.750630 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz7tc\" (UniqueName: 
\"kubernetes.io/projected/097e7395-bc3a-4034-bc78-bc3d86757a70-kube-api-access-lz7tc\") pod \"nova-cell1-db-create-m5z4d\" (UID: \"097e7395-bc3a-4034-bc78-bc3d86757a70\") " pod="openstack/nova-cell1-db-create-m5z4d" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.752021 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k86f7\" (UniqueName: \"kubernetes.io/projected/be427ec6-7dd9-4285-b52c-3a797793ca88-kube-api-access-k86f7\") pod \"nova-cell0-8ea0-account-create-update-rvd2d\" (UID: \"be427ec6-7dd9-4285-b52c-3a797793ca88\") " pod="openstack/nova-cell0-8ea0-account-create-update-rvd2d" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.753718 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-m5z4d" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.857680 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-fb4d-account-create-update-fzf7v"] Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.860398 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-fb4d-account-create-update-fzf7v" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.864662 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 05 20:34:12 crc kubenswrapper[4904]: I1205 20:34:12.874118 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-fb4d-account-create-update-fzf7v"] Dec 05 20:34:13 crc kubenswrapper[4904]: I1205 20:34:13.028468 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8ea0-account-create-update-rvd2d" Dec 05 20:34:13 crc kubenswrapper[4904]: I1205 20:34:13.035492 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkzzl\" (UniqueName: \"kubernetes.io/projected/ebfa100a-6ac4-441e-a2d8-78384458fd67-kube-api-access-rkzzl\") pod \"nova-cell1-fb4d-account-create-update-fzf7v\" (UID: \"ebfa100a-6ac4-441e-a2d8-78384458fd67\") " pod="openstack/nova-cell1-fb4d-account-create-update-fzf7v" Dec 05 20:34:13 crc kubenswrapper[4904]: I1205 20:34:13.035705 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebfa100a-6ac4-441e-a2d8-78384458fd67-operator-scripts\") pod \"nova-cell1-fb4d-account-create-update-fzf7v\" (UID: \"ebfa100a-6ac4-441e-a2d8-78384458fd67\") " pod="openstack/nova-cell1-fb4d-account-create-update-fzf7v" Dec 05 20:34:13 crc kubenswrapper[4904]: I1205 20:34:13.137391 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebfa100a-6ac4-441e-a2d8-78384458fd67-operator-scripts\") pod \"nova-cell1-fb4d-account-create-update-fzf7v\" (UID: \"ebfa100a-6ac4-441e-a2d8-78384458fd67\") " pod="openstack/nova-cell1-fb4d-account-create-update-fzf7v" Dec 05 20:34:13 crc kubenswrapper[4904]: I1205 20:34:13.137731 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkzzl\" (UniqueName: \"kubernetes.io/projected/ebfa100a-6ac4-441e-a2d8-78384458fd67-kube-api-access-rkzzl\") pod \"nova-cell1-fb4d-account-create-update-fzf7v\" (UID: \"ebfa100a-6ac4-441e-a2d8-78384458fd67\") " pod="openstack/nova-cell1-fb4d-account-create-update-fzf7v" Dec 05 20:34:13 crc kubenswrapper[4904]: I1205 20:34:13.138394 
4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebfa100a-6ac4-441e-a2d8-78384458fd67-operator-scripts\") pod \"nova-cell1-fb4d-account-create-update-fzf7v\" (UID: \"ebfa100a-6ac4-441e-a2d8-78384458fd67\") " pod="openstack/nova-cell1-fb4d-account-create-update-fzf7v" Dec 05 20:34:13 crc kubenswrapper[4904]: I1205 20:34:13.156423 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkzzl\" (UniqueName: \"kubernetes.io/projected/ebfa100a-6ac4-441e-a2d8-78384458fd67-kube-api-access-rkzzl\") pod \"nova-cell1-fb4d-account-create-update-fzf7v\" (UID: \"ebfa100a-6ac4-441e-a2d8-78384458fd67\") " pod="openstack/nova-cell1-fb4d-account-create-update-fzf7v" Dec 05 20:34:13 crc kubenswrapper[4904]: I1205 20:34:13.195084 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-fb4d-account-create-update-fzf7v" Dec 05 20:34:13 crc kubenswrapper[4904]: I1205 20:34:13.263015 4904 generic.go:334] "Generic (PLEG): container finished" podID="1fae0189-1658-4a53-b223-22d70555b03d" containerID="a359a51682bb749e3d6325a7c7fcc8f67d2298823c165d3a1925ae9b0ab4c6f3" exitCode=0 Dec 05 20:34:13 crc kubenswrapper[4904]: I1205 20:34:13.263047 4904 generic.go:334] "Generic (PLEG): container finished" podID="1fae0189-1658-4a53-b223-22d70555b03d" containerID="f897710093cc220d9271b40416c5886d5c7ac2b62ceb1bc144595f0eea4c5a94" exitCode=143 Dec 05 20:34:13 crc kubenswrapper[4904]: I1205 20:34:13.263145 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1fae0189-1658-4a53-b223-22d70555b03d","Type":"ContainerDied","Data":"a359a51682bb749e3d6325a7c7fcc8f67d2298823c165d3a1925ae9b0ab4c6f3"} Dec 05 20:34:13 crc kubenswrapper[4904]: I1205 20:34:13.263197 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1fae0189-1658-4a53-b223-22d70555b03d","Type":"ContainerDied","Data":"f897710093cc220d9271b40416c5886d5c7ac2b62ceb1bc144595f0eea4c5a94"} Dec 05 20:34:15 crc kubenswrapper[4904]: I1205 20:34:15.506686 4904 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="ef0ab8b0-7102-4c25-9f0f-3796ff32d88f" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.184:9292/healthcheck\": read tcp 10.217.0.2:50460->10.217.0.184:9292: read: connection reset by peer" Dec 05 20:34:15 crc kubenswrapper[4904]: I1205 20:34:15.506711 4904 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="ef0ab8b0-7102-4c25-9f0f-3796ff32d88f" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.184:9292/healthcheck\": read tcp 10.217.0.2:50474->10.217.0.184:9292: read: connection reset by peer" Dec 05 20:34:15 crc kubenswrapper[4904]: I1205 20:34:15.596912 4904 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="e22555a8-bc28-4b25-bd10-4fc9525276c3" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.180:3000/\": dial tcp 10.217.0.180:3000: connect: connection refused" Dec 05 20:34:15 crc kubenswrapper[4904]: I1205 20:34:15.623853 4904 scope.go:117] "RemoveContainer" containerID="a9ac4d704d56fa9e65fdcd098de894d40f8d4c621808f3eac80c3507dbabdb36" Dec 05 20:34:15 crc kubenswrapper[4904]: E1205 20:34:15.624479 4904 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"a9ac4d704d56fa9e65fdcd098de894d40f8d4c621808f3eac80c3507dbabdb36\": container with ID starting with a9ac4d704d56fa9e65fdcd098de894d40f8d4c621808f3eac80c3507dbabdb36 not found: ID does not exist" containerID="a9ac4d704d56fa9e65fdcd098de894d40f8d4c621808f3eac80c3507dbabdb36" Dec 05 20:34:15 crc kubenswrapper[4904]: I1205 20:34:15.624536 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9ac4d704d56fa9e65fdcd098de894d40f8d4c621808f3eac80c3507dbabdb36"} err="failed to get container status \"a9ac4d704d56fa9e65fdcd098de894d40f8d4c621808f3eac80c3507dbabdb36\": rpc error: code = NotFound desc = could not find container \"a9ac4d704d56fa9e65fdcd098de894d40f8d4c621808f3eac80c3507dbabdb36\": container with ID starting with a9ac4d704d56fa9e65fdcd098de894d40f8d4c621808f3eac80c3507dbabdb36 not found: ID does not exist" Dec 05 20:34:15 crc kubenswrapper[4904]: I1205 20:34:15.624572 4904 scope.go:117] "RemoveContainer" containerID="0916c660d49c3a8297e43c89f40cbc411d8c61a91596fba1bdc6342cd2bb8323" Dec 05 20:34:15 crc kubenswrapper[4904]: E1205 20:34:15.625223 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0916c660d49c3a8297e43c89f40cbc411d8c61a91596fba1bdc6342cd2bb8323\": container with ID starting with 0916c660d49c3a8297e43c89f40cbc411d8c61a91596fba1bdc6342cd2bb8323 not found: ID does not exist" containerID="0916c660d49c3a8297e43c89f40cbc411d8c61a91596fba1bdc6342cd2bb8323" Dec 05 20:34:15 crc kubenswrapper[4904]: I1205 20:34:15.625258 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0916c660d49c3a8297e43c89f40cbc411d8c61a91596fba1bdc6342cd2bb8323"} err="failed to get container status \"0916c660d49c3a8297e43c89f40cbc411d8c61a91596fba1bdc6342cd2bb8323\": rpc error: code = NotFound desc = could not find container \"0916c660d49c3a8297e43c89f40cbc411d8c61a91596fba1bdc6342cd2bb8323\": container with ID starting with 0916c660d49c3a8297e43c89f40cbc411d8c61a91596fba1bdc6342cd2bb8323 not found: ID does not exist" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.176219 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.236601 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e22555a8-bc28-4b25-bd10-4fc9525276c3-sg-core-conf-yaml\") pod \"e22555a8-bc28-4b25-bd10-4fc9525276c3\" (UID: \"e22555a8-bc28-4b25-bd10-4fc9525276c3\") " Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.239999 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e22555a8-bc28-4b25-bd10-4fc9525276c3-log-httpd\") pod \"e22555a8-bc28-4b25-bd10-4fc9525276c3\" (UID: \"e22555a8-bc28-4b25-bd10-4fc9525276c3\") " Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.240455 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e22555a8-bc28-4b25-bd10-4fc9525276c3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e22555a8-bc28-4b25-bd10-4fc9525276c3" (UID: "e22555a8-bc28-4b25-bd10-4fc9525276c3"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.240514 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e22555a8-bc28-4b25-bd10-4fc9525276c3-run-httpd\") pod \"e22555a8-bc28-4b25-bd10-4fc9525276c3\" (UID: \"e22555a8-bc28-4b25-bd10-4fc9525276c3\") " Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.240572 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l48hl\" (UniqueName: \"kubernetes.io/projected/e22555a8-bc28-4b25-bd10-4fc9525276c3-kube-api-access-l48hl\") pod \"e22555a8-bc28-4b25-bd10-4fc9525276c3\" (UID: \"e22555a8-bc28-4b25-bd10-4fc9525276c3\") " Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.240719 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e22555a8-bc28-4b25-bd10-4fc9525276c3-combined-ca-bundle\") pod \"e22555a8-bc28-4b25-bd10-4fc9525276c3\" (UID: \"e22555a8-bc28-4b25-bd10-4fc9525276c3\") " Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.240748 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e22555a8-bc28-4b25-bd10-4fc9525276c3-config-data\") pod \"e22555a8-bc28-4b25-bd10-4fc9525276c3\" (UID: \"e22555a8-bc28-4b25-bd10-4fc9525276c3\") " Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.243435 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e22555a8-bc28-4b25-bd10-4fc9525276c3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e22555a8-bc28-4b25-bd10-4fc9525276c3" (UID: "e22555a8-bc28-4b25-bd10-4fc9525276c3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.243728 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e22555a8-bc28-4b25-bd10-4fc9525276c3-scripts\") pod \"e22555a8-bc28-4b25-bd10-4fc9525276c3\" (UID: \"e22555a8-bc28-4b25-bd10-4fc9525276c3\") " Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.244470 4904 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e22555a8-bc28-4b25-bd10-4fc9525276c3-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.244482 4904 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e22555a8-bc28-4b25-bd10-4fc9525276c3-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.247212 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e22555a8-bc28-4b25-bd10-4fc9525276c3-kube-api-access-l48hl" (OuterVolumeSpecName: "kube-api-access-l48hl") pod "e22555a8-bc28-4b25-bd10-4fc9525276c3" (UID: "e22555a8-bc28-4b25-bd10-4fc9525276c3"). InnerVolumeSpecName "kube-api-access-l48hl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.248136 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e22555a8-bc28-4b25-bd10-4fc9525276c3-scripts" (OuterVolumeSpecName: "scripts") pod "e22555a8-bc28-4b25-bd10-4fc9525276c3" (UID: "e22555a8-bc28-4b25-bd10-4fc9525276c3"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.323494 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e22555a8-bc28-4b25-bd10-4fc9525276c3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e22555a8-bc28-4b25-bd10-4fc9525276c3" (UID: "e22555a8-bc28-4b25-bd10-4fc9525276c3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.328471 4904 generic.go:334] "Generic (PLEG): container finished" podID="ef0ab8b0-7102-4c25-9f0f-3796ff32d88f" containerID="30bfd9e811f59d4ab51bae052f4be94586035087eb09c057dd12686c5517ef4b" exitCode=0 Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.328535 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ef0ab8b0-7102-4c25-9f0f-3796ff32d88f","Type":"ContainerDied","Data":"30bfd9e811f59d4ab51bae052f4be94586035087eb09c057dd12686c5517ef4b"} Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.345316 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e22555a8-bc28-4b25-bd10-4fc9525276c3","Type":"ContainerDied","Data":"edc7efab9b36738c5e164ee9cd56f4caf4a6ef75058894228d9d958b84a26d14"} Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.345365 4904 scope.go:117] "RemoveContainer" containerID="472ef7e583de871c1a0f30ae6093887d3a92927a3167df16e72b3fb9c3cd5997" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.345470 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.348874 4904 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e22555a8-bc28-4b25-bd10-4fc9525276c3-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.348900 4904 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e22555a8-bc28-4b25-bd10-4fc9525276c3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.348911 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l48hl\" (UniqueName: \"kubernetes.io/projected/e22555a8-bc28-4b25-bd10-4fc9525276c3-kube-api-access-l48hl\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.410198 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e22555a8-bc28-4b25-bd10-4fc9525276c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e22555a8-bc28-4b25-bd10-4fc9525276c3" (UID: "e22555a8-bc28-4b25-bd10-4fc9525276c3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.450653 4904 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e22555a8-bc28-4b25-bd10-4fc9525276c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.486974 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.520965 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e22555a8-bc28-4b25-bd10-4fc9525276c3-config-data" (OuterVolumeSpecName: "config-data") pod "e22555a8-bc28-4b25-bd10-4fc9525276c3" (UID: "e22555a8-bc28-4b25-bd10-4fc9525276c3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.525022 4904 scope.go:117] "RemoveContainer" containerID="3e2fe9a254041522da6aeaf973be20e4cb9764e395d06333b40697f5200a4dc4" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.551489 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fae0189-1658-4a53-b223-22d70555b03d-scripts\") pod \"1fae0189-1658-4a53-b223-22d70555b03d\" (UID: \"1fae0189-1658-4a53-b223-22d70555b03d\") " Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.551565 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"1fae0189-1658-4a53-b223-22d70555b03d\" (UID: \"1fae0189-1658-4a53-b223-22d70555b03d\") " Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.551592 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnfcc\" (UniqueName: \"kubernetes.io/projected/1fae0189-1658-4a53-b223-22d70555b03d-kube-api-access-rnfcc\") pod \"1fae0189-1658-4a53-b223-22d70555b03d\" (UID: \"1fae0189-1658-4a53-b223-22d70555b03d\") " Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.551620 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1fae0189-1658-4a53-b223-22d70555b03d-httpd-run\") pod \"1fae0189-1658-4a53-b223-22d70555b03d\" (UID: \"1fae0189-1658-4a53-b223-22d70555b03d\") " Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.551690 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fae0189-1658-4a53-b223-22d70555b03d-config-data\") pod \"1fae0189-1658-4a53-b223-22d70555b03d\" (UID: \"1fae0189-1658-4a53-b223-22d70555b03d\") " Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.551705 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fae0189-1658-4a53-b223-22d70555b03d-logs\") pod \"1fae0189-1658-4a53-b223-22d70555b03d\" (UID: \"1fae0189-1658-4a53-b223-22d70555b03d\") " Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.551754 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fae0189-1658-4a53-b223-22d70555b03d-internal-tls-certs\") pod \"1fae0189-1658-4a53-b223-22d70555b03d\" (UID: \"1fae0189-1658-4a53-b223-22d70555b03d\") " Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.551825 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fae0189-1658-4a53-b223-22d70555b03d-combined-ca-bundle\") pod \"1fae0189-1658-4a53-b223-22d70555b03d\" (UID: \"1fae0189-1658-4a53-b223-22d70555b03d\") " Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.552176 4904 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fae0189-1658-4a53-b223-22d70555b03d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1fae0189-1658-4a53-b223-22d70555b03d" (UID: "1fae0189-1658-4a53-b223-22d70555b03d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.552279 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e22555a8-bc28-4b25-bd10-4fc9525276c3-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.552291 4904 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1fae0189-1658-4a53-b223-22d70555b03d-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.559509 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "1fae0189-1658-4a53-b223-22d70555b03d" (UID: "1fae0189-1658-4a53-b223-22d70555b03d"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.564514 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fae0189-1658-4a53-b223-22d70555b03d-logs" (OuterVolumeSpecName: "logs") pod "1fae0189-1658-4a53-b223-22d70555b03d" (UID: "1fae0189-1658-4a53-b223-22d70555b03d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.583970 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fae0189-1658-4a53-b223-22d70555b03d-kube-api-access-rnfcc" (OuterVolumeSpecName: "kube-api-access-rnfcc") pod "1fae0189-1658-4a53-b223-22d70555b03d" (UID: "1fae0189-1658-4a53-b223-22d70555b03d"). InnerVolumeSpecName "kube-api-access-rnfcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.593814 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fae0189-1658-4a53-b223-22d70555b03d-scripts" (OuterVolumeSpecName: "scripts") pod "1fae0189-1658-4a53-b223-22d70555b03d" (UID: "1fae0189-1658-4a53-b223-22d70555b03d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.605736 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fae0189-1658-4a53-b223-22d70555b03d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1fae0189-1658-4a53-b223-22d70555b03d" (UID: "1fae0189-1658-4a53-b223-22d70555b03d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.606375 4904 scope.go:117] "RemoveContainer" containerID="d7b855b72762834437d85a3dc80bc80e204e313e62a09e8a31139301653917ca" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.647743 4904 scope.go:117] "RemoveContainer" containerID="2cc78c62fb8e1dcc2158b1556b61f95acf4de6f3de4b5a4d4c22cffeaba7f810" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.655330 4904 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fae0189-1658-4a53-b223-22d70555b03d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.655361 4904 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fae0189-1658-4a53-b223-22d70555b03d-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.655388 4904 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.655399 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnfcc\" (UniqueName: \"kubernetes.io/projected/1fae0189-1658-4a53-b223-22d70555b03d-kube-api-access-rnfcc\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.655408 4904 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fae0189-1658-4a53-b223-22d70555b03d-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.660257 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fae0189-1658-4a53-b223-22d70555b03d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1fae0189-1658-4a53-b223-22d70555b03d" (UID: "1fae0189-1658-4a53-b223-22d70555b03d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.706612 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fae0189-1658-4a53-b223-22d70555b03d-config-data" (OuterVolumeSpecName: "config-data") pod "1fae0189-1658-4a53-b223-22d70555b03d" (UID: "1fae0189-1658-4a53-b223-22d70555b03d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.730412 4904 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.758219 4904 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.758244 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fae0189-1658-4a53-b223-22d70555b03d-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.758253 4904 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fae0189-1658-4a53-b223-22d70555b03d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.792601 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.813173 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.822113 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.859668 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef0ab8b0-7102-4c25-9f0f-3796ff32d88f-scripts\") pod \"ef0ab8b0-7102-4c25-9f0f-3796ff32d88f\" (UID: \"ef0ab8b0-7102-4c25-9f0f-3796ff32d88f\") " Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.859713 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef0ab8b0-7102-4c25-9f0f-3796ff32d88f-config-data\") pod \"ef0ab8b0-7102-4c25-9f0f-3796ff32d88f\" (UID: \"ef0ab8b0-7102-4c25-9f0f-3796ff32d88f\") " Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.859741 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ef0ab8b0-7102-4c25-9f0f-3796ff32d88f-httpd-run\") pod \"ef0ab8b0-7102-4c25-9f0f-3796ff32d88f\" (UID: \"ef0ab8b0-7102-4c25-9f0f-3796ff32d88f\") " Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.859800 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ef0ab8b0-7102-4c25-9f0f-3796ff32d88f\" (UID: \"ef0ab8b0-7102-4c25-9f0f-3796ff32d88f\") " Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.859825 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef0ab8b0-7102-4c25-9f0f-3796ff32d88f-logs\") pod \"ef0ab8b0-7102-4c25-9f0f-3796ff32d88f\" (UID: \"ef0ab8b0-7102-4c25-9f0f-3796ff32d88f\") " Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.859875 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef0ab8b0-7102-4c25-9f0f-3796ff32d88f-public-tls-certs\") pod \"ef0ab8b0-7102-4c25-9f0f-3796ff32d88f\" (UID: 
\"ef0ab8b0-7102-4c25-9f0f-3796ff32d88f\") " Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.859902 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctp7k\" (UniqueName: \"kubernetes.io/projected/ef0ab8b0-7102-4c25-9f0f-3796ff32d88f-kube-api-access-ctp7k\") pod \"ef0ab8b0-7102-4c25-9f0f-3796ff32d88f\" (UID: \"ef0ab8b0-7102-4c25-9f0f-3796ff32d88f\") " Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.859961 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef0ab8b0-7102-4c25-9f0f-3796ff32d88f-combined-ca-bundle\") pod \"ef0ab8b0-7102-4c25-9f0f-3796ff32d88f\" (UID: \"ef0ab8b0-7102-4c25-9f0f-3796ff32d88f\") " Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.861450 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef0ab8b0-7102-4c25-9f0f-3796ff32d88f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ef0ab8b0-7102-4c25-9f0f-3796ff32d88f" (UID: "ef0ab8b0-7102-4c25-9f0f-3796ff32d88f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.866436 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef0ab8b0-7102-4c25-9f0f-3796ff32d88f-scripts" (OuterVolumeSpecName: "scripts") pod "ef0ab8b0-7102-4c25-9f0f-3796ff32d88f" (UID: "ef0ab8b0-7102-4c25-9f0f-3796ff32d88f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.866765 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef0ab8b0-7102-4c25-9f0f-3796ff32d88f-logs" (OuterVolumeSpecName: "logs") pod "ef0ab8b0-7102-4c25-9f0f-3796ff32d88f" (UID: "ef0ab8b0-7102-4c25-9f0f-3796ff32d88f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.873464 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "ef0ab8b0-7102-4c25-9f0f-3796ff32d88f" (UID: "ef0ab8b0-7102-4c25-9f0f-3796ff32d88f"). InnerVolumeSpecName "local-storage12-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.898716 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:34:16 crc kubenswrapper[4904]: E1205 20:34:16.899156 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef0ab8b0-7102-4c25-9f0f-3796ff32d88f" containerName="glance-httpd" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.899167 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef0ab8b0-7102-4c25-9f0f-3796ff32d88f" containerName="glance-httpd" Dec 05 20:34:16 crc kubenswrapper[4904]: E1205 20:34:16.899195 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e22555a8-bc28-4b25-bd10-4fc9525276c3" containerName="proxy-httpd" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.899201 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="e22555a8-bc28-4b25-bd10-4fc9525276c3" containerName="proxy-httpd" Dec 05 20:34:16 crc kubenswrapper[4904]: E1205 20:34:16.899211 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef0ab8b0-7102-4c25-9f0f-3796ff32d88f" containerName="glance-log" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.899217 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef0ab8b0-7102-4c25-9f0f-3796ff32d88f" containerName="glance-log" Dec 05 20:34:16 crc kubenswrapper[4904]: E1205 20:34:16.899229 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e22555a8-bc28-4b25-bd10-4fc9525276c3" containerName="sg-core" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.899234 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="e22555a8-bc28-4b25-bd10-4fc9525276c3" containerName="sg-core" Dec 05 20:34:16 crc kubenswrapper[4904]: E1205 20:34:16.899242 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fae0189-1658-4a53-b223-22d70555b03d" containerName="glance-log" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.899247 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fae0189-1658-4a53-b223-22d70555b03d" containerName="glance-log" Dec 05 20:34:16 crc kubenswrapper[4904]: E1205 20:34:16.899259 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fae0189-1658-4a53-b223-22d70555b03d" containerName="glance-httpd" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.899265 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fae0189-1658-4a53-b223-22d70555b03d" containerName="glance-httpd" Dec 05 20:34:16 crc kubenswrapper[4904]: E1205 20:34:16.899277 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e22555a8-bc28-4b25-bd10-4fc9525276c3" containerName="ceilometer-central-agent" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.899282 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="e22555a8-bc28-4b25-bd10-4fc9525276c3" containerName="ceilometer-central-agent" Dec 05 20:34:16 crc kubenswrapper[4904]: E1205 20:34:16.899305 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e22555a8-bc28-4b25-bd10-4fc9525276c3" containerName="ceilometer-notification-agent" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.899311 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="e22555a8-bc28-4b25-bd10-4fc9525276c3" containerName="ceilometer-notification-agent" Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.899477 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fae0189-1658-4a53-b223-22d70555b03d" 
containerName="glance-log"
Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.899487 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef0ab8b0-7102-4c25-9f0f-3796ff32d88f" containerName="glance-log"
Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.899503 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fae0189-1658-4a53-b223-22d70555b03d" containerName="glance-httpd"
Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.899512 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="e22555a8-bc28-4b25-bd10-4fc9525276c3" containerName="ceilometer-central-agent"
Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.899520 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef0ab8b0-7102-4c25-9f0f-3796ff32d88f" containerName="glance-httpd"
Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.899530 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="e22555a8-bc28-4b25-bd10-4fc9525276c3" containerName="ceilometer-notification-agent"
Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.899539 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="e22555a8-bc28-4b25-bd10-4fc9525276c3" containerName="sg-core"
Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.899549 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="e22555a8-bc28-4b25-bd10-4fc9525276c3" containerName="proxy-httpd"
Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.901134 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.916532 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef0ab8b0-7102-4c25-9f0f-3796ff32d88f-kube-api-access-ctp7k" (OuterVolumeSpecName: "kube-api-access-ctp7k") pod "ef0ab8b0-7102-4c25-9f0f-3796ff32d88f" (UID: "ef0ab8b0-7102-4c25-9f0f-3796ff32d88f"). InnerVolumeSpecName "kube-api-access-ctp7k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.918373 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.918571 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.963276 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa055141-7081-425f-80c1-9330cd21dc39-config-data\") pod \"ceilometer-0\" (UID: \"aa055141-7081-425f-80c1-9330cd21dc39\") " pod="openstack/ceilometer-0"
Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.963341 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aa055141-7081-425f-80c1-9330cd21dc39-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aa055141-7081-425f-80c1-9330cd21dc39\") " pod="openstack/ceilometer-0"
Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.963407 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa055141-7081-425f-80c1-9330cd21dc39-scripts\") pod \"ceilometer-0\" (UID: \"aa055141-7081-425f-80c1-9330cd21dc39\") " pod="openstack/ceilometer-0"
Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.963440 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa055141-7081-425f-80c1-9330cd21dc39-run-httpd\") pod \"ceilometer-0\" (UID: \"aa055141-7081-425f-80c1-9330cd21dc39\") " pod="openstack/ceilometer-0"
Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.963467 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa055141-7081-425f-80c1-9330cd21dc39-log-httpd\") pod \"ceilometer-0\" (UID: \"aa055141-7081-425f-80c1-9330cd21dc39\") " pod="openstack/ceilometer-0"
Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.963499 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa055141-7081-425f-80c1-9330cd21dc39-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aa055141-7081-425f-80c1-9330cd21dc39\") " pod="openstack/ceilometer-0"
Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.963523 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9498\" (UniqueName: \"kubernetes.io/projected/aa055141-7081-425f-80c1-9330cd21dc39-kube-api-access-k9498\") pod \"ceilometer-0\" (UID: \"aa055141-7081-425f-80c1-9330cd21dc39\") " pod="openstack/ceilometer-0"
Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.963622 4904 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef0ab8b0-7102-4c25-9f0f-3796ff32d88f-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.963637 4904 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ef0ab8b0-7102-4c25-9f0f-3796ff32d88f-httpd-run\") on node \"crc\" DevicePath \"\""
Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.963661 4904 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" "
Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.963673 4904 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef0ab8b0-7102-4c25-9f0f-3796ff32d88f-logs\") on node \"crc\" DevicePath \"\""
Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.963684 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctp7k\" (UniqueName: \"kubernetes.io/projected/ef0ab8b0-7102-4c25-9f0f-3796ff32d88f-kube-api-access-ctp7k\") on node \"crc\" DevicePath \"\""
Dec 05 20:34:16 crc kubenswrapper[4904]: I1205 20:34:16.969827 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.033595 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef0ab8b0-7102-4c25-9f0f-3796ff32d88f-config-data" (OuterVolumeSpecName: "config-data") pod "ef0ab8b0-7102-4c25-9f0f-3796ff32d88f" (UID: "ef0ab8b0-7102-4c25-9f0f-3796ff32d88f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.060586 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef0ab8b0-7102-4c25-9f0f-3796ff32d88f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef0ab8b0-7102-4c25-9f0f-3796ff32d88f" (UID: "ef0ab8b0-7102-4c25-9f0f-3796ff32d88f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.061775 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-mq95x"]
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.066585 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa055141-7081-425f-80c1-9330cd21dc39-scripts\") pod \"ceilometer-0\" (UID: \"aa055141-7081-425f-80c1-9330cd21dc39\") " pod="openstack/ceilometer-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.066678 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa055141-7081-425f-80c1-9330cd21dc39-run-httpd\") pod \"ceilometer-0\" (UID: \"aa055141-7081-425f-80c1-9330cd21dc39\") " pod="openstack/ceilometer-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.067278 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa055141-7081-425f-80c1-9330cd21dc39-log-httpd\") pod \"ceilometer-0\" (UID: \"aa055141-7081-425f-80c1-9330cd21dc39\") " pod="openstack/ceilometer-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.067412 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa055141-7081-425f-80c1-9330cd21dc39-run-httpd\") pod \"ceilometer-0\" (UID: \"aa055141-7081-425f-80c1-9330cd21dc39\") " pod="openstack/ceilometer-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.067470 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa055141-7081-425f-80c1-9330cd21dc39-log-httpd\") pod \"ceilometer-0\" (UID: \"aa055141-7081-425f-80c1-9330cd21dc39\") " pod="openstack/ceilometer-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.067542 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa055141-7081-425f-80c1-9330cd21dc39-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aa055141-7081-425f-80c1-9330cd21dc39\") " pod="openstack/ceilometer-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.067632 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9498\" (UniqueName: \"kubernetes.io/projected/aa055141-7081-425f-80c1-9330cd21dc39-kube-api-access-k9498\") pod \"ceilometer-0\" (UID: \"aa055141-7081-425f-80c1-9330cd21dc39\") " pod="openstack/ceilometer-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.067729 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa055141-7081-425f-80c1-9330cd21dc39-config-data\") pod \"ceilometer-0\" (UID: \"aa055141-7081-425f-80c1-9330cd21dc39\") " pod="openstack/ceilometer-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.067763 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aa055141-7081-425f-80c1-9330cd21dc39-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aa055141-7081-425f-80c1-9330cd21dc39\") " pod="openstack/ceilometer-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.067920 4904 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef0ab8b0-7102-4c25-9f0f-3796ff32d88f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.067932 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef0ab8b0-7102-4c25-9f0f-3796ff32d88f-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.070979 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aa055141-7081-425f-80c1-9330cd21dc39-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aa055141-7081-425f-80c1-9330cd21dc39\") " pod="openstack/ceilometer-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.077321 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa055141-7081-425f-80c1-9330cd21dc39-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aa055141-7081-425f-80c1-9330cd21dc39\") " pod="openstack/ceilometer-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.077376 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa055141-7081-425f-80c1-9330cd21dc39-scripts\") pod \"ceilometer-0\" (UID: \"aa055141-7081-425f-80c1-9330cd21dc39\") " pod="openstack/ceilometer-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.078375 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa055141-7081-425f-80c1-9330cd21dc39-config-data\") pod \"ceilometer-0\" (UID: \"aa055141-7081-425f-80c1-9330cd21dc39\") " pod="openstack/ceilometer-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.101542 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9498\" (UniqueName: \"kubernetes.io/projected/aa055141-7081-425f-80c1-9330cd21dc39-kube-api-access-k9498\") pod \"ceilometer-0\" (UID: \"aa055141-7081-425f-80c1-9330cd21dc39\") " pod="openstack/ceilometer-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.195622 4904 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.201475 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef0ab8b0-7102-4c25-9f0f-3796ff32d88f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ef0ab8b0-7102-4c25-9f0f-3796ff32d88f" (UID: "ef0ab8b0-7102-4c25-9f0f-3796ff32d88f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.235838 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-76bc85d8c5-xr857"]
Dec 05 20:34:17 crc kubenswrapper[4904]: W1205 20:34:17.244858 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26484d4c_3765_4214_81e6_af49ebfde502.slice/crio-d44cc7a2dd9c5139e2f46935ec1ce45fbfa3ecaef43b7ee5bf49b43a79f2d92a WatchSource:0}: Error finding container d44cc7a2dd9c5139e2f46935ec1ce45fbfa3ecaef43b7ee5bf49b43a79f2d92a: Status 404 returned error can't find the container with id d44cc7a2dd9c5139e2f46935ec1ce45fbfa3ecaef43b7ee5bf49b43a79f2d92a
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.276803 4904 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\""
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.276831 4904 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef0ab8b0-7102-4c25-9f0f-3796ff32d88f-public-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.362023 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ef0ab8b0-7102-4c25-9f0f-3796ff32d88f","Type":"ContainerDied","Data":"a96b12c6c6ddc703d50443da345bebdbb2a133c9409c0e8766a927915e8e5665"}
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.362087 4904 scope.go:117] "RemoveContainer" containerID="30bfd9e811f59d4ab51bae052f4be94586035087eb09c057dd12686c5517ef4b"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.362136 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.365956 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.437592 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mq95x" event={"ID":"bf88911a-3e40-4dbe-9cab-11ac4077b33b","Type":"ContainerStarted","Data":"646c0b41fb14533e9737e9d6f2e960f1b0f4c4963bf5f006643d0dc84e6809e3"}
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.447077 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.456243 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-76bc85d8c5-xr857" event={"ID":"26484d4c-3765-4214-81e6-af49ebfde502","Type":"ContainerStarted","Data":"d44cc7a2dd9c5139e2f46935ec1ce45fbfa3ecaef43b7ee5bf49b43a79f2d92a"}
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.477809 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.491130 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.491182 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.491248 4904 scope.go:117] "RemoveContainer" containerID="7110bc0044761f07c1e167671039cc00b558f91170b014d1d3bf9cff6053ce44"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.499670 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1fae0189-1658-4a53-b223-22d70555b03d","Type":"ContainerDied","Data":"b0ba2cce827bdf96a9ffc160ea65a297617f3a099e772ac84ef8a94f67b9875b"}
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.499786 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.505877 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.506090 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-rwqht"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.506193 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.506295 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.508289 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-fb4d-account-create-update-fzf7v"]
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.516666 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"c8a50daf-6f3c-405d-b047-12a11ac0b56b","Type":"ContainerStarted","Data":"406e52684ca9a61ada70d76a61976e1f44d8cc5562b52c72de60799162bf8a25"}
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.521005 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-8ea0-account-create-update-rvd2d"]
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.590707 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e-config-data\") pod \"glance-default-external-api-0\" (UID: \"fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e\") " pod="openstack/glance-default-external-api-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.590756 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e\") " pod="openstack/glance-default-external-api-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.590793 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwkj6\" (UniqueName: \"kubernetes.io/projected/fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e-kube-api-access-hwkj6\") pod \"glance-default-external-api-0\" (UID: \"fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e\") " pod="openstack/glance-default-external-api-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.590810 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e-scripts\") pod \"glance-default-external-api-0\" (UID: \"fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e\") " pod="openstack/glance-default-external-api-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.590907 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e\") " pod="openstack/glance-default-external-api-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.590928 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e-logs\") pod \"glance-default-external-api-0\" (UID: \"fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e\") " pod="openstack/glance-default-external-api-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.590943 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e\") " pod="openstack/glance-default-external-api-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.590958 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e\") " pod="openstack/glance-default-external-api-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.609390 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-m5z4d"]
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.655238 4904 scope.go:117] "RemoveContainer" containerID="a359a51682bb749e3d6325a7c7fcc8f67d2298823c165d3a1925ae9b0ab4c6f3"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.670261 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.692255 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e-config-data\") pod \"glance-default-external-api-0\" (UID: \"fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e\") " pod="openstack/glance-default-external-api-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.692293 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e\") " pod="openstack/glance-default-external-api-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.692325 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwkj6\" (UniqueName: \"kubernetes.io/projected/fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e-kube-api-access-hwkj6\") pod \"glance-default-external-api-0\" (UID: \"fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e\") " pod="openstack/glance-default-external-api-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.692343 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e-scripts\") pod \"glance-default-external-api-0\" (UID: \"fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e\") " pod="openstack/glance-default-external-api-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.692421 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e\") " pod="openstack/glance-default-external-api-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.692445 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e-logs\") pod \"glance-default-external-api-0\" (UID: \"fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e\") " pod="openstack/glance-default-external-api-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.692464 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e\") " pod="openstack/glance-default-external-api-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.692483 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e\") " pod="openstack/glance-default-external-api-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.697800 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.162550581 podStartE2EDuration="16.697780006s" podCreationTimestamp="2025-12-05 20:34:01 +0000 UTC" firstStartedPulling="2025-12-05 20:34:02.484656908 +0000 UTC m=+1341.295873017" lastFinishedPulling="2025-12-05 20:34:16.019886333 +0000 UTC m=+1354.831102442" observedRunningTime="2025-12-05 20:34:17.549302507 +0000 UTC m=+1356.360518616" watchObservedRunningTime="2025-12-05 20:34:17.697780006 +0000 UTC m=+1356.508996105"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.700888 4904 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.701819 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e\") " pod="openstack/glance-default-external-api-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.702948 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e\") " pod="openstack/glance-default-external-api-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.710450 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e-logs\") pod \"glance-default-external-api-0\" (UID: \"fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e\") " pod="openstack/glance-default-external-api-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.715474 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e-config-data\") pod \"glance-default-external-api-0\" (UID: \"fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e\") " pod="openstack/glance-default-external-api-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.717153 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e-scripts\") pod \"glance-default-external-api-0\" (UID: \"fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e\") " pod="openstack/glance-default-external-api-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.718913 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e22555a8-bc28-4b25-bd10-4fc9525276c3" path="/var/lib/kubelet/pods/e22555a8-bc28-4b25-bd10-4fc9525276c3/volumes"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.719899 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef0ab8b0-7102-4c25-9f0f-3796ff32d88f" path="/var/lib/kubelet/pods/ef0ab8b0-7102-4c25-9f0f-3796ff32d88f/volumes"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.720816 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.720893 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwkj6\" (UniqueName: \"kubernetes.io/projected/fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e-kube-api-access-hwkj6\") pod \"glance-default-external-api-0\" (UID: \"fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e\") " pod="openstack/glance-default-external-api-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.733592 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e\") " pod="openstack/glance-default-external-api-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.754359 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e\") " pod="openstack/glance-default-external-api-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.754588 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 05 20:34:17 crc kubenswrapper[4904]: W1205 20:34:17.756657 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod605b48e0_13b6_47bb_9486_7c29c947e915.slice/crio-3399607a032d2963531446c871ad958c03850317a8e060fc54876aec2b60fd1c WatchSource:0}: Error finding container 3399607a032d2963531446c871ad958c03850317a8e060fc54876aec2b60fd1c: Status 404 returned error can't find the container with id 3399607a032d2963531446c871ad958c03850317a8e060fc54876aec2b60fd1c
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.772653 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.774171 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.776190 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.777134 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.784359 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.800576 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-z45b2"]
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.817065 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-8dd6-account-create-update-t2wb6"]
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.858985 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.904422 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1b2cda4-e597-4027-b9b1-cf52ec98dcb8-logs\") pod \"glance-default-internal-api-0\" (UID: \"c1b2cda4-e597-4027-b9b1-cf52ec98dcb8\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.904510 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1b2cda4-e597-4027-b9b1-cf52ec98dcb8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c1b2cda4-e597-4027-b9b1-cf52ec98dcb8\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.904561 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"c1b2cda4-e597-4027-b9b1-cf52ec98dcb8\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.904630 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1b2cda4-e597-4027-b9b1-cf52ec98dcb8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c1b2cda4-e597-4027-b9b1-cf52ec98dcb8\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.904667 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1b2cda4-e597-4027-b9b1-cf52ec98dcb8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c1b2cda4-e597-4027-b9b1-cf52ec98dcb8\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.904741 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1b2cda4-e597-4027-b9b1-cf52ec98dcb8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c1b2cda4-e597-4027-b9b1-cf52ec98dcb8\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.904822 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrq7q\" (UniqueName: \"kubernetes.io/projected/c1b2cda4-e597-4027-b9b1-cf52ec98dcb8-kube-api-access-qrq7q\") pod \"glance-default-internal-api-0\" (UID: \"c1b2cda4-e597-4027-b9b1-cf52ec98dcb8\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:34:17 crc kubenswrapper[4904]: I1205 20:34:17.904874 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c1b2cda4-e597-4027-b9b1-cf52ec98dcb8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c1b2cda4-e597-4027-b9b1-cf52ec98dcb8\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:34:18 crc kubenswrapper[4904]: I1205 20:34:18.045303 4904 scope.go:117] "RemoveContainer" containerID="f897710093cc220d9271b40416c5886d5c7ac2b62ceb1bc144595f0eea4c5a94"
Dec 05 20:34:18 crc kubenswrapper[4904]: I1205 20:34:18.046964 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1b2cda4-e597-4027-b9b1-cf52ec98dcb8-logs\") pod \"glance-default-internal-api-0\" (UID: \"c1b2cda4-e597-4027-b9b1-cf52ec98dcb8\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:34:18 crc kubenswrapper[4904]: I1205 20:34:18.047014 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1b2cda4-e597-4027-b9b1-cf52ec98dcb8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c1b2cda4-e597-4027-b9b1-cf52ec98dcb8\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:34:18 crc kubenswrapper[4904]: I1205 20:34:18.047070 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"c1b2cda4-e597-4027-b9b1-cf52ec98dcb8\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:34:18 crc kubenswrapper[4904]: I1205 20:34:18.047137 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1b2cda4-e597-4027-b9b1-cf52ec98dcb8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c1b2cda4-e597-4027-b9b1-cf52ec98dcb8\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:34:18 crc kubenswrapper[4904]: I1205 20:34:18.047196 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1b2cda4-e597-4027-b9b1-cf52ec98dcb8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c1b2cda4-e597-4027-b9b1-cf52ec98dcb8\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:34:18 crc kubenswrapper[4904]: I1205 20:34:18.047297 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1b2cda4-e597-4027-b9b1-cf52ec98dcb8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c1b2cda4-e597-4027-b9b1-cf52ec98dcb8\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:34:18 crc kubenswrapper[4904]: I1205 20:34:18.047355 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrq7q\" (UniqueName: \"kubernetes.io/projected/c1b2cda4-e597-4027-b9b1-cf52ec98dcb8-kube-api-access-qrq7q\") pod \"glance-default-internal-api-0\" (UID: \"c1b2cda4-e597-4027-b9b1-cf52ec98dcb8\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:34:18 crc kubenswrapper[4904]: I1205 20:34:18.047371 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c1b2cda4-e597-4027-b9b1-cf52ec98dcb8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c1b2cda4-e597-4027-b9b1-cf52ec98dcb8\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:34:18 crc kubenswrapper[4904]: I1205 20:34:18.047985 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c1b2cda4-e597-4027-b9b1-cf52ec98dcb8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c1b2cda4-e597-4027-b9b1-cf52ec98dcb8\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:34:18 crc kubenswrapper[4904]: I1205 20:34:18.048493 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1b2cda4-e597-4027-b9b1-cf52ec98dcb8-logs\") pod \"glance-default-internal-api-0\" (UID: \"c1b2cda4-e597-4027-b9b1-cf52ec98dcb8\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:34:18 crc kubenswrapper[4904]: I1205 20:34:18.048657 4904 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"c1b2cda4-e597-4027-b9b1-cf52ec98dcb8\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0"
Dec 05 20:34:18 crc kubenswrapper[4904]: I1205 20:34:18.053083 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1b2cda4-e597-4027-b9b1-cf52ec98dcb8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c1b2cda4-e597-4027-b9b1-cf52ec98dcb8\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:34:18 crc kubenswrapper[4904]: I1205 20:34:18.055167 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1b2cda4-e597-4027-b9b1-cf52ec98dcb8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c1b2cda4-e597-4027-b9b1-cf52ec98dcb8\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:34:18 crc kubenswrapper[4904]: I1205 20:34:18.055533 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1b2cda4-e597-4027-b9b1-cf52ec98dcb8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c1b2cda4-e597-4027-b9b1-cf52ec98dcb8\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:34:18 crc kubenswrapper[4904]: I1205 20:34:18.055822 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1b2cda4-e597-4027-b9b1-cf52ec98dcb8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c1b2cda4-e597-4027-b9b1-cf52ec98dcb8\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:34:18 crc kubenswrapper[4904]: I1205 20:34:18.078680 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrq7q\" (UniqueName: \"kubernetes.io/projected/c1b2cda4-e597-4027-b9b1-cf52ec98dcb8-kube-api-access-qrq7q\") pod \"glance-default-internal-api-0\" (UID: \"c1b2cda4-e597-4027-b9b1-cf52ec98dcb8\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:34:18 crc kubenswrapper[4904]: I1205 20:34:18.099025 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"c1b2cda4-e597-4027-b9b1-cf52ec98dcb8\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:34:18 crc kubenswrapper[4904]: I1205 20:34:18.204768 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 05 20:34:18 crc kubenswrapper[4904]: I1205 20:34:18.272607 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 05 20:34:18 crc kubenswrapper[4904]: I1205 20:34:18.544857 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-fb4d-account-create-update-fzf7v" event={"ID":"ebfa100a-6ac4-441e-a2d8-78384458fd67","Type":"ContainerStarted","Data":"399c17b9eab778eecd4bbb77b6381cef2dc8225840f0f905e537be42830ad005"}
Dec 05 20:34:18 crc kubenswrapper[4904]: I1205 20:34:18.546219 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-fb4d-account-create-update-fzf7v" event={"ID":"ebfa100a-6ac4-441e-a2d8-78384458fd67","Type":"ContainerStarted","Data":"d78195ab51046cb43f21f7d2e7dde36f8722dde789f25f802932c7f6c64a0391"}
Dec 05 20:34:18 crc kubenswrapper[4904]: I1205 20:34:18.587611 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-fb4d-account-create-update-fzf7v" podStartSLOduration=6.587593232 podStartE2EDuration="6.587593232s" podCreationTimestamp="2025-12-05 20:34:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:34:18.57537747 +0000 UTC m=+1357.386593579" watchObservedRunningTime="2025-12-05 20:34:18.587593232 +0000 UTC m=+1357.398809341"
Dec 05 20:34:18 crc kubenswrapper[4904]: I1205 20:34:18.593360 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-z45b2" event={"ID":"605b48e0-13b6-47bb-9486-7c29c947e915","Type":"ContainerStarted","Data":"3399607a032d2963531446c871ad958c03850317a8e060fc54876aec2b60fd1c"}
Dec 05 20:34:18 crc kubenswrapper[4904]: I1205 20:34:18.595831 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-m5z4d" event={"ID":"097e7395-bc3a-4034-bc78-bc3d86757a70","Type":"ContainerStarted","Data":"3fdac7f162c47107c252df4e5faa92a9dd59246fcc3fdb0df0942a8fdfa921e8"}
Dec 05 20:34:18 crc kubenswrapper[4904]: I1205 20:34:18.620286 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-m5z4d" podStartSLOduration=6.620267502 podStartE2EDuration="6.620267502s" podCreationTimestamp="2025-12-05 20:34:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:34:18.618296621 +0000 UTC m=+1357.429512730" watchObservedRunningTime="2025-12-05 20:34:18.620267502 +0000 UTC m=+1357.431483611"
Dec 05 20:34:18 crc kubenswrapper[4904]: I1205 20:34:18.636975 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8ea0-account-create-update-rvd2d" event={"ID":"be427ec6-7dd9-4285-b52c-3a797793ca88","Type":"ContainerStarted","Data":"15ec064905ae598d515b775f44fc5816bb7cc6bb5ad621cd03fdd1c25fce23d4"}
Dec 05 20:34:18 crc kubenswrapper[4904]: I1205 20:34:18.637453 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8ea0-account-create-update-rvd2d" event={"ID":"be427ec6-7dd9-4285-b52c-3a797793ca88","Type":"ContainerStarted","Data":"538408f1fa0e0e1b724c792fb225c131787e7e844aa57a813e31b564f9d9e40d"}
Dec 05 20:34:18 crc kubenswrapper[4904]: I1205 20:34:18.647533 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8dd6-account-create-update-t2wb6" event={"ID":"a21e14cd-7505-4d92-8de4-a983351c6d9a","Type":"ContainerStarted","Data":"c0d4254eb1ceb2984a9dae5fd15cb8426a74d073044fbda84b9c899e0b41e03b"}
Dec 05 20:34:18 crc kubenswrapper[4904]: I1205 20:34:18.647754 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8dd6-account-create-update-t2wb6" event={"ID":"a21e14cd-7505-4d92-8de4-a983351c6d9a","Type":"ContainerStarted","Data":"9da222151d8eaa68026b3d5b287b2d95a4e091e4f08cc1dc20da3f540eabefba"}
Dec 05 20:34:18 crc kubenswrapper[4904]: I1205 20:34:18.712261 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-8ea0-account-create-update-rvd2d" podStartSLOduration=6.7122355030000005 podStartE2EDuration="6.712235503s" podCreationTimestamp="2025-12-05 20:34:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:34:18.706563714 +0000 UTC m=+1357.517779823" watchObservedRunningTime="2025-12-05 20:34:18.712235503 +0000 UTC m=+1357.523451622"
Dec 05 20:34:18 crc kubenswrapper[4904]: I1205 20:34:18.715508 4904 generic.go:334] "Generic (PLEG): container finished" podID="bf88911a-3e40-4dbe-9cab-11ac4077b33b" containerID="6f34a1968f9774c914616262214de79af0e4edacb516f81c7ca100eddcf2cd25" exitCode=0
Dec 05 20:34:18 crc kubenswrapper[4904]: I1205 20:34:18.715582 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mq95x" event={"ID":"bf88911a-3e40-4dbe-9cab-11ac4077b33b","Type":"ContainerDied","Data":"6f34a1968f9774c914616262214de79af0e4edacb516f81c7ca100eddcf2cd25"}
Dec 05 20:34:18 crc kubenswrapper[4904]: I1205 20:34:18.726786 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-76bc85d8c5-xr857" event={"ID":"26484d4c-3765-4214-81e6-af49ebfde502","Type":"ContainerStarted","Data":"f9dd173b50890a8c9a65f840a1314a6ea97042ec8234a926eeeae1860bb37320"}
Dec 05 20:34:18 crc kubenswrapper[4904]: I1205 20:34:18.726834 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-76bc85d8c5-xr857" event={"ID":"26484d4c-3765-4214-81e6-af49ebfde502","Type":"ContainerStarted","Data":"aeb13f7ac6222d9890d955db282d987ed8420186c60bada7267c393bf94935fa"}
Dec 05 20:34:18 crc kubenswrapper[4904]: I1205 20:34:18.727180 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-76bc85d8c5-xr857"
Dec 05 20:34:18 crc kubenswrapper[4904]: I1205 20:34:18.727658 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-76bc85d8c5-xr857"
Dec 05 20:34:18 crc kubenswrapper[4904]: I1205 20:34:18.734984 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa055141-7081-425f-80c1-9330cd21dc39","Type":"ContainerStarted","Data":"bf5b30eeed42c72fac67714dfaf85a1c5d1a9376644682dbd5afba026b1d8f0a"}
Dec 05 20:34:18 crc kubenswrapper[4904]: I1205 20:34:18.767452 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-8dd6-account-create-update-t2wb6" podStartSLOduration=6.767431217 podStartE2EDuration="6.767431217s" podCreationTimestamp="2025-12-05 20:34:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:34:18.727908936 +0000 UTC m=+1357.539125045" watchObservedRunningTime="2025-12-05 20:34:18.767431217 +0000 UTC m=+1357.578647326"
Dec 05 20:34:18 crc kubenswrapper[4904]: I1205 20:34:18.801946 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-76bc85d8c5-xr857" podStartSLOduration=9.801929464 podStartE2EDuration="9.801929464s" podCreationTimestamp="2025-12-05 20:34:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:34:18.762734943 +0000 UTC m=+1357.573951062" watchObservedRunningTime="2025-12-05 20:34:18.801929464 +0000 UTC m=+1357.613145563"
Dec 05 20:34:18 crc kubenswrapper[4904]: I1205 20:34:18.831824 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 05 20:34:18 crc kubenswrapper[4904]: I1205 20:34:18.903078 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 05 20:34:18 crc kubenswrapper[4904]: W1205 20:34:18.910432 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1b2cda4_e597_4027_b9b1_cf52ec98dcb8.slice/crio-a6d5eb2580e3035311cf3d0907812c41795ea04715d7facd1d7f926e1ff57c7b WatchSource:0}: Error finding container a6d5eb2580e3035311cf3d0907812c41795ea04715d7facd1d7f926e1ff57c7b: Status 404 returned error can't find the container with id a6d5eb2580e3035311cf3d0907812c41795ea04715d7facd1d7f926e1ff57c7b
Dec 05 20:34:19 crc kubenswrapper[4904]: I1205 20:34:19.705224 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fae0189-1658-4a53-b223-22d70555b03d" path="/var/lib/kubelet/pods/1fae0189-1658-4a53-b223-22d70555b03d/volumes"
Dec 05 20:34:19 crc kubenswrapper[4904]: I1205 20:34:19.788212 4904 generic.go:334] "Generic (PLEG): container finished" podID="097e7395-bc3a-4034-bc78-bc3d86757a70" containerID="d467899d124c42ed75c8a3219cd306e566a45a6a9aef6c12eb70dcda9fe65448" exitCode=0
Dec 05 20:34:19 crc kubenswrapper[4904]: I1205 20:34:19.788267 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-m5z4d" event={"ID":"097e7395-bc3a-4034-bc78-bc3d86757a70","Type":"ContainerDied","Data":"d467899d124c42ed75c8a3219cd306e566a45a6a9aef6c12eb70dcda9fe65448"}
Dec 05 20:34:19 crc kubenswrapper[4904]: I1205 20:34:19.793866 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c1b2cda4-e597-4027-b9b1-cf52ec98dcb8","Type":"ContainerStarted","Data":"c2721d5c7a1702cc3a03db3836c07d5599e162770b96b37d19eff0d9cbdd18a5"}
Dec 05 20:34:19 crc kubenswrapper[4904]: I1205 20:34:19.793927 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c1b2cda4-e597-4027-b9b1-cf52ec98dcb8","Type":"ContainerStarted","Data":"a6d5eb2580e3035311cf3d0907812c41795ea04715d7facd1d7f926e1ff57c7b"}
Dec 05 20:34:19 crc kubenswrapper[4904]: I1205 20:34:19.799506 4904 generic.go:334] "Generic (PLEG): container finished" podID="be427ec6-7dd9-4285-b52c-3a797793ca88" containerID="15ec064905ae598d515b775f44fc5816bb7cc6bb5ad621cd03fdd1c25fce23d4" exitCode=0
Dec 05 20:34:19 crc kubenswrapper[4904]: I1205 20:34:19.799564 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8ea0-account-create-update-rvd2d" event={"ID":"be427ec6-7dd9-4285-b52c-3a797793ca88","Type":"ContainerDied","Data":"15ec064905ae598d515b775f44fc5816bb7cc6bb5ad621cd03fdd1c25fce23d4"}
Dec 05 20:34:19 crc kubenswrapper[4904]: I1205 20:34:19.802549 4904 generic.go:334] "Generic (PLEG): container finished" podID="a21e14cd-7505-4d92-8de4-a983351c6d9a" containerID="c0d4254eb1ceb2984a9dae5fd15cb8426a74d073044fbda84b9c899e0b41e03b" exitCode=0
Dec 05 20:34:19 crc kubenswrapper[4904]: I1205 20:34:19.802584 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8dd6-account-create-update-t2wb6" event={"ID":"a21e14cd-7505-4d92-8de4-a983351c6d9a","Type":"ContainerDied","Data":"c0d4254eb1ceb2984a9dae5fd15cb8426a74d073044fbda84b9c899e0b41e03b"}
Dec 05 20:34:19 crc kubenswrapper[4904]: I1205 20:34:19.835013 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e","Type":"ContainerStarted","Data":"b217dff7252821fbce3c2668bef76aee16fa18510b31ce0cdb0d3f7cdc8646ee"}
Dec 05 20:34:19 crc kubenswrapper[4904]: I1205 20:34:19.835066 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e","Type":"ContainerStarted","Data":"389c54e7b8b568380753a75c0038d9b503eca05e99932556e51b35d6fc0157c7"}
Dec 05 20:34:19 crc kubenswrapper[4904]: I1205 20:34:19.836496 4904 generic.go:334] "Generic (PLEG): container finished" podID="ebfa100a-6ac4-441e-a2d8-78384458fd67" containerID="399c17b9eab778eecd4bbb77b6381cef2dc8225840f0f905e537be42830ad005" exitCode=0
Dec 05 20:34:19 crc kubenswrapper[4904]: I1205 20:34:19.836534 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-fb4d-account-create-update-fzf7v" event={"ID":"ebfa100a-6ac4-441e-a2d8-78384458fd67","Type":"ContainerDied","Data":"399c17b9eab778eecd4bbb77b6381cef2dc8225840f0f905e537be42830ad005"}
Dec 05 20:34:19 crc kubenswrapper[4904]: I1205 20:34:19.838200 4904 generic.go:334] "Generic (PLEG): container finished" podID="605b48e0-13b6-47bb-9486-7c29c947e915" containerID="272fb30c80cf98366309f6898c012cb7513721dcc7c48aeec0773ee1583a0a2a" exitCode=0
Dec 05 20:34:19 crc kubenswrapper[4904]: I1205 20:34:19.838240 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-z45b2" event={"ID":"605b48e0-13b6-47bb-9486-7c29c947e915","Type":"ContainerDied","Data":"272fb30c80cf98366309f6898c012cb7513721dcc7c48aeec0773ee1583a0a2a"}
Dec 05 20:34:19 crc kubenswrapper[4904]: I1205 20:34:19.861093 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa055141-7081-425f-80c1-9330cd21dc39","Type":"ContainerStarted","Data":"318da8a719650afe7f68cb588f51c4dfa5a130b651fa3ff7c60839ef4416efcf"}
Dec 05 20:34:19 crc kubenswrapper[4904]: I1205 20:34:19.861126 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa055141-7081-425f-80c1-9330cd21dc39","Type":"ContainerStarted","Data":"086339c85abfb7719d031cc0c3c8034fa9e9827c1465bb84238fcc899bca6fc9"}
Dec 05 20:34:20 crc kubenswrapper[4904]: I1205 20:34:20.503462 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-mq95x"
Dec 05 20:34:20 crc kubenswrapper[4904]: I1205 20:34:20.668512 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kw2sh\" (UniqueName: \"kubernetes.io/projected/bf88911a-3e40-4dbe-9cab-11ac4077b33b-kube-api-access-kw2sh\") pod \"bf88911a-3e40-4dbe-9cab-11ac4077b33b\" (UID: \"bf88911a-3e40-4dbe-9cab-11ac4077b33b\") "
Dec 05 20:34:20 crc kubenswrapper[4904]: I1205 20:34:20.668955 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf88911a-3e40-4dbe-9cab-11ac4077b33b-operator-scripts\") pod \"bf88911a-3e40-4dbe-9cab-11ac4077b33b\" (UID: \"bf88911a-3e40-4dbe-9cab-11ac4077b33b\") "
Dec 05 20:34:20 crc kubenswrapper[4904]: I1205 20:34:20.669816 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf88911a-3e40-4dbe-9cab-11ac4077b33b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bf88911a-3e40-4dbe-9cab-11ac4077b33b" (UID: "bf88911a-3e40-4dbe-9cab-11ac4077b33b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 20:34:20 crc kubenswrapper[4904]: I1205 20:34:20.670463 4904 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf88911a-3e40-4dbe-9cab-11ac4077b33b-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 20:34:20 crc kubenswrapper[4904]: I1205 20:34:20.673877 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf88911a-3e40-4dbe-9cab-11ac4077b33b-kube-api-access-kw2sh" (OuterVolumeSpecName: "kube-api-access-kw2sh") pod "bf88911a-3e40-4dbe-9cab-11ac4077b33b" (UID: "bf88911a-3e40-4dbe-9cab-11ac4077b33b"). InnerVolumeSpecName "kube-api-access-kw2sh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:34:20 crc kubenswrapper[4904]: I1205 20:34:20.774980 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kw2sh\" (UniqueName: \"kubernetes.io/projected/bf88911a-3e40-4dbe-9cab-11ac4077b33b-kube-api-access-kw2sh\") on node \"crc\" DevicePath \"\""
Dec 05 20:34:20 crc kubenswrapper[4904]: I1205 20:34:20.871827 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-mq95x"
Dec 05 20:34:20 crc kubenswrapper[4904]: I1205 20:34:20.871822 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mq95x" event={"ID":"bf88911a-3e40-4dbe-9cab-11ac4077b33b","Type":"ContainerDied","Data":"646c0b41fb14533e9737e9d6f2e960f1b0f4c4963bf5f006643d0dc84e6809e3"}
Dec 05 20:34:20 crc kubenswrapper[4904]: I1205 20:34:20.871980 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="646c0b41fb14533e9737e9d6f2e960f1b0f4c4963bf5f006643d0dc84e6809e3"
Dec 05 20:34:20 crc kubenswrapper[4904]: I1205 20:34:20.873414 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa055141-7081-425f-80c1-9330cd21dc39","Type":"ContainerStarted","Data":"e9b253468e263b742e33928f72a574172f10a9ef1f1c7a567ff27075f885de2a"}
Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.410070 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-m5z4d"
Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.491281 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/097e7395-bc3a-4034-bc78-bc3d86757a70-operator-scripts\") pod \"097e7395-bc3a-4034-bc78-bc3d86757a70\" (UID: \"097e7395-bc3a-4034-bc78-bc3d86757a70\") "
Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.491354 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz7tc\" (UniqueName: \"kubernetes.io/projected/097e7395-bc3a-4034-bc78-bc3d86757a70-kube-api-access-lz7tc\") pod \"097e7395-bc3a-4034-bc78-bc3d86757a70\" (UID: \"097e7395-bc3a-4034-bc78-bc3d86757a70\") "
Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.492968 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/097e7395-bc3a-4034-bc78-bc3d86757a70-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "097e7395-bc3a-4034-bc78-bc3d86757a70" (UID: "097e7395-bc3a-4034-bc78-bc3d86757a70"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.501204 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/097e7395-bc3a-4034-bc78-bc3d86757a70-kube-api-access-lz7tc" (OuterVolumeSpecName: "kube-api-access-lz7tc") pod "097e7395-bc3a-4034-bc78-bc3d86757a70" (UID: "097e7395-bc3a-4034-bc78-bc3d86757a70"). InnerVolumeSpecName "kube-api-access-lz7tc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.581344 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8ea0-account-create-update-rvd2d"
Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.594256 4904 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/097e7395-bc3a-4034-bc78-bc3d86757a70-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.594279 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz7tc\" (UniqueName: \"kubernetes.io/projected/097e7395-bc3a-4034-bc78-bc3d86757a70-kube-api-access-lz7tc\") on node \"crc\" DevicePath \"\""
Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.613178 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-fb4d-account-create-update-fzf7v"
Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.657384 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-8dd6-account-create-update-t2wb6"
Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.675017 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-z45b2"
Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.695726 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkzzl\" (UniqueName: \"kubernetes.io/projected/ebfa100a-6ac4-441e-a2d8-78384458fd67-kube-api-access-rkzzl\") pod \"ebfa100a-6ac4-441e-a2d8-78384458fd67\" (UID: \"ebfa100a-6ac4-441e-a2d8-78384458fd67\") "
Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.695886 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be427ec6-7dd9-4285-b52c-3a797793ca88-operator-scripts\") pod \"be427ec6-7dd9-4285-b52c-3a797793ca88\" (UID: \"be427ec6-7dd9-4285-b52c-3a797793ca88\") "
Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.695964 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebfa100a-6ac4-441e-a2d8-78384458fd67-operator-scripts\") pod \"ebfa100a-6ac4-441e-a2d8-78384458fd67\" (UID: \"ebfa100a-6ac4-441e-a2d8-78384458fd67\") "
Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.695998 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k86f7\" (UniqueName: \"kubernetes.io/projected/be427ec6-7dd9-4285-b52c-3a797793ca88-kube-api-access-k86f7\") pod \"be427ec6-7dd9-4285-b52c-3a797793ca88\" (UID: \"be427ec6-7dd9-4285-b52c-3a797793ca88\") "
Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.700179 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebfa100a-6ac4-441e-a2d8-78384458fd67-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ebfa100a-6ac4-441e-a2d8-78384458fd67" (UID: "ebfa100a-6ac4-441e-a2d8-78384458fd67"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.710137 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be427ec6-7dd9-4285-b52c-3a797793ca88-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "be427ec6-7dd9-4285-b52c-3a797793ca88" (UID: "be427ec6-7dd9-4285-b52c-3a797793ca88"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.732841 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebfa100a-6ac4-441e-a2d8-78384458fd67-kube-api-access-rkzzl" (OuterVolumeSpecName: "kube-api-access-rkzzl") pod "ebfa100a-6ac4-441e-a2d8-78384458fd67" (UID: "ebfa100a-6ac4-441e-a2d8-78384458fd67"). InnerVolumeSpecName "kube-api-access-rkzzl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.732993 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be427ec6-7dd9-4285-b52c-3a797793ca88-kube-api-access-k86f7" (OuterVolumeSpecName: "kube-api-access-k86f7") pod "be427ec6-7dd9-4285-b52c-3a797793ca88" (UID: "be427ec6-7dd9-4285-b52c-3a797793ca88"). InnerVolumeSpecName "kube-api-access-k86f7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.802256 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdgbb\" (UniqueName: \"kubernetes.io/projected/a21e14cd-7505-4d92-8de4-a983351c6d9a-kube-api-access-fdgbb\") pod \"a21e14cd-7505-4d92-8de4-a983351c6d9a\" (UID: \"a21e14cd-7505-4d92-8de4-a983351c6d9a\") "
Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.803007 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/605b48e0-13b6-47bb-9486-7c29c947e915-operator-scripts\") pod \"605b48e0-13b6-47bb-9486-7c29c947e915\" (UID: \"605b48e0-13b6-47bb-9486-7c29c947e915\") "
Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.803034 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kxk5\" (UniqueName: \"kubernetes.io/projected/605b48e0-13b6-47bb-9486-7c29c947e915-kube-api-access-4kxk5\") pod \"605b48e0-13b6-47bb-9486-7c29c947e915\" (UID: \"605b48e0-13b6-47bb-9486-7c29c947e915\") "
Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.803096 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a21e14cd-7505-4d92-8de4-a983351c6d9a-operator-scripts\") pod \"a21e14cd-7505-4d92-8de4-a983351c6d9a\" (UID: \"a21e14cd-7505-4d92-8de4-a983351c6d9a\") "
Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.803565 4904 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebfa100a-6ac4-441e-a2d8-78384458fd67-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.803578 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k86f7\" (UniqueName: \"kubernetes.io/projected/be427ec6-7dd9-4285-b52c-3a797793ca88-kube-api-access-k86f7\") on node \"crc\" DevicePath \"\""
Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.803588 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkzzl\" (UniqueName: \"kubernetes.io/projected/ebfa100a-6ac4-441e-a2d8-78384458fd67-kube-api-access-rkzzl\") on node \"crc\" DevicePath \"\""
Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.803597 4904 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be427ec6-7dd9-4285-b52c-3a797793ca88-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.804749 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a21e14cd-7505-4d92-8de4-a983351c6d9a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a21e14cd-7505-4d92-8de4-a983351c6d9a" (UID: "a21e14cd-7505-4d92-8de4-a983351c6d9a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.805290 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a21e14cd-7505-4d92-8de4-a983351c6d9a-kube-api-access-fdgbb" (OuterVolumeSpecName: "kube-api-access-fdgbb") pod "a21e14cd-7505-4d92-8de4-a983351c6d9a" (UID: "a21e14cd-7505-4d92-8de4-a983351c6d9a"). InnerVolumeSpecName "kube-api-access-fdgbb".
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.805326 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/605b48e0-13b6-47bb-9486-7c29c947e915-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "605b48e0-13b6-47bb-9486-7c29c947e915" (UID: "605b48e0-13b6-47bb-9486-7c29c947e915"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.807825 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/605b48e0-13b6-47bb-9486-7c29c947e915-kube-api-access-4kxk5" (OuterVolumeSpecName: "kube-api-access-4kxk5") pod "605b48e0-13b6-47bb-9486-7c29c947e915" (UID: "605b48e0-13b6-47bb-9486-7c29c947e915"). InnerVolumeSpecName "kube-api-access-4kxk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.808871 4904 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0" Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.809431 4904 scope.go:117] "RemoveContainer" containerID="a90f5442d2ed408c7bb656016094af84b53e976e6775ede9ff9cca02283e3e04" Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.809792 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.888852 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-fb4d-account-create-update-fzf7v" event={"ID":"ebfa100a-6ac4-441e-a2d8-78384458fd67","Type":"ContainerDied","Data":"d78195ab51046cb43f21f7d2e7dde36f8722dde789f25f802932c7f6c64a0391"} Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.888896 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d78195ab51046cb43f21f7d2e7dde36f8722dde789f25f802932c7f6c64a0391" Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.888948 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-fb4d-account-create-update-fzf7v" Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.891030 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-z45b2" event={"ID":"605b48e0-13b6-47bb-9486-7c29c947e915","Type":"ContainerDied","Data":"3399607a032d2963531446c871ad958c03850317a8e060fc54876aec2b60fd1c"} Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.891090 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3399607a032d2963531446c871ad958c03850317a8e060fc54876aec2b60fd1c" Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.891156 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-z45b2" Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.898384 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa055141-7081-425f-80c1-9330cd21dc39","Type":"ContainerStarted","Data":"997f8a41af3c423393915197436368dd0b9a8ad3bba79f7117fdc1067a84e652"} Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.898642 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.900442 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-m5z4d" event={"ID":"097e7395-bc3a-4034-bc78-bc3d86757a70","Type":"ContainerDied","Data":"3fdac7f162c47107c252df4e5faa92a9dd59246fcc3fdb0df0942a8fdfa921e8"} Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.900469 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3fdac7f162c47107c252df4e5faa92a9dd59246fcc3fdb0df0942a8fdfa921e8" Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.900561 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-m5z4d" Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.905581 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdgbb\" (UniqueName: \"kubernetes.io/projected/a21e14cd-7505-4d92-8de4-a983351c6d9a-kube-api-access-fdgbb\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.905614 4904 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/605b48e0-13b6-47bb-9486-7c29c947e915-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.905629 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kxk5\" (UniqueName: \"kubernetes.io/projected/605b48e0-13b6-47bb-9486-7c29c947e915-kube-api-access-4kxk5\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.905641 4904 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a21e14cd-7505-4d92-8de4-a983351c6d9a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.908050 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c1b2cda4-e597-4027-b9b1-cf52ec98dcb8","Type":"ContainerStarted","Data":"f15cff73d15493c515c896971533c8704b1a0af4898b9337dd2e2428345209fd"} Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.909818 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8ea0-account-create-update-rvd2d" Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.909818 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8ea0-account-create-update-rvd2d" event={"ID":"be427ec6-7dd9-4285-b52c-3a797793ca88","Type":"ContainerDied","Data":"538408f1fa0e0e1b724c792fb225c131787e7e844aa57a813e31b564f9d9e40d"} Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.909865 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="538408f1fa0e0e1b724c792fb225c131787e7e844aa57a813e31b564f9d9e40d" Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.911520 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-8dd6-account-create-update-t2wb6" Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.911522 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8dd6-account-create-update-t2wb6" event={"ID":"a21e14cd-7505-4d92-8de4-a983351c6d9a","Type":"ContainerDied","Data":"9da222151d8eaa68026b3d5b287b2d95a4e091e4f08cc1dc20da3f540eabefba"} Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.911625 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9da222151d8eaa68026b3d5b287b2d95a4e091e4f08cc1dc20da3f540eabefba" Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.913070 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e","Type":"ContainerStarted","Data":"f3d71618ec60be076f88eeeb0b4bc246f0bcdfbf169f90ab50324a1d8e038fca"} Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.937888 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.850449473 podStartE2EDuration="5.937859643s" podCreationTimestamp="2025-12-05 20:34:16 +0000 UTC" firstStartedPulling="2025-12-05 20:34:18.218262119 +0000 UTC m=+1357.029478218" lastFinishedPulling="2025-12-05 20:34:21.305672279 +0000 UTC m=+1360.116888388" observedRunningTime="2025-12-05 20:34:21.922278032 +0000 UTC m=+1360.733494181" watchObservedRunningTime="2025-12-05 20:34:21.937859643 +0000 UTC m=+1360.749075742" Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.983327 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.983309299 podStartE2EDuration="4.983309299s" podCreationTimestamp="2025-12-05 20:34:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:34:21.942432343 +0000 UTC m=+1360.753648462" watchObservedRunningTime="2025-12-05 20:34:21.983309299 +0000 UTC m=+1360.794525408" Dec 05 20:34:21 crc kubenswrapper[4904]: I1205 20:34:21.997026 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.997006 podStartE2EDuration="4.997006s" podCreationTimestamp="2025-12-05 20:34:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:34:21.969265049 +0000 UTC m=+1360.780481178" watchObservedRunningTime="2025-12-05 20:34:21.997006 +0000 UTC m=+1360.808222109" Dec 05 20:34:22 crc kubenswrapper[4904]: I1205 20:34:22.410339 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:34:22 crc kubenswrapper[4904]: I1205 20:34:22.787051 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5wbq6"] Dec 05 20:34:22 crc kubenswrapper[4904]: E1205 20:34:22.787458 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="097e7395-bc3a-4034-bc78-bc3d86757a70" containerName="mariadb-database-create" Dec 05 20:34:22 crc kubenswrapper[4904]: I1205 20:34:22.787470 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="097e7395-bc3a-4034-bc78-bc3d86757a70" containerName="mariadb-database-create" Dec 05 20:34:22 crc kubenswrapper[4904]: E1205 20:34:22.787483 4904 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bf88911a-3e40-4dbe-9cab-11ac4077b33b" containerName="mariadb-database-create" Dec 05 20:34:22 crc kubenswrapper[4904]: I1205 20:34:22.787488 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf88911a-3e40-4dbe-9cab-11ac4077b33b" containerName="mariadb-database-create" Dec 05 20:34:22 crc kubenswrapper[4904]: E1205 20:34:22.787505 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be427ec6-7dd9-4285-b52c-3a797793ca88" containerName="mariadb-account-create-update" Dec 05 20:34:22 crc kubenswrapper[4904]: I1205 20:34:22.787511 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="be427ec6-7dd9-4285-b52c-3a797793ca88" containerName="mariadb-account-create-update" Dec 05 20:34:22 crc kubenswrapper[4904]: E1205 20:34:22.787523 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a21e14cd-7505-4d92-8de4-a983351c6d9a" containerName="mariadb-account-create-update" Dec 05 20:34:22 crc kubenswrapper[4904]: I1205 20:34:22.787528 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="a21e14cd-7505-4d92-8de4-a983351c6d9a" containerName="mariadb-account-create-update" Dec 05 20:34:22 crc kubenswrapper[4904]: E1205 20:34:22.787538 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebfa100a-6ac4-441e-a2d8-78384458fd67" containerName="mariadb-account-create-update" Dec 05 20:34:22 crc kubenswrapper[4904]: I1205 20:34:22.787544 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebfa100a-6ac4-441e-a2d8-78384458fd67" containerName="mariadb-account-create-update" Dec 05 20:34:22 crc kubenswrapper[4904]: E1205 20:34:22.787553 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="605b48e0-13b6-47bb-9486-7c29c947e915" containerName="mariadb-database-create" Dec 05 20:34:22 crc kubenswrapper[4904]: I1205 20:34:22.787560 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="605b48e0-13b6-47bb-9486-7c29c947e915" containerName="mariadb-database-create" Dec 05 20:34:22 crc kubenswrapper[4904]: I1205 20:34:22.787715 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="605b48e0-13b6-47bb-9486-7c29c947e915" containerName="mariadb-database-create" Dec 05 20:34:22 crc kubenswrapper[4904]: I1205 20:34:22.787731 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf88911a-3e40-4dbe-9cab-11ac4077b33b" containerName="mariadb-database-create" Dec 05 20:34:22 crc kubenswrapper[4904]: I1205 20:34:22.787741 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="a21e14cd-7505-4d92-8de4-a983351c6d9a" containerName="mariadb-account-create-update" Dec 05 20:34:22 crc kubenswrapper[4904]: I1205 20:34:22.787751 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="097e7395-bc3a-4034-bc78-bc3d86757a70" containerName="mariadb-database-create" Dec 05 20:34:22 crc kubenswrapper[4904]: I1205 20:34:22.787766 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebfa100a-6ac4-441e-a2d8-78384458fd67" containerName="mariadb-account-create-update" Dec 05 20:34:22 crc kubenswrapper[4904]: I1205 20:34:22.787781 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="be427ec6-7dd9-4285-b52c-3a797793ca88" containerName="mariadb-account-create-update" Dec 05 20:34:22 crc kubenswrapper[4904]: I1205 20:34:22.788788 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5wbq6" Dec 05 20:34:22 crc kubenswrapper[4904]: I1205 20:34:22.791519 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 05 20:34:22 crc kubenswrapper[4904]: I1205 20:34:22.795839 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 05 20:34:22 crc kubenswrapper[4904]: I1205 20:34:22.796366 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-85qgh" Dec 05 20:34:22 crc kubenswrapper[4904]: I1205 20:34:22.809952 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5wbq6"] Dec 05 20:34:22 crc kubenswrapper[4904]: I1205 20:34:22.924352 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"28e509d6-d15b-44e6-9afa-05a347c2a7a5","Type":"ContainerStarted","Data":"61fbc222dea28f32b8df3b258bb0ee5bbd437e2fe35806b5ef58a20b2dd014fb"} Dec 05 20:34:22 crc kubenswrapper[4904]: I1205 20:34:22.924791 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzrf4\" (UniqueName: \"kubernetes.io/projected/1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93-kube-api-access-pzrf4\") pod \"nova-cell0-conductor-db-sync-5wbq6\" (UID: \"1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93\") " pod="openstack/nova-cell0-conductor-db-sync-5wbq6" Dec 05 20:34:22 crc kubenswrapper[4904]: I1205 20:34:22.925937 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93-config-data\") pod \"nova-cell0-conductor-db-sync-5wbq6\" (UID: \"1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93\") " pod="openstack/nova-cell0-conductor-db-sync-5wbq6" Dec 05 20:34:22 crc kubenswrapper[4904]: I1205 20:34:22.925978 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-5wbq6\" (UID: \"1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93\") " pod="openstack/nova-cell0-conductor-db-sync-5wbq6" Dec 05 20:34:22 crc kubenswrapper[4904]: I1205 20:34:22.926106 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93-scripts\") pod \"nova-cell0-conductor-db-sync-5wbq6\" (UID: \"1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93\") " pod="openstack/nova-cell0-conductor-db-sync-5wbq6" Dec 05 20:34:23 crc kubenswrapper[4904]: I1205 20:34:23.028682 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzrf4\" (UniqueName: \"kubernetes.io/projected/1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93-kube-api-access-pzrf4\") pod \"nova-cell0-conductor-db-sync-5wbq6\" (UID: \"1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93\") " pod="openstack/nova-cell0-conductor-db-sync-5wbq6" Dec 05 20:34:23 crc kubenswrapper[4904]: I1205 20:34:23.029036 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93-config-data\") pod \"nova-cell0-conductor-db-sync-5wbq6\" (UID: \"1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93\") " 
pod="openstack/nova-cell0-conductor-db-sync-5wbq6" Dec 05 20:34:23 crc kubenswrapper[4904]: I1205 20:34:23.029198 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-5wbq6\" (UID: \"1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93\") " pod="openstack/nova-cell0-conductor-db-sync-5wbq6" Dec 05 20:34:23 crc kubenswrapper[4904]: I1205 20:34:23.029440 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93-scripts\") pod \"nova-cell0-conductor-db-sync-5wbq6\" (UID: \"1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93\") " pod="openstack/nova-cell0-conductor-db-sync-5wbq6" Dec 05 20:34:23 crc kubenswrapper[4904]: I1205 20:34:23.034785 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-5wbq6\" (UID: \"1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93\") " pod="openstack/nova-cell0-conductor-db-sync-5wbq6" Dec 05 20:34:23 crc kubenswrapper[4904]: I1205 20:34:23.034839 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93-config-data\") pod \"nova-cell0-conductor-db-sync-5wbq6\" (UID: \"1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93\") " pod="openstack/nova-cell0-conductor-db-sync-5wbq6" Dec 05 20:34:23 crc kubenswrapper[4904]: I1205 20:34:23.047608 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93-scripts\") pod \"nova-cell0-conductor-db-sync-5wbq6\" (UID: \"1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93\") " pod="openstack/nova-cell0-conductor-db-sync-5wbq6" Dec 05 20:34:23 crc kubenswrapper[4904]: I1205 20:34:23.061696 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzrf4\" (UniqueName: \"kubernetes.io/projected/1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93-kube-api-access-pzrf4\") pod \"nova-cell0-conductor-db-sync-5wbq6\" (UID: \"1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93\") " pod="openstack/nova-cell0-conductor-db-sync-5wbq6" Dec 05 20:34:23 crc kubenswrapper[4904]: I1205 20:34:23.121035 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5wbq6" Dec 05 20:34:23 crc kubenswrapper[4904]: I1205 20:34:23.465303 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5wbq6"] Dec 05 20:34:23 crc kubenswrapper[4904]: I1205 20:34:23.934103 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5wbq6" event={"ID":"1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93","Type":"ContainerStarted","Data":"d50769c0cb264136df4f2ea2379f3c85a49feb229dc6cb836362f36bdd8eae24"} Dec 05 20:34:23 crc kubenswrapper[4904]: I1205 20:34:23.934459 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aa055141-7081-425f-80c1-9330cd21dc39" containerName="ceilometer-central-agent" containerID="cri-o://086339c85abfb7719d031cc0c3c8034fa9e9827c1465bb84238fcc899bca6fc9" gracePeriod=30 Dec 05 20:34:23 crc kubenswrapper[4904]: I1205 20:34:23.934571 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aa055141-7081-425f-80c1-9330cd21dc39" containerName="proxy-httpd" containerID="cri-o://997f8a41af3c423393915197436368dd0b9a8ad3bba79f7117fdc1067a84e652" gracePeriod=30 Dec 05 20:34:23 crc kubenswrapper[4904]: I1205 20:34:23.934611 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aa055141-7081-425f-80c1-9330cd21dc39" containerName="sg-core" containerID="cri-o://e9b253468e263b742e33928f72a574172f10a9ef1f1c7a567ff27075f885de2a" gracePeriod=30 Dec 05 20:34:23 crc kubenswrapper[4904]: I1205 20:34:23.934641 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aa055141-7081-425f-80c1-9330cd21dc39" containerName="ceilometer-notification-agent" containerID="cri-o://318da8a719650afe7f68cb588f51c4dfa5a130b651fa3ff7c60839ef4416efcf" gracePeriod=30 Dec 05 20:34:24 crc kubenswrapper[4904]: I1205 20:34:24.949333 4904 generic.go:334] "Generic (PLEG): container finished" podID="aa055141-7081-425f-80c1-9330cd21dc39" containerID="997f8a41af3c423393915197436368dd0b9a8ad3bba79f7117fdc1067a84e652" exitCode=0 Dec 05 20:34:24 crc kubenswrapper[4904]: I1205 20:34:24.949562 4904 generic.go:334] "Generic (PLEG): container finished" podID="aa055141-7081-425f-80c1-9330cd21dc39" containerID="e9b253468e263b742e33928f72a574172f10a9ef1f1c7a567ff27075f885de2a" exitCode=2 Dec 05 20:34:24 crc kubenswrapper[4904]: I1205 20:34:24.949571 4904 generic.go:334] "Generic (PLEG): container finished" podID="aa055141-7081-425f-80c1-9330cd21dc39" containerID="318da8a719650afe7f68cb588f51c4dfa5a130b651fa3ff7c60839ef4416efcf" exitCode=0 Dec 05 20:34:24 crc kubenswrapper[4904]: I1205 20:34:24.949410 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa055141-7081-425f-80c1-9330cd21dc39","Type":"ContainerDied","Data":"997f8a41af3c423393915197436368dd0b9a8ad3bba79f7117fdc1067a84e652"} Dec 05 20:34:24 crc kubenswrapper[4904]: I1205 20:34:24.949599 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa055141-7081-425f-80c1-9330cd21dc39","Type":"ContainerDied","Data":"e9b253468e263b742e33928f72a574172f10a9ef1f1c7a567ff27075f885de2a"} Dec 05 20:34:24 crc kubenswrapper[4904]: I1205 20:34:24.949610 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"aa055141-7081-425f-80c1-9330cd21dc39","Type":"ContainerDied","Data":"318da8a719650afe7f68cb588f51c4dfa5a130b651fa3ff7c60839ef4416efcf"} Dec 05 20:34:25 crc kubenswrapper[4904]: I1205 20:34:25.050017 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-76bc85d8c5-xr857" Dec 05 20:34:25 crc kubenswrapper[4904]: I1205 20:34:25.060453 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-76bc85d8c5-xr857" Dec 05 20:34:26 crc kubenswrapper[4904]: I1205 20:34:26.770964 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 20:34:26 crc kubenswrapper[4904]: I1205 20:34:26.841582 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aa055141-7081-425f-80c1-9330cd21dc39-sg-core-conf-yaml\") pod \"aa055141-7081-425f-80c1-9330cd21dc39\" (UID: \"aa055141-7081-425f-80c1-9330cd21dc39\") " Dec 05 20:34:26 crc kubenswrapper[4904]: I1205 20:34:26.841653 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa055141-7081-425f-80c1-9330cd21dc39-combined-ca-bundle\") pod \"aa055141-7081-425f-80c1-9330cd21dc39\" (UID: \"aa055141-7081-425f-80c1-9330cd21dc39\") " Dec 05 20:34:26 crc kubenswrapper[4904]: I1205 20:34:26.841771 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa055141-7081-425f-80c1-9330cd21dc39-config-data\") pod \"aa055141-7081-425f-80c1-9330cd21dc39\" (UID: \"aa055141-7081-425f-80c1-9330cd21dc39\") " Dec 05 20:34:26 crc kubenswrapper[4904]: I1205 20:34:26.841877 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa055141-7081-425f-80c1-9330cd21dc39-scripts\") pod \"aa055141-7081-425f-80c1-9330cd21dc39\" (UID: \"aa055141-7081-425f-80c1-9330cd21dc39\") " Dec 05 20:34:26 crc kubenswrapper[4904]: I1205 20:34:26.841996 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa055141-7081-425f-80c1-9330cd21dc39-run-httpd\") pod \"aa055141-7081-425f-80c1-9330cd21dc39\" (UID: \"aa055141-7081-425f-80c1-9330cd21dc39\") " Dec 05 20:34:26 crc kubenswrapper[4904]: I1205 20:34:26.842389 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa055141-7081-425f-80c1-9330cd21dc39-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "aa055141-7081-425f-80c1-9330cd21dc39" (UID: "aa055141-7081-425f-80c1-9330cd21dc39"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:34:26 crc kubenswrapper[4904]: I1205 20:34:26.842034 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa055141-7081-425f-80c1-9330cd21dc39-log-httpd\") pod \"aa055141-7081-425f-80c1-9330cd21dc39\" (UID: \"aa055141-7081-425f-80c1-9330cd21dc39\") " Dec 05 20:34:26 crc kubenswrapper[4904]: I1205 20:34:26.842583 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9498\" (UniqueName: \"kubernetes.io/projected/aa055141-7081-425f-80c1-9330cd21dc39-kube-api-access-k9498\") pod \"aa055141-7081-425f-80c1-9330cd21dc39\" (UID: \"aa055141-7081-425f-80c1-9330cd21dc39\") " Dec 05 20:34:26 crc kubenswrapper[4904]: I1205 20:34:26.842623 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa055141-7081-425f-80c1-9330cd21dc39-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "aa055141-7081-425f-80c1-9330cd21dc39" (UID: "aa055141-7081-425f-80c1-9330cd21dc39"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:34:26 crc kubenswrapper[4904]: I1205 20:34:26.843453 4904 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa055141-7081-425f-80c1-9330cd21dc39-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:26 crc kubenswrapper[4904]: I1205 20:34:26.843468 4904 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa055141-7081-425f-80c1-9330cd21dc39-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:26 crc kubenswrapper[4904]: I1205 20:34:26.848201 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa055141-7081-425f-80c1-9330cd21dc39-scripts" (OuterVolumeSpecName: "scripts") pod "aa055141-7081-425f-80c1-9330cd21dc39" (UID: "aa055141-7081-425f-80c1-9330cd21dc39"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:26 crc kubenswrapper[4904]: I1205 20:34:26.850610 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa055141-7081-425f-80c1-9330cd21dc39-kube-api-access-k9498" (OuterVolumeSpecName: "kube-api-access-k9498") pod "aa055141-7081-425f-80c1-9330cd21dc39" (UID: "aa055141-7081-425f-80c1-9330cd21dc39"). InnerVolumeSpecName "kube-api-access-k9498". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:34:26 crc kubenswrapper[4904]: I1205 20:34:26.889813 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa055141-7081-425f-80c1-9330cd21dc39-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "aa055141-7081-425f-80c1-9330cd21dc39" (UID: "aa055141-7081-425f-80c1-9330cd21dc39"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:26 crc kubenswrapper[4904]: I1205 20:34:26.941243 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa055141-7081-425f-80c1-9330cd21dc39-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa055141-7081-425f-80c1-9330cd21dc39" (UID: "aa055141-7081-425f-80c1-9330cd21dc39"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:26 crc kubenswrapper[4904]: I1205 20:34:26.946263 4904 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa055141-7081-425f-80c1-9330cd21dc39-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:26 crc kubenswrapper[4904]: I1205 20:34:26.947083 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9498\" (UniqueName: \"kubernetes.io/projected/aa055141-7081-425f-80c1-9330cd21dc39-kube-api-access-k9498\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:26 crc kubenswrapper[4904]: I1205 20:34:26.947183 4904 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aa055141-7081-425f-80c1-9330cd21dc39-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:26 crc kubenswrapper[4904]: I1205 20:34:26.947354 4904 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa055141-7081-425f-80c1-9330cd21dc39-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:26 crc kubenswrapper[4904]: I1205 20:34:26.963230 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa055141-7081-425f-80c1-9330cd21dc39-config-data" (OuterVolumeSpecName: "config-data") pod "aa055141-7081-425f-80c1-9330cd21dc39" (UID: "aa055141-7081-425f-80c1-9330cd21dc39"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:26 crc kubenswrapper[4904]: I1205 20:34:26.971586 4904 generic.go:334] "Generic (PLEG): container finished" podID="aa055141-7081-425f-80c1-9330cd21dc39" containerID="086339c85abfb7719d031cc0c3c8034fa9e9827c1465bb84238fcc899bca6fc9" exitCode=0 Dec 05 20:34:26 crc kubenswrapper[4904]: I1205 20:34:26.971821 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa055141-7081-425f-80c1-9330cd21dc39","Type":"ContainerDied","Data":"086339c85abfb7719d031cc0c3c8034fa9e9827c1465bb84238fcc899bca6fc9"} Dec 05 20:34:26 crc kubenswrapper[4904]: I1205 20:34:26.972004 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa055141-7081-425f-80c1-9330cd21dc39","Type":"ContainerDied","Data":"bf5b30eeed42c72fac67714dfaf85a1c5d1a9376644682dbd5afba026b1d8f0a"} Dec 05 20:34:26 crc kubenswrapper[4904]: I1205 20:34:26.972109 4904 scope.go:117] "RemoveContainer" containerID="997f8a41af3c423393915197436368dd0b9a8ad3bba79f7117fdc1067a84e652" Dec 05 20:34:26 crc kubenswrapper[4904]: I1205 20:34:26.972277 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.021787 4904 scope.go:117] "RemoveContainer" containerID="e9b253468e263b742e33928f72a574172f10a9ef1f1c7a567ff27075f885de2a" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.030677 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.041411 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.056292 4904 scope.go:117] "RemoveContainer" containerID="318da8a719650afe7f68cb588f51c4dfa5a130b651fa3ff7c60839ef4416efcf" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.059245 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa055141-7081-425f-80c1-9330cd21dc39-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.081283 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.081295 4904 scope.go:117] "RemoveContainer" containerID="086339c85abfb7719d031cc0c3c8034fa9e9827c1465bb84238fcc899bca6fc9" Dec 05 20:34:27 crc kubenswrapper[4904]: E1205 20:34:27.082050 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa055141-7081-425f-80c1-9330cd21dc39" containerName="ceilometer-notification-agent" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.082094 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa055141-7081-425f-80c1-9330cd21dc39" containerName="ceilometer-notification-agent" Dec 05 20:34:27 crc kubenswrapper[4904]: E1205 20:34:27.082144 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa055141-7081-425f-80c1-9330cd21dc39" containerName="proxy-httpd" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.082151 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa055141-7081-425f-80c1-9330cd21dc39" containerName="proxy-httpd" Dec 05 20:34:27 crc kubenswrapper[4904]: E1205 20:34:27.082164 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa055141-7081-425f-80c1-9330cd21dc39" containerName="sg-core" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.082170 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa055141-7081-425f-80c1-9330cd21dc39" containerName="sg-core" Dec 05 20:34:27 crc kubenswrapper[4904]: E1205 20:34:27.082187 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa055141-7081-425f-80c1-9330cd21dc39" containerName="ceilometer-central-agent" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.082194 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa055141-7081-425f-80c1-9330cd21dc39" containerName="ceilometer-central-agent" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.082531 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa055141-7081-425f-80c1-9330cd21dc39" containerName="ceilometer-central-agent" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.082567 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa055141-7081-425f-80c1-9330cd21dc39" containerName="ceilometer-notification-agent" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.082599 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa055141-7081-425f-80c1-9330cd21dc39" containerName="sg-core" Dec 05 20:34:27 crc 
kubenswrapper[4904]: I1205 20:34:27.082607 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa055141-7081-425f-80c1-9330cd21dc39" containerName="proxy-httpd" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.086117 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.092469 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.093560 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.099554 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.112623 4904 scope.go:117] "RemoveContainer" containerID="997f8a41af3c423393915197436368dd0b9a8ad3bba79f7117fdc1067a84e652" Dec 05 20:34:27 crc kubenswrapper[4904]: E1205 20:34:27.113272 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"997f8a41af3c423393915197436368dd0b9a8ad3bba79f7117fdc1067a84e652\": container with ID starting with 997f8a41af3c423393915197436368dd0b9a8ad3bba79f7117fdc1067a84e652 not found: ID does not exist" containerID="997f8a41af3c423393915197436368dd0b9a8ad3bba79f7117fdc1067a84e652" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.113320 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"997f8a41af3c423393915197436368dd0b9a8ad3bba79f7117fdc1067a84e652"} err="failed to get container status \"997f8a41af3c423393915197436368dd0b9a8ad3bba79f7117fdc1067a84e652\": rpc error: code = NotFound desc = could not find container \"997f8a41af3c423393915197436368dd0b9a8ad3bba79f7117fdc1067a84e652\": container with ID starting with 997f8a41af3c423393915197436368dd0b9a8ad3bba79f7117fdc1067a84e652 not found: ID does not exist" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.113350 4904 scope.go:117] "RemoveContainer" containerID="e9b253468e263b742e33928f72a574172f10a9ef1f1c7a567ff27075f885de2a" Dec 05 20:34:27 crc kubenswrapper[4904]: E1205 20:34:27.124882 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9b253468e263b742e33928f72a574172f10a9ef1f1c7a567ff27075f885de2a\": container with ID starting with e9b253468e263b742e33928f72a574172f10a9ef1f1c7a567ff27075f885de2a not found: ID does not exist" containerID="e9b253468e263b742e33928f72a574172f10a9ef1f1c7a567ff27075f885de2a" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.124935 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9b253468e263b742e33928f72a574172f10a9ef1f1c7a567ff27075f885de2a"} err="failed to get container status \"e9b253468e263b742e33928f72a574172f10a9ef1f1c7a567ff27075f885de2a\": rpc error: code = NotFound desc = could not find container \"e9b253468e263b742e33928f72a574172f10a9ef1f1c7a567ff27075f885de2a\": container with ID starting with e9b253468e263b742e33928f72a574172f10a9ef1f1c7a567ff27075f885de2a not found: ID does not exist" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.124964 4904 scope.go:117] "RemoveContainer" containerID="318da8a719650afe7f68cb588f51c4dfa5a130b651fa3ff7c60839ef4416efcf" Dec 05 20:34:27 crc kubenswrapper[4904]: E1205 20:34:27.125457 4904 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"318da8a719650afe7f68cb588f51c4dfa5a130b651fa3ff7c60839ef4416efcf\": container with ID starting with 318da8a719650afe7f68cb588f51c4dfa5a130b651fa3ff7c60839ef4416efcf not found: ID does not exist" containerID="318da8a719650afe7f68cb588f51c4dfa5a130b651fa3ff7c60839ef4416efcf" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.125491 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"318da8a719650afe7f68cb588f51c4dfa5a130b651fa3ff7c60839ef4416efcf"} err="failed to get container status \"318da8a719650afe7f68cb588f51c4dfa5a130b651fa3ff7c60839ef4416efcf\": rpc error: code = NotFound desc = could not find container \"318da8a719650afe7f68cb588f51c4dfa5a130b651fa3ff7c60839ef4416efcf\": container with ID starting with 318da8a719650afe7f68cb588f51c4dfa5a130b651fa3ff7c60839ef4416efcf not found: ID does not exist" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.125509 4904 scope.go:117] "RemoveContainer" containerID="086339c85abfb7719d031cc0c3c8034fa9e9827c1465bb84238fcc899bca6fc9" Dec 05 20:34:27 crc kubenswrapper[4904]: E1205 20:34:27.126555 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"086339c85abfb7719d031cc0c3c8034fa9e9827c1465bb84238fcc899bca6fc9\": container with ID starting with 086339c85abfb7719d031cc0c3c8034fa9e9827c1465bb84238fcc899bca6fc9 not found: ID does not exist" containerID="086339c85abfb7719d031cc0c3c8034fa9e9827c1465bb84238fcc899bca6fc9" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.126693 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"086339c85abfb7719d031cc0c3c8034fa9e9827c1465bb84238fcc899bca6fc9"} err="failed to get container status \"086339c85abfb7719d031cc0c3c8034fa9e9827c1465bb84238fcc899bca6fc9\": rpc error: code = NotFound desc = could not find container \"086339c85abfb7719d031cc0c3c8034fa9e9827c1465bb84238fcc899bca6fc9\": container with ID starting with 086339c85abfb7719d031cc0c3c8034fa9e9827c1465bb84238fcc899bca6fc9 not found: ID does not exist" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.160903 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7zxl\" (UniqueName: \"kubernetes.io/projected/fbe4d2bf-3297-4b17-acbb-80f5e31c338c-kube-api-access-h7zxl\") pod \"ceilometer-0\" (UID: \"fbe4d2bf-3297-4b17-acbb-80f5e31c338c\") " pod="openstack/ceilometer-0" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.160964 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbe4d2bf-3297-4b17-acbb-80f5e31c338c-run-httpd\") pod \"ceilometer-0\" (UID: \"fbe4d2bf-3297-4b17-acbb-80f5e31c338c\") " pod="openstack/ceilometer-0" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.161024 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbe4d2bf-3297-4b17-acbb-80f5e31c338c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fbe4d2bf-3297-4b17-acbb-80f5e31c338c\") " pod="openstack/ceilometer-0" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.161103 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/fbe4d2bf-3297-4b17-acbb-80f5e31c338c-scripts\") pod \"ceilometer-0\" (UID: \"fbe4d2bf-3297-4b17-acbb-80f5e31c338c\") " pod="openstack/ceilometer-0" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.161135 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fbe4d2bf-3297-4b17-acbb-80f5e31c338c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fbe4d2bf-3297-4b17-acbb-80f5e31c338c\") " pod="openstack/ceilometer-0" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.161169 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbe4d2bf-3297-4b17-acbb-80f5e31c338c-log-httpd\") pod \"ceilometer-0\" (UID: \"fbe4d2bf-3297-4b17-acbb-80f5e31c338c\") " pod="openstack/ceilometer-0" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.161206 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbe4d2bf-3297-4b17-acbb-80f5e31c338c-config-data\") pod \"ceilometer-0\" (UID: \"fbe4d2bf-3297-4b17-acbb-80f5e31c338c\") " pod="openstack/ceilometer-0" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.262679 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbe4d2bf-3297-4b17-acbb-80f5e31c338c-log-httpd\") pod \"ceilometer-0\" (UID: \"fbe4d2bf-3297-4b17-acbb-80f5e31c338c\") " pod="openstack/ceilometer-0" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.262960 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbe4d2bf-3297-4b17-acbb-80f5e31c338c-config-data\") pod \"ceilometer-0\" (UID: \"fbe4d2bf-3297-4b17-acbb-80f5e31c338c\") " pod="openstack/ceilometer-0" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.263011 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7zxl\" (UniqueName: \"kubernetes.io/projected/fbe4d2bf-3297-4b17-acbb-80f5e31c338c-kube-api-access-h7zxl\") pod \"ceilometer-0\" (UID: \"fbe4d2bf-3297-4b17-acbb-80f5e31c338c\") " pod="openstack/ceilometer-0" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.263067 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbe4d2bf-3297-4b17-acbb-80f5e31c338c-run-httpd\") pod \"ceilometer-0\" (UID: \"fbe4d2bf-3297-4b17-acbb-80f5e31c338c\") " pod="openstack/ceilometer-0" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.263102 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbe4d2bf-3297-4b17-acbb-80f5e31c338c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fbe4d2bf-3297-4b17-acbb-80f5e31c338c\") " pod="openstack/ceilometer-0" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.263143 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbe4d2bf-3297-4b17-acbb-80f5e31c338c-scripts\") pod \"ceilometer-0\" (UID: \"fbe4d2bf-3297-4b17-acbb-80f5e31c338c\") " pod="openstack/ceilometer-0" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.263172 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fbe4d2bf-3297-4b17-acbb-80f5e31c338c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fbe4d2bf-3297-4b17-acbb-80f5e31c338c\") " pod="openstack/ceilometer-0" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.263506 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbe4d2bf-3297-4b17-acbb-80f5e31c338c-log-httpd\") pod \"ceilometer-0\" (UID: \"fbe4d2bf-3297-4b17-acbb-80f5e31c338c\") " pod="openstack/ceilometer-0" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.263535 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbe4d2bf-3297-4b17-acbb-80f5e31c338c-run-httpd\") pod \"ceilometer-0\" (UID: \"fbe4d2bf-3297-4b17-acbb-80f5e31c338c\") " pod="openstack/ceilometer-0" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.267814 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fbe4d2bf-3297-4b17-acbb-80f5e31c338c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fbe4d2bf-3297-4b17-acbb-80f5e31c338c\") " pod="openstack/ceilometer-0" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.267892 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbe4d2bf-3297-4b17-acbb-80f5e31c338c-scripts\") pod \"ceilometer-0\" (UID: \"fbe4d2bf-3297-4b17-acbb-80f5e31c338c\") " pod="openstack/ceilometer-0" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.267895 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbe4d2bf-3297-4b17-acbb-80f5e31c338c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fbe4d2bf-3297-4b17-acbb-80f5e31c338c\") " pod="openstack/ceilometer-0" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.268504 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbe4d2bf-3297-4b17-acbb-80f5e31c338c-config-data\") pod \"ceilometer-0\" (UID: \"fbe4d2bf-3297-4b17-acbb-80f5e31c338c\") " pod="openstack/ceilometer-0" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.280118 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7zxl\" (UniqueName: \"kubernetes.io/projected/fbe4d2bf-3297-4b17-acbb-80f5e31c338c-kube-api-access-h7zxl\") pod \"ceilometer-0\" (UID: \"fbe4d2bf-3297-4b17-acbb-80f5e31c338c\") " pod="openstack/ceilometer-0" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.414885 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.702041 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa055141-7081-425f-80c1-9330cd21dc39" path="/var/lib/kubelet/pods/aa055141-7081-425f-80c1-9330cd21dc39/volumes" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.860271 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.860318 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.913218 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.921262 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.955678 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:34:27 crc kubenswrapper[4904]: W1205 20:34:27.969326 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfbe4d2bf_3297_4b17_acbb_80f5e31c338c.slice/crio-7a5e26f19c366fe5751a769f031c0e90b058f007074dd0aea3b794cb033cb00d WatchSource:0}: Error finding container 7a5e26f19c366fe5751a769f031c0e90b058f007074dd0aea3b794cb033cb00d: Status 404 returned error can't find the container with id 7a5e26f19c366fe5751a769f031c0e90b058f007074dd0aea3b794cb033cb00d Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.981477 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fbe4d2bf-3297-4b17-acbb-80f5e31c338c","Type":"ContainerStarted","Data":"7a5e26f19c366fe5751a769f031c0e90b058f007074dd0aea3b794cb033cb00d"} Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.984167 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 05 20:34:27 crc kubenswrapper[4904]: I1205 20:34:27.984205 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 05 20:34:28 crc kubenswrapper[4904]: I1205 20:34:28.273524 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 05 20:34:28 crc kubenswrapper[4904]: I1205 20:34:28.273576 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 05 20:34:28 crc kubenswrapper[4904]: I1205 20:34:28.312795 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 05 20:34:28 crc kubenswrapper[4904]: I1205 20:34:28.342313 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 05 20:34:28 crc kubenswrapper[4904]: I1205 20:34:28.427330 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:34:28 crc kubenswrapper[4904]: I1205 20:34:28.991094 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 05 20:34:28 crc kubenswrapper[4904]: I1205 20:34:28.991400 4904 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 05 20:34:30 crc kubenswrapper[4904]: I1205 20:34:30.001591 4904 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 20:34:30 crc kubenswrapper[4904]: I1205 20:34:30.001617 4904 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 20:34:30 crc kubenswrapper[4904]: I1205 20:34:30.233954 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 05 20:34:30 crc kubenswrapper[4904]: I1205 20:34:30.267730 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 05 20:34:31 crc kubenswrapper[4904]: I1205 20:34:31.362026 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 05 20:34:31 crc kubenswrapper[4904]: I1205 20:34:31.362410 4904 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 20:34:31 crc kubenswrapper[4904]: I1205 20:34:31.445888 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 05 20:34:31 crc kubenswrapper[4904]: I1205 20:34:31.809406 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Dec 05 20:34:31 crc kubenswrapper[4904]: I1205 20:34:31.840861 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Dec 05 20:34:32 crc kubenswrapper[4904]: I1205 20:34:32.021476 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Dec 05 20:34:32 crc kubenswrapper[4904]: I1205 20:34:32.069670 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Dec 05 20:34:35 crc kubenswrapper[4904]: I1205 20:34:35.056396 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fbe4d2bf-3297-4b17-acbb-80f5e31c338c","Type":"ContainerStarted","Data":"1450fa94c4c4281190dc52bef13dc7d539b6b2ad91df0afcaa06cc84346e2bdd"} Dec 05 20:34:35 crc kubenswrapper[4904]: I1205 20:34:35.058183 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5wbq6" event={"ID":"1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93","Type":"ContainerStarted","Data":"5b641c4bef02bf61d3d4d86c900709a3bb8faede9fb0a5d3d324a200a7a104f2"} Dec 05 20:34:35 crc kubenswrapper[4904]: I1205 20:34:35.077125 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-5wbq6" podStartSLOduration=1.759175693 podStartE2EDuration="13.077108546s" podCreationTimestamp="2025-12-05 20:34:22 +0000 UTC" firstStartedPulling="2025-12-05 20:34:23.460892498 +0000 UTC m=+1362.272108607" lastFinishedPulling="2025-12-05 20:34:34.778825351 +0000 UTC m=+1373.590041460" observedRunningTime="2025-12-05 20:34:35.073677746 +0000 UTC m=+1373.884893855" watchObservedRunningTime="2025-12-05 20:34:35.077108546 +0000 UTC m=+1373.888324655" Dec 05 20:34:36 crc kubenswrapper[4904]: I1205 20:34:36.069012 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fbe4d2bf-3297-4b17-acbb-80f5e31c338c","Type":"ContainerStarted","Data":"dd8e0cbef405454618f501334af53c6b7ed974e31a9b315f8b28fb0a7963a521"} Dec 05 20:34:36 crc kubenswrapper[4904]: I1205 
20:34:36.069437 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fbe4d2bf-3297-4b17-acbb-80f5e31c338c","Type":"ContainerStarted","Data":"097578a70335b428050400b0e30dade52d32d6b47e92573c1fa2c84fd3b4c844"} Dec 05 20:34:38 crc kubenswrapper[4904]: I1205 20:34:38.092176 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fbe4d2bf-3297-4b17-acbb-80f5e31c338c","Type":"ContainerStarted","Data":"083dbf6347c90d8f77f32e4b2dd6bf79c9fe6dbf5fdebbee7f78d3a3d7f1ead5"} Dec 05 20:34:38 crc kubenswrapper[4904]: I1205 20:34:38.092708 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 20:34:38 crc kubenswrapper[4904]: I1205 20:34:38.092453 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fbe4d2bf-3297-4b17-acbb-80f5e31c338c" containerName="sg-core" containerID="cri-o://dd8e0cbef405454618f501334af53c6b7ed974e31a9b315f8b28fb0a7963a521" gracePeriod=30 Dec 05 20:34:38 crc kubenswrapper[4904]: I1205 20:34:38.092388 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fbe4d2bf-3297-4b17-acbb-80f5e31c338c" containerName="proxy-httpd" containerID="cri-o://083dbf6347c90d8f77f32e4b2dd6bf79c9fe6dbf5fdebbee7f78d3a3d7f1ead5" gracePeriod=30 Dec 05 20:34:38 crc kubenswrapper[4904]: I1205 20:34:38.092466 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fbe4d2bf-3297-4b17-acbb-80f5e31c338c" containerName="ceilometer-notification-agent" containerID="cri-o://097578a70335b428050400b0e30dade52d32d6b47e92573c1fa2c84fd3b4c844" gracePeriod=30 Dec 05 20:34:38 crc kubenswrapper[4904]: I1205 20:34:38.092840 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fbe4d2bf-3297-4b17-acbb-80f5e31c338c" containerName="ceilometer-central-agent" containerID="cri-o://1450fa94c4c4281190dc52bef13dc7d539b6b2ad91df0afcaa06cc84346e2bdd" gracePeriod=30 Dec 05 20:34:38 crc kubenswrapper[4904]: I1205 20:34:38.117867 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.340779092 podStartE2EDuration="11.117842056s" podCreationTimestamp="2025-12-05 20:34:27 +0000 UTC" firstStartedPulling="2025-12-05 20:34:27.971171871 +0000 UTC m=+1366.782387980" lastFinishedPulling="2025-12-05 20:34:37.748234825 +0000 UTC m=+1376.559450944" observedRunningTime="2025-12-05 20:34:38.1115806 +0000 UTC m=+1376.922796729" watchObservedRunningTime="2025-12-05 20:34:38.117842056 +0000 UTC m=+1376.929058185" Dec 05 20:34:39 crc kubenswrapper[4904]: I1205 20:34:39.104036 4904 generic.go:334] "Generic (PLEG): container finished" podID="fbe4d2bf-3297-4b17-acbb-80f5e31c338c" containerID="083dbf6347c90d8f77f32e4b2dd6bf79c9fe6dbf5fdebbee7f78d3a3d7f1ead5" exitCode=0 Dec 05 20:34:39 crc kubenswrapper[4904]: I1205 20:34:39.104361 4904 generic.go:334] "Generic (PLEG): container finished" podID="fbe4d2bf-3297-4b17-acbb-80f5e31c338c" containerID="dd8e0cbef405454618f501334af53c6b7ed974e31a9b315f8b28fb0a7963a521" exitCode=2 Dec 05 20:34:39 crc kubenswrapper[4904]: I1205 20:34:39.104372 4904 generic.go:334] "Generic (PLEG): container finished" podID="fbe4d2bf-3297-4b17-acbb-80f5e31c338c" containerID="097578a70335b428050400b0e30dade52d32d6b47e92573c1fa2c84fd3b4c844" exitCode=0 Dec 05 20:34:39 crc kubenswrapper[4904]: I1205 20:34:39.104094 
4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fbe4d2bf-3297-4b17-acbb-80f5e31c338c","Type":"ContainerDied","Data":"083dbf6347c90d8f77f32e4b2dd6bf79c9fe6dbf5fdebbee7f78d3a3d7f1ead5"} Dec 05 20:34:39 crc kubenswrapper[4904]: I1205 20:34:39.104410 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fbe4d2bf-3297-4b17-acbb-80f5e31c338c","Type":"ContainerDied","Data":"dd8e0cbef405454618f501334af53c6b7ed974e31a9b315f8b28fb0a7963a521"} Dec 05 20:34:39 crc kubenswrapper[4904]: I1205 20:34:39.104423 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fbe4d2bf-3297-4b17-acbb-80f5e31c338c","Type":"ContainerDied","Data":"097578a70335b428050400b0e30dade52d32d6b47e92573c1fa2c84fd3b4c844"} Dec 05 20:34:43 crc kubenswrapper[4904]: I1205 20:34:43.963440 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.058302 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbe4d2bf-3297-4b17-acbb-80f5e31c338c-combined-ca-bundle\") pod \"fbe4d2bf-3297-4b17-acbb-80f5e31c338c\" (UID: \"fbe4d2bf-3297-4b17-acbb-80f5e31c338c\") " Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.058407 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbe4d2bf-3297-4b17-acbb-80f5e31c338c-scripts\") pod \"fbe4d2bf-3297-4b17-acbb-80f5e31c338c\" (UID: \"fbe4d2bf-3297-4b17-acbb-80f5e31c338c\") " Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.058506 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbe4d2bf-3297-4b17-acbb-80f5e31c338c-log-httpd\") pod \"fbe4d2bf-3297-4b17-acbb-80f5e31c338c\" (UID: \"fbe4d2bf-3297-4b17-acbb-80f5e31c338c\") " Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.058611 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7zxl\" (UniqueName: \"kubernetes.io/projected/fbe4d2bf-3297-4b17-acbb-80f5e31c338c-kube-api-access-h7zxl\") pod \"fbe4d2bf-3297-4b17-acbb-80f5e31c338c\" (UID: \"fbe4d2bf-3297-4b17-acbb-80f5e31c338c\") " Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.058756 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fbe4d2bf-3297-4b17-acbb-80f5e31c338c-sg-core-conf-yaml\") pod \"fbe4d2bf-3297-4b17-acbb-80f5e31c338c\" (UID: \"fbe4d2bf-3297-4b17-acbb-80f5e31c338c\") " Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.058816 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbe4d2bf-3297-4b17-acbb-80f5e31c338c-config-data\") pod \"fbe4d2bf-3297-4b17-acbb-80f5e31c338c\" (UID: \"fbe4d2bf-3297-4b17-acbb-80f5e31c338c\") " Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.058927 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbe4d2bf-3297-4b17-acbb-80f5e31c338c-run-httpd\") pod \"fbe4d2bf-3297-4b17-acbb-80f5e31c338c\" (UID: \"fbe4d2bf-3297-4b17-acbb-80f5e31c338c\") " Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.059445 4904 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbe4d2bf-3297-4b17-acbb-80f5e31c338c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fbe4d2bf-3297-4b17-acbb-80f5e31c338c" (UID: "fbe4d2bf-3297-4b17-acbb-80f5e31c338c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.059891 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbe4d2bf-3297-4b17-acbb-80f5e31c338c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fbe4d2bf-3297-4b17-acbb-80f5e31c338c" (UID: "fbe4d2bf-3297-4b17-acbb-80f5e31c338c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.060483 4904 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbe4d2bf-3297-4b17-acbb-80f5e31c338c-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.060566 4904 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbe4d2bf-3297-4b17-acbb-80f5e31c338c-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.064980 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbe4d2bf-3297-4b17-acbb-80f5e31c338c-scripts" (OuterVolumeSpecName: "scripts") pod "fbe4d2bf-3297-4b17-acbb-80f5e31c338c" (UID: "fbe4d2bf-3297-4b17-acbb-80f5e31c338c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.065853 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbe4d2bf-3297-4b17-acbb-80f5e31c338c-kube-api-access-h7zxl" (OuterVolumeSpecName: "kube-api-access-h7zxl") pod "fbe4d2bf-3297-4b17-acbb-80f5e31c338c" (UID: "fbe4d2bf-3297-4b17-acbb-80f5e31c338c"). InnerVolumeSpecName "kube-api-access-h7zxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.140042 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbe4d2bf-3297-4b17-acbb-80f5e31c338c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fbe4d2bf-3297-4b17-acbb-80f5e31c338c" (UID: "fbe4d2bf-3297-4b17-acbb-80f5e31c338c"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.161802 4904 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fbe4d2bf-3297-4b17-acbb-80f5e31c338c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.162024 4904 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbe4d2bf-3297-4b17-acbb-80f5e31c338c-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.162161 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7zxl\" (UniqueName: \"kubernetes.io/projected/fbe4d2bf-3297-4b17-acbb-80f5e31c338c-kube-api-access-h7zxl\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.166938 4904 generic.go:334] "Generic (PLEG): container finished" podID="fbe4d2bf-3297-4b17-acbb-80f5e31c338c" containerID="1450fa94c4c4281190dc52bef13dc7d539b6b2ad91df0afcaa06cc84346e2bdd" exitCode=0 Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.167001 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fbe4d2bf-3297-4b17-acbb-80f5e31c338c","Type":"ContainerDied","Data":"1450fa94c4c4281190dc52bef13dc7d539b6b2ad91df0afcaa06cc84346e2bdd"} Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.167037 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fbe4d2bf-3297-4b17-acbb-80f5e31c338c","Type":"ContainerDied","Data":"7a5e26f19c366fe5751a769f031c0e90b058f007074dd0aea3b794cb033cb00d"} Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.167170 4904 scope.go:117] "RemoveContainer" containerID="083dbf6347c90d8f77f32e4b2dd6bf79c9fe6dbf5fdebbee7f78d3a3d7f1ead5" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.167280 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.178812 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbe4d2bf-3297-4b17-acbb-80f5e31c338c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fbe4d2bf-3297-4b17-acbb-80f5e31c338c" (UID: "fbe4d2bf-3297-4b17-acbb-80f5e31c338c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.188167 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbe4d2bf-3297-4b17-acbb-80f5e31c338c-config-data" (OuterVolumeSpecName: "config-data") pod "fbe4d2bf-3297-4b17-acbb-80f5e31c338c" (UID: "fbe4d2bf-3297-4b17-acbb-80f5e31c338c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.219536 4904 scope.go:117] "RemoveContainer" containerID="dd8e0cbef405454618f501334af53c6b7ed974e31a9b315f8b28fb0a7963a521" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.248244 4904 scope.go:117] "RemoveContainer" containerID="097578a70335b428050400b0e30dade52d32d6b47e92573c1fa2c84fd3b4c844" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.264664 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbe4d2bf-3297-4b17-acbb-80f5e31c338c-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.264695 4904 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbe4d2bf-3297-4b17-acbb-80f5e31c338c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.271655 4904 scope.go:117] "RemoveContainer" containerID="1450fa94c4c4281190dc52bef13dc7d539b6b2ad91df0afcaa06cc84346e2bdd" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.298789 4904 scope.go:117] "RemoveContainer" containerID="083dbf6347c90d8f77f32e4b2dd6bf79c9fe6dbf5fdebbee7f78d3a3d7f1ead5" Dec 05 20:34:44 crc kubenswrapper[4904]: E1205 20:34:44.299291 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"083dbf6347c90d8f77f32e4b2dd6bf79c9fe6dbf5fdebbee7f78d3a3d7f1ead5\": container with ID starting with 083dbf6347c90d8f77f32e4b2dd6bf79c9fe6dbf5fdebbee7f78d3a3d7f1ead5 not found: ID does not exist" containerID="083dbf6347c90d8f77f32e4b2dd6bf79c9fe6dbf5fdebbee7f78d3a3d7f1ead5" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.299352 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"083dbf6347c90d8f77f32e4b2dd6bf79c9fe6dbf5fdebbee7f78d3a3d7f1ead5"} err="failed to get container status \"083dbf6347c90d8f77f32e4b2dd6bf79c9fe6dbf5fdebbee7f78d3a3d7f1ead5\": rpc error: code = NotFound desc = could not find container \"083dbf6347c90d8f77f32e4b2dd6bf79c9fe6dbf5fdebbee7f78d3a3d7f1ead5\": container with ID starting with 083dbf6347c90d8f77f32e4b2dd6bf79c9fe6dbf5fdebbee7f78d3a3d7f1ead5 not found: ID does not exist" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.299382 4904 scope.go:117] "RemoveContainer" containerID="dd8e0cbef405454618f501334af53c6b7ed974e31a9b315f8b28fb0a7963a521" Dec 05 20:34:44 crc kubenswrapper[4904]: E1205 20:34:44.299794 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd8e0cbef405454618f501334af53c6b7ed974e31a9b315f8b28fb0a7963a521\": container with ID starting with dd8e0cbef405454618f501334af53c6b7ed974e31a9b315f8b28fb0a7963a521 not found: ID does not exist" containerID="dd8e0cbef405454618f501334af53c6b7ed974e31a9b315f8b28fb0a7963a521" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.299832 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd8e0cbef405454618f501334af53c6b7ed974e31a9b315f8b28fb0a7963a521"} err="failed to get container status \"dd8e0cbef405454618f501334af53c6b7ed974e31a9b315f8b28fb0a7963a521\": rpc error: code = NotFound desc = could not find container \"dd8e0cbef405454618f501334af53c6b7ed974e31a9b315f8b28fb0a7963a521\": container with ID starting with 
dd8e0cbef405454618f501334af53c6b7ed974e31a9b315f8b28fb0a7963a521 not found: ID does not exist" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.299858 4904 scope.go:117] "RemoveContainer" containerID="097578a70335b428050400b0e30dade52d32d6b47e92573c1fa2c84fd3b4c844" Dec 05 20:34:44 crc kubenswrapper[4904]: E1205 20:34:44.300166 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"097578a70335b428050400b0e30dade52d32d6b47e92573c1fa2c84fd3b4c844\": container with ID starting with 097578a70335b428050400b0e30dade52d32d6b47e92573c1fa2c84fd3b4c844 not found: ID does not exist" containerID="097578a70335b428050400b0e30dade52d32d6b47e92573c1fa2c84fd3b4c844" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.300185 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"097578a70335b428050400b0e30dade52d32d6b47e92573c1fa2c84fd3b4c844"} err="failed to get container status \"097578a70335b428050400b0e30dade52d32d6b47e92573c1fa2c84fd3b4c844\": rpc error: code = NotFound desc = could not find container \"097578a70335b428050400b0e30dade52d32d6b47e92573c1fa2c84fd3b4c844\": container with ID starting with 097578a70335b428050400b0e30dade52d32d6b47e92573c1fa2c84fd3b4c844 not found: ID does not exist" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.300198 4904 scope.go:117] "RemoveContainer" containerID="1450fa94c4c4281190dc52bef13dc7d539b6b2ad91df0afcaa06cc84346e2bdd" Dec 05 20:34:44 crc kubenswrapper[4904]: E1205 20:34:44.300498 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1450fa94c4c4281190dc52bef13dc7d539b6b2ad91df0afcaa06cc84346e2bdd\": container with ID starting with 1450fa94c4c4281190dc52bef13dc7d539b6b2ad91df0afcaa06cc84346e2bdd not found: ID does not exist" containerID="1450fa94c4c4281190dc52bef13dc7d539b6b2ad91df0afcaa06cc84346e2bdd" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.300522 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1450fa94c4c4281190dc52bef13dc7d539b6b2ad91df0afcaa06cc84346e2bdd"} err="failed to get container status \"1450fa94c4c4281190dc52bef13dc7d539b6b2ad91df0afcaa06cc84346e2bdd\": rpc error: code = NotFound desc = could not find container \"1450fa94c4c4281190dc52bef13dc7d539b6b2ad91df0afcaa06cc84346e2bdd\": container with ID starting with 1450fa94c4c4281190dc52bef13dc7d539b6b2ad91df0afcaa06cc84346e2bdd not found: ID does not exist" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.532487 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.546001 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.560836 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:34:44 crc kubenswrapper[4904]: E1205 20:34:44.561304 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbe4d2bf-3297-4b17-acbb-80f5e31c338c" containerName="sg-core" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.561329 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbe4d2bf-3297-4b17-acbb-80f5e31c338c" containerName="sg-core" Dec 05 20:34:44 crc kubenswrapper[4904]: E1205 20:34:44.561356 4904 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="fbe4d2bf-3297-4b17-acbb-80f5e31c338c" containerName="proxy-httpd" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.561366 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbe4d2bf-3297-4b17-acbb-80f5e31c338c" containerName="proxy-httpd" Dec 05 20:34:44 crc kubenswrapper[4904]: E1205 20:34:44.561381 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbe4d2bf-3297-4b17-acbb-80f5e31c338c" containerName="ceilometer-notification-agent" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.561389 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbe4d2bf-3297-4b17-acbb-80f5e31c338c" containerName="ceilometer-notification-agent" Dec 05 20:34:44 crc kubenswrapper[4904]: E1205 20:34:44.561412 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbe4d2bf-3297-4b17-acbb-80f5e31c338c" containerName="ceilometer-central-agent" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.561420 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbe4d2bf-3297-4b17-acbb-80f5e31c338c" containerName="ceilometer-central-agent" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.561631 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbe4d2bf-3297-4b17-acbb-80f5e31c338c" containerName="sg-core" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.561650 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbe4d2bf-3297-4b17-acbb-80f5e31c338c" containerName="ceilometer-notification-agent" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.561672 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbe4d2bf-3297-4b17-acbb-80f5e31c338c" containerName="proxy-httpd" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.561683 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbe4d2bf-3297-4b17-acbb-80f5e31c338c" containerName="ceilometer-central-agent" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.563796 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.566425 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.566678 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.568388 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8423b46-6592-4473-a1f6-0d61e3b8d09a-config-data\") pod \"ceilometer-0\" (UID: \"a8423b46-6592-4473-a1f6-0d61e3b8d09a\") " pod="openstack/ceilometer-0" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.568536 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8423b46-6592-4473-a1f6-0d61e3b8d09a-scripts\") pod \"ceilometer-0\" (UID: \"a8423b46-6592-4473-a1f6-0d61e3b8d09a\") " pod="openstack/ceilometer-0" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.568642 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a8423b46-6592-4473-a1f6-0d61e3b8d09a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a8423b46-6592-4473-a1f6-0d61e3b8d09a\") " pod="openstack/ceilometer-0" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.568760 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8423b46-6592-4473-a1f6-0d61e3b8d09a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a8423b46-6592-4473-a1f6-0d61e3b8d09a\") " pod="openstack/ceilometer-0" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.568879 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8423b46-6592-4473-a1f6-0d61e3b8d09a-log-httpd\") pod \"ceilometer-0\" (UID: \"a8423b46-6592-4473-a1f6-0d61e3b8d09a\") " pod="openstack/ceilometer-0" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.569078 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8423b46-6592-4473-a1f6-0d61e3b8d09a-run-httpd\") pod \"ceilometer-0\" (UID: \"a8423b46-6592-4473-a1f6-0d61e3b8d09a\") " pod="openstack/ceilometer-0" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.569239 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8lck\" (UniqueName: \"kubernetes.io/projected/a8423b46-6592-4473-a1f6-0d61e3b8d09a-kube-api-access-m8lck\") pod \"ceilometer-0\" (UID: \"a8423b46-6592-4473-a1f6-0d61e3b8d09a\") " pod="openstack/ceilometer-0" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.586732 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.670331 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8423b46-6592-4473-a1f6-0d61e3b8d09a-run-httpd\") pod \"ceilometer-0\" (UID: \"a8423b46-6592-4473-a1f6-0d61e3b8d09a\") " pod="openstack/ceilometer-0" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.670403 4904 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8lck\" (UniqueName: \"kubernetes.io/projected/a8423b46-6592-4473-a1f6-0d61e3b8d09a-kube-api-access-m8lck\") pod \"ceilometer-0\" (UID: \"a8423b46-6592-4473-a1f6-0d61e3b8d09a\") " pod="openstack/ceilometer-0" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.670470 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8423b46-6592-4473-a1f6-0d61e3b8d09a-config-data\") pod \"ceilometer-0\" (UID: \"a8423b46-6592-4473-a1f6-0d61e3b8d09a\") " pod="openstack/ceilometer-0" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.670498 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8423b46-6592-4473-a1f6-0d61e3b8d09a-scripts\") pod \"ceilometer-0\" (UID: \"a8423b46-6592-4473-a1f6-0d61e3b8d09a\") " pod="openstack/ceilometer-0" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.670525 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a8423b46-6592-4473-a1f6-0d61e3b8d09a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a8423b46-6592-4473-a1f6-0d61e3b8d09a\") " pod="openstack/ceilometer-0" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.670555 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8423b46-6592-4473-a1f6-0d61e3b8d09a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a8423b46-6592-4473-a1f6-0d61e3b8d09a\") " pod="openstack/ceilometer-0" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.670583 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8423b46-6592-4473-a1f6-0d61e3b8d09a-log-httpd\") pod \"ceilometer-0\" (UID: \"a8423b46-6592-4473-a1f6-0d61e3b8d09a\") " pod="openstack/ceilometer-0" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.670941 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8423b46-6592-4473-a1f6-0d61e3b8d09a-run-httpd\") pod \"ceilometer-0\" (UID: \"a8423b46-6592-4473-a1f6-0d61e3b8d09a\") " pod="openstack/ceilometer-0" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.671348 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8423b46-6592-4473-a1f6-0d61e3b8d09a-log-httpd\") pod \"ceilometer-0\" (UID: \"a8423b46-6592-4473-a1f6-0d61e3b8d09a\") " pod="openstack/ceilometer-0" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.675387 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a8423b46-6592-4473-a1f6-0d61e3b8d09a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a8423b46-6592-4473-a1f6-0d61e3b8d09a\") " pod="openstack/ceilometer-0" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.675520 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8423b46-6592-4473-a1f6-0d61e3b8d09a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a8423b46-6592-4473-a1f6-0d61e3b8d09a\") " pod="openstack/ceilometer-0" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.675915 4904 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8423b46-6592-4473-a1f6-0d61e3b8d09a-config-data\") pod \"ceilometer-0\" (UID: \"a8423b46-6592-4473-a1f6-0d61e3b8d09a\") " pod="openstack/ceilometer-0" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.676359 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8423b46-6592-4473-a1f6-0d61e3b8d09a-scripts\") pod \"ceilometer-0\" (UID: \"a8423b46-6592-4473-a1f6-0d61e3b8d09a\") " pod="openstack/ceilometer-0" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.692585 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8lck\" (UniqueName: \"kubernetes.io/projected/a8423b46-6592-4473-a1f6-0d61e3b8d09a-kube-api-access-m8lck\") pod \"ceilometer-0\" (UID: \"a8423b46-6592-4473-a1f6-0d61e3b8d09a\") " pod="openstack/ceilometer-0" Dec 05 20:34:44 crc kubenswrapper[4904]: I1205 20:34:44.887649 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 20:34:45 crc kubenswrapper[4904]: I1205 20:34:45.357219 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:34:45 crc kubenswrapper[4904]: W1205 20:34:45.361597 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8423b46_6592_4473_a1f6_0d61e3b8d09a.slice/crio-e093092937ef79084976c8270034331cbfdc5a983a89e7070679d4f4827ab0f1 WatchSource:0}: Error finding container e093092937ef79084976c8270034331cbfdc5a983a89e7070679d4f4827ab0f1: Status 404 returned error can't find the container with id e093092937ef79084976c8270034331cbfdc5a983a89e7070679d4f4827ab0f1 Dec 05 20:34:45 crc kubenswrapper[4904]: I1205 20:34:45.705872 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbe4d2bf-3297-4b17-acbb-80f5e31c338c" path="/var/lib/kubelet/pods/fbe4d2bf-3297-4b17-acbb-80f5e31c338c/volumes" Dec 05 20:34:46 crc kubenswrapper[4904]: I1205 20:34:46.190810 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a8423b46-6592-4473-a1f6-0d61e3b8d09a","Type":"ContainerStarted","Data":"12c8d3f89306fd2a24d119c9525dcd03ab26755d0fa9b9e6fb43ef390d7e7fb5"} Dec 05 20:34:46 crc kubenswrapper[4904]: I1205 20:34:46.191120 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a8423b46-6592-4473-a1f6-0d61e3b8d09a","Type":"ContainerStarted","Data":"dcadc4266bc27c71b9e81673a0e54344fdc9112e681c933be8264b2f6bd9f3eb"} Dec 05 20:34:46 crc kubenswrapper[4904]: I1205 20:34:46.191132 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a8423b46-6592-4473-a1f6-0d61e3b8d09a","Type":"ContainerStarted","Data":"e093092937ef79084976c8270034331cbfdc5a983a89e7070679d4f4827ab0f1"} Dec 05 20:34:47 crc kubenswrapper[4904]: I1205 20:34:47.205810 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a8423b46-6592-4473-a1f6-0d61e3b8d09a","Type":"ContainerStarted","Data":"499c3796dadc470ddbf31415463936086e939f1c31d4415f2b5f46f2f21ea239"} Dec 05 20:34:48 crc kubenswrapper[4904]: I1205 20:34:48.214950 4904 generic.go:334] "Generic (PLEG): container finished" podID="1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93" containerID="5b641c4bef02bf61d3d4d86c900709a3bb8faede9fb0a5d3d324a200a7a104f2" exitCode=0 Dec 05 20:34:48 crc kubenswrapper[4904]: I1205 
20:34:48.214976 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5wbq6" event={"ID":"1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93","Type":"ContainerDied","Data":"5b641c4bef02bf61d3d4d86c900709a3bb8faede9fb0a5d3d324a200a7a104f2"} Dec 05 20:34:48 crc kubenswrapper[4904]: I1205 20:34:48.218248 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a8423b46-6592-4473-a1f6-0d61e3b8d09a","Type":"ContainerStarted","Data":"462dee468f1bd28adb27d72639305a6b995590afc63968ae477bc89b11362f7b"} Dec 05 20:34:48 crc kubenswrapper[4904]: I1205 20:34:48.218365 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 20:34:48 crc kubenswrapper[4904]: I1205 20:34:48.252644 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.686448257 podStartE2EDuration="4.252628823s" podCreationTimestamp="2025-12-05 20:34:44 +0000 UTC" firstStartedPulling="2025-12-05 20:34:45.364559012 +0000 UTC m=+1384.175775121" lastFinishedPulling="2025-12-05 20:34:47.930739578 +0000 UTC m=+1386.741955687" observedRunningTime="2025-12-05 20:34:48.249505671 +0000 UTC m=+1387.060721800" watchObservedRunningTime="2025-12-05 20:34:48.252628823 +0000 UTC m=+1387.063844932" Dec 05 20:34:49 crc kubenswrapper[4904]: I1205 20:34:49.597113 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5wbq6" Dec 05 20:34:49 crc kubenswrapper[4904]: I1205 20:34:49.676036 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93-config-data\") pod \"1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93\" (UID: \"1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93\") " Dec 05 20:34:49 crc kubenswrapper[4904]: I1205 20:34:49.676106 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93-combined-ca-bundle\") pod \"1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93\" (UID: \"1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93\") " Dec 05 20:34:49 crc kubenswrapper[4904]: I1205 20:34:49.676469 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93-scripts\") pod \"1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93\" (UID: \"1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93\") " Dec 05 20:34:49 crc kubenswrapper[4904]: I1205 20:34:49.676588 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzrf4\" (UniqueName: \"kubernetes.io/projected/1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93-kube-api-access-pzrf4\") pod \"1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93\" (UID: \"1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93\") " Dec 05 20:34:49 crc kubenswrapper[4904]: I1205 20:34:49.684251 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93-scripts" (OuterVolumeSpecName: "scripts") pod "1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93" (UID: "1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:49 crc kubenswrapper[4904]: I1205 20:34:49.700865 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93-kube-api-access-pzrf4" (OuterVolumeSpecName: "kube-api-access-pzrf4") pod "1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93" (UID: "1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93"). InnerVolumeSpecName "kube-api-access-pzrf4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:34:49 crc kubenswrapper[4904]: I1205 20:34:49.710255 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93" (UID: "1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:49 crc kubenswrapper[4904]: I1205 20:34:49.716929 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93-config-data" (OuterVolumeSpecName: "config-data") pod "1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93" (UID: "1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:49 crc kubenswrapper[4904]: I1205 20:34:49.779101 4904 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:49 crc kubenswrapper[4904]: I1205 20:34:49.779335 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzrf4\" (UniqueName: \"kubernetes.io/projected/1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93-kube-api-access-pzrf4\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:49 crc kubenswrapper[4904]: I1205 20:34:49.779416 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:49 crc kubenswrapper[4904]: I1205 20:34:49.779477 4904 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:49 crc kubenswrapper[4904]: I1205 20:34:49.988759 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 05 20:34:49 crc kubenswrapper[4904]: I1205 20:34:49.989041 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-decision-engine-0" podUID="28e509d6-d15b-44e6-9afa-05a347c2a7a5" containerName="watcher-decision-engine" containerID="cri-o://61fbc222dea28f32b8df3b258bb0ee5bbd437e2fe35806b5ef58a20b2dd014fb" gracePeriod=30 Dec 05 20:34:50 crc kubenswrapper[4904]: I1205 20:34:50.238243 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5wbq6" event={"ID":"1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93","Type":"ContainerDied","Data":"d50769c0cb264136df4f2ea2379f3c85a49feb229dc6cb836362f36bdd8eae24"} Dec 05 20:34:50 crc kubenswrapper[4904]: I1205 20:34:50.238557 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d50769c0cb264136df4f2ea2379f3c85a49feb229dc6cb836362f36bdd8eae24" Dec 05 20:34:50 crc 
kubenswrapper[4904]: I1205 20:34:50.238310 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5wbq6" Dec 05 20:34:50 crc kubenswrapper[4904]: I1205 20:34:50.396305 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 20:34:50 crc kubenswrapper[4904]: E1205 20:34:50.396793 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93" containerName="nova-cell0-conductor-db-sync" Dec 05 20:34:50 crc kubenswrapper[4904]: I1205 20:34:50.396815 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93" containerName="nova-cell0-conductor-db-sync" Dec 05 20:34:50 crc kubenswrapper[4904]: I1205 20:34:50.397089 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93" containerName="nova-cell0-conductor-db-sync" Dec 05 20:34:50 crc kubenswrapper[4904]: I1205 20:34:50.397863 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 05 20:34:50 crc kubenswrapper[4904]: I1205 20:34:50.400440 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-85qgh" Dec 05 20:34:50 crc kubenswrapper[4904]: I1205 20:34:50.400473 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 05 20:34:50 crc kubenswrapper[4904]: I1205 20:34:50.420805 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 20:34:50 crc kubenswrapper[4904]: I1205 20:34:50.594099 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8538bb66-3a1f-40a2-bf64-b04b49318d34-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"8538bb66-3a1f-40a2-bf64-b04b49318d34\") " pod="openstack/nova-cell0-conductor-0" Dec 05 20:34:50 crc kubenswrapper[4904]: I1205 20:34:50.594184 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74vxs\" (UniqueName: \"kubernetes.io/projected/8538bb66-3a1f-40a2-bf64-b04b49318d34-kube-api-access-74vxs\") pod \"nova-cell0-conductor-0\" (UID: \"8538bb66-3a1f-40a2-bf64-b04b49318d34\") " pod="openstack/nova-cell0-conductor-0" Dec 05 20:34:50 crc kubenswrapper[4904]: I1205 20:34:50.594229 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8538bb66-3a1f-40a2-bf64-b04b49318d34-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"8538bb66-3a1f-40a2-bf64-b04b49318d34\") " pod="openstack/nova-cell0-conductor-0" Dec 05 20:34:50 crc kubenswrapper[4904]: I1205 20:34:50.696348 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8538bb66-3a1f-40a2-bf64-b04b49318d34-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"8538bb66-3a1f-40a2-bf64-b04b49318d34\") " pod="openstack/nova-cell0-conductor-0" Dec 05 20:34:50 crc kubenswrapper[4904]: I1205 20:34:50.698424 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8538bb66-3a1f-40a2-bf64-b04b49318d34-config-data\") pod \"nova-cell0-conductor-0\" (UID: 
\"8538bb66-3a1f-40a2-bf64-b04b49318d34\") " pod="openstack/nova-cell0-conductor-0" Dec 05 20:34:50 crc kubenswrapper[4904]: I1205 20:34:50.698615 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74vxs\" (UniqueName: \"kubernetes.io/projected/8538bb66-3a1f-40a2-bf64-b04b49318d34-kube-api-access-74vxs\") pod \"nova-cell0-conductor-0\" (UID: \"8538bb66-3a1f-40a2-bf64-b04b49318d34\") " pod="openstack/nova-cell0-conductor-0" Dec 05 20:34:50 crc kubenswrapper[4904]: I1205 20:34:50.703827 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8538bb66-3a1f-40a2-bf64-b04b49318d34-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"8538bb66-3a1f-40a2-bf64-b04b49318d34\") " pod="openstack/nova-cell0-conductor-0" Dec 05 20:34:50 crc kubenswrapper[4904]: I1205 20:34:50.706890 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8538bb66-3a1f-40a2-bf64-b04b49318d34-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"8538bb66-3a1f-40a2-bf64-b04b49318d34\") " pod="openstack/nova-cell0-conductor-0" Dec 05 20:34:50 crc kubenswrapper[4904]: I1205 20:34:50.723448 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74vxs\" (UniqueName: \"kubernetes.io/projected/8538bb66-3a1f-40a2-bf64-b04b49318d34-kube-api-access-74vxs\") pod \"nova-cell0-conductor-0\" (UID: \"8538bb66-3a1f-40a2-bf64-b04b49318d34\") " pod="openstack/nova-cell0-conductor-0" Dec 05 20:34:51 crc kubenswrapper[4904]: I1205 20:34:51.015645 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 05 20:34:51 crc kubenswrapper[4904]: I1205 20:34:51.252519 4904 generic.go:334] "Generic (PLEG): container finished" podID="28e509d6-d15b-44e6-9afa-05a347c2a7a5" containerID="61fbc222dea28f32b8df3b258bb0ee5bbd437e2fe35806b5ef58a20b2dd014fb" exitCode=0 Dec 05 20:34:51 crc kubenswrapper[4904]: I1205 20:34:51.252564 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"28e509d6-d15b-44e6-9afa-05a347c2a7a5","Type":"ContainerDied","Data":"61fbc222dea28f32b8df3b258bb0ee5bbd437e2fe35806b5ef58a20b2dd014fb"} Dec 05 20:34:51 crc kubenswrapper[4904]: I1205 20:34:51.252600 4904 scope.go:117] "RemoveContainer" containerID="a90f5442d2ed408c7bb656016094af84b53e976e6775ede9ff9cca02283e3e04" Dec 05 20:34:51 crc kubenswrapper[4904]: I1205 20:34:51.294792 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Dec 05 20:34:51 crc kubenswrapper[4904]: I1205 20:34:51.412260 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28e509d6-d15b-44e6-9afa-05a347c2a7a5-config-data\") pod \"28e509d6-d15b-44e6-9afa-05a347c2a7a5\" (UID: \"28e509d6-d15b-44e6-9afa-05a347c2a7a5\") " Dec 05 20:34:51 crc kubenswrapper[4904]: I1205 20:34:51.412319 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/28e509d6-d15b-44e6-9afa-05a347c2a7a5-custom-prometheus-ca\") pod \"28e509d6-d15b-44e6-9afa-05a347c2a7a5\" (UID: \"28e509d6-d15b-44e6-9afa-05a347c2a7a5\") " Dec 05 20:34:51 crc kubenswrapper[4904]: I1205 20:34:51.412355 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8f54\" (UniqueName: \"kubernetes.io/projected/28e509d6-d15b-44e6-9afa-05a347c2a7a5-kube-api-access-m8f54\") pod \"28e509d6-d15b-44e6-9afa-05a347c2a7a5\" (UID: \"28e509d6-d15b-44e6-9afa-05a347c2a7a5\") " Dec 05 20:34:51 crc kubenswrapper[4904]: I1205 20:34:51.412380 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28e509d6-d15b-44e6-9afa-05a347c2a7a5-logs\") pod \"28e509d6-d15b-44e6-9afa-05a347c2a7a5\" (UID: \"28e509d6-d15b-44e6-9afa-05a347c2a7a5\") " Dec 05 20:34:51 crc kubenswrapper[4904]: I1205 20:34:51.412555 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28e509d6-d15b-44e6-9afa-05a347c2a7a5-combined-ca-bundle\") pod \"28e509d6-d15b-44e6-9afa-05a347c2a7a5\" (UID: \"28e509d6-d15b-44e6-9afa-05a347c2a7a5\") " Dec 05 20:34:51 crc kubenswrapper[4904]: I1205 20:34:51.413339 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28e509d6-d15b-44e6-9afa-05a347c2a7a5-logs" (OuterVolumeSpecName: "logs") pod "28e509d6-d15b-44e6-9afa-05a347c2a7a5" (UID: "28e509d6-d15b-44e6-9afa-05a347c2a7a5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:34:51 crc kubenswrapper[4904]: I1205 20:34:51.421165 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28e509d6-d15b-44e6-9afa-05a347c2a7a5-kube-api-access-m8f54" (OuterVolumeSpecName: "kube-api-access-m8f54") pod "28e509d6-d15b-44e6-9afa-05a347c2a7a5" (UID: "28e509d6-d15b-44e6-9afa-05a347c2a7a5"). InnerVolumeSpecName "kube-api-access-m8f54". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:34:51 crc kubenswrapper[4904]: I1205 20:34:51.445165 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28e509d6-d15b-44e6-9afa-05a347c2a7a5-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "28e509d6-d15b-44e6-9afa-05a347c2a7a5" (UID: "28e509d6-d15b-44e6-9afa-05a347c2a7a5"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:51 crc kubenswrapper[4904]: I1205 20:34:51.448850 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28e509d6-d15b-44e6-9afa-05a347c2a7a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "28e509d6-d15b-44e6-9afa-05a347c2a7a5" (UID: "28e509d6-d15b-44e6-9afa-05a347c2a7a5"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:34:51 crc kubenswrapper[4904]: I1205 20:34:51.469884 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28e509d6-d15b-44e6-9afa-05a347c2a7a5-config-data" (OuterVolumeSpecName: "config-data") pod "28e509d6-d15b-44e6-9afa-05a347c2a7a5" (UID: "28e509d6-d15b-44e6-9afa-05a347c2a7a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:34:51 crc kubenswrapper[4904]: I1205 20:34:51.514571 4904 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28e509d6-d15b-44e6-9afa-05a347c2a7a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 20:34:51 crc kubenswrapper[4904]: I1205 20:34:51.514612 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28e509d6-d15b-44e6-9afa-05a347c2a7a5-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 20:34:51 crc kubenswrapper[4904]: I1205 20:34:51.514624 4904 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/28e509d6-d15b-44e6-9afa-05a347c2a7a5-custom-prometheus-ca\") on node \"crc\" DevicePath \"\""
Dec 05 20:34:51 crc kubenswrapper[4904]: I1205 20:34:51.514636 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8f54\" (UniqueName: \"kubernetes.io/projected/28e509d6-d15b-44e6-9afa-05a347c2a7a5-kube-api-access-m8f54\") on node \"crc\" DevicePath \"\""
Dec 05 20:34:51 crc kubenswrapper[4904]: I1205 20:34:51.514650 4904 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28e509d6-d15b-44e6-9afa-05a347c2a7a5-logs\") on node \"crc\" DevicePath \"\""
Dec 05 20:34:51 crc kubenswrapper[4904]: I1205 20:34:51.559794 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 05 20:34:52 crc kubenswrapper[4904]: I1205 20:34:52.262657 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"8538bb66-3a1f-40a2-bf64-b04b49318d34","Type":"ContainerStarted","Data":"84d081eb6ac0df51cc5b4a0c6fe6900ac47490820b0dc3f8c59d20e53a3663cd"}
Dec 05 20:34:52 crc kubenswrapper[4904]: I1205 20:34:52.262709 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"8538bb66-3a1f-40a2-bf64-b04b49318d34","Type":"ContainerStarted","Data":"4ea62c504f59329487a74517c53109a4bc00f388b5ec7356919edc34a2a3bd56"}
Dec 05 20:34:52 crc kubenswrapper[4904]: I1205 20:34:52.263956 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Dec 05 20:34:52 crc kubenswrapper[4904]: I1205 20:34:52.265337 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"28e509d6-d15b-44e6-9afa-05a347c2a7a5","Type":"ContainerDied","Data":"9cf97b92f3f9b85502353db4d106cfa060774d753bf85e79a8891b9bc516b61f"}
Dec 05 20:34:52 crc kubenswrapper[4904]: I1205 20:34:52.265411 4904 scope.go:117] "RemoveContainer" containerID="61fbc222dea28f32b8df3b258bb0ee5bbd437e2fe35806b5ef58a20b2dd014fb"
Dec 05 20:34:52 crc kubenswrapper[4904]: I1205 20:34:52.265555 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
Dec 05 20:34:52 crc kubenswrapper[4904]: I1205 20:34:52.289254 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.289229638 podStartE2EDuration="2.289229638s" podCreationTimestamp="2025-12-05 20:34:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:34:52.282010428 +0000 UTC m=+1391.093226547" watchObservedRunningTime="2025-12-05 20:34:52.289229638 +0000 UTC m=+1391.100445747"
Dec 05 20:34:52 crc kubenswrapper[4904]: I1205 20:34:52.325601 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"]
Dec 05 20:34:52 crc kubenswrapper[4904]: I1205 20:34:52.335367 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"]
Dec 05 20:34:52 crc kubenswrapper[4904]: I1205 20:34:52.344813 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"]
Dec 05 20:34:52 crc kubenswrapper[4904]: E1205 20:34:52.345385 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28e509d6-d15b-44e6-9afa-05a347c2a7a5" containerName="watcher-decision-engine"
Dec 05 20:34:52 crc kubenswrapper[4904]: I1205 20:34:52.345403 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="28e509d6-d15b-44e6-9afa-05a347c2a7a5" containerName="watcher-decision-engine"
Dec 05 20:34:52 crc kubenswrapper[4904]: E1205 20:34:52.345417 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28e509d6-d15b-44e6-9afa-05a347c2a7a5" containerName="watcher-decision-engine"
Dec 05 20:34:52 crc kubenswrapper[4904]: I1205 20:34:52.345424 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="28e509d6-d15b-44e6-9afa-05a347c2a7a5" containerName="watcher-decision-engine"
Dec 05 20:34:52 crc kubenswrapper[4904]: E1205 20:34:52.345437 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28e509d6-d15b-44e6-9afa-05a347c2a7a5" containerName="watcher-decision-engine"
Dec 05 20:34:52 crc kubenswrapper[4904]: I1205 20:34:52.345444 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="28e509d6-d15b-44e6-9afa-05a347c2a7a5" containerName="watcher-decision-engine"
Dec 05 20:34:52 crc kubenswrapper[4904]: E1205 20:34:52.345455 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28e509d6-d15b-44e6-9afa-05a347c2a7a5" containerName="watcher-decision-engine"
Dec 05 20:34:52 crc kubenswrapper[4904]: I1205 20:34:52.345461 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="28e509d6-d15b-44e6-9afa-05a347c2a7a5" containerName="watcher-decision-engine"
Dec 05 20:34:52 crc kubenswrapper[4904]: I1205 20:34:52.345646 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="28e509d6-d15b-44e6-9afa-05a347c2a7a5" containerName="watcher-decision-engine"
Dec 05 20:34:52 crc kubenswrapper[4904]: I1205 20:34:52.345664 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="28e509d6-d15b-44e6-9afa-05a347c2a7a5" containerName="watcher-decision-engine"
Dec 05 20:34:52 crc kubenswrapper[4904]: I1205 20:34:52.345678 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="28e509d6-d15b-44e6-9afa-05a347c2a7a5" containerName="watcher-decision-engine"
Dec 05 20:34:52 crc kubenswrapper[4904]: I1205 20:34:52.346279 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
Dec 05 20:34:52 crc kubenswrapper[4904]: I1205 20:34:52.348777 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data"
Dec 05 20:34:52 crc kubenswrapper[4904]: I1205 20:34:52.365548 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"]
Dec 05 20:34:52 crc kubenswrapper[4904]: I1205 20:34:52.432969 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e610af6c-e57f-4676-95ec-b8accd64aea9-logs\") pod \"watcher-decision-engine-0\" (UID: \"e610af6c-e57f-4676-95ec-b8accd64aea9\") " pod="openstack/watcher-decision-engine-0"
Dec 05 20:34:52 crc kubenswrapper[4904]: I1205 20:34:52.433036 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-227q4\" (UniqueName: \"kubernetes.io/projected/e610af6c-e57f-4676-95ec-b8accd64aea9-kube-api-access-227q4\") pod \"watcher-decision-engine-0\" (UID: \"e610af6c-e57f-4676-95ec-b8accd64aea9\") " pod="openstack/watcher-decision-engine-0"
Dec 05 20:34:52 crc kubenswrapper[4904]: I1205 20:34:52.433123 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e610af6c-e57f-4676-95ec-b8accd64aea9-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"e610af6c-e57f-4676-95ec-b8accd64aea9\") " pod="openstack/watcher-decision-engine-0"
Dec 05 20:34:52 crc kubenswrapper[4904]: I1205 20:34:52.433190 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e610af6c-e57f-4676-95ec-b8accd64aea9-config-data\") pod \"watcher-decision-engine-0\" (UID: \"e610af6c-e57f-4676-95ec-b8accd64aea9\") " pod="openstack/watcher-decision-engine-0"
Dec 05 20:34:52 crc kubenswrapper[4904]: I1205 20:34:52.433483 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e610af6c-e57f-4676-95ec-b8accd64aea9-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"e610af6c-e57f-4676-95ec-b8accd64aea9\") " pod="openstack/watcher-decision-engine-0"
Dec 05 20:34:52 crc kubenswrapper[4904]: I1205 20:34:52.536678 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e610af6c-e57f-4676-95ec-b8accd64aea9-config-data\") pod \"watcher-decision-engine-0\" (UID: \"e610af6c-e57f-4676-95ec-b8accd64aea9\") " pod="openstack/watcher-decision-engine-0"
Dec 05 20:34:52 crc kubenswrapper[4904]: I1205 20:34:52.537845 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e610af6c-e57f-4676-95ec-b8accd64aea9-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"e610af6c-e57f-4676-95ec-b8accd64aea9\") " pod="openstack/watcher-decision-engine-0"
Dec 05 20:34:52 crc kubenswrapper[4904]: I1205 20:34:52.538246 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e610af6c-e57f-4676-95ec-b8accd64aea9-logs\") pod \"watcher-decision-engine-0\" (UID: \"e610af6c-e57f-4676-95ec-b8accd64aea9\") " pod="openstack/watcher-decision-engine-0"
Dec 05 20:34:52 crc kubenswrapper[4904]: I1205 20:34:52.538313 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-227q4\" (UniqueName: \"kubernetes.io/projected/e610af6c-e57f-4676-95ec-b8accd64aea9-kube-api-access-227q4\") pod \"watcher-decision-engine-0\" (UID: \"e610af6c-e57f-4676-95ec-b8accd64aea9\") " pod="openstack/watcher-decision-engine-0"
Dec 05 20:34:52 crc kubenswrapper[4904]: I1205 20:34:52.538384 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e610af6c-e57f-4676-95ec-b8accd64aea9-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"e610af6c-e57f-4676-95ec-b8accd64aea9\") " pod="openstack/watcher-decision-engine-0"
Dec 05 20:34:52 crc kubenswrapper[4904]: I1205 20:34:52.539186 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e610af6c-e57f-4676-95ec-b8accd64aea9-logs\") pod \"watcher-decision-engine-0\" (UID: \"e610af6c-e57f-4676-95ec-b8accd64aea9\") " pod="openstack/watcher-decision-engine-0"
Dec 05 20:34:52 crc kubenswrapper[4904]: I1205 20:34:52.543896 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e610af6c-e57f-4676-95ec-b8accd64aea9-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"e610af6c-e57f-4676-95ec-b8accd64aea9\") " pod="openstack/watcher-decision-engine-0"
Dec 05 20:34:52 crc kubenswrapper[4904]: I1205 20:34:52.543915 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e610af6c-e57f-4676-95ec-b8accd64aea9-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"e610af6c-e57f-4676-95ec-b8accd64aea9\") " pod="openstack/watcher-decision-engine-0"
Dec 05 20:34:52 crc kubenswrapper[4904]: I1205 20:34:52.554175 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e610af6c-e57f-4676-95ec-b8accd64aea9-config-data\") pod \"watcher-decision-engine-0\" (UID: \"e610af6c-e57f-4676-95ec-b8accd64aea9\") " pod="openstack/watcher-decision-engine-0"
Dec 05 20:34:52 crc kubenswrapper[4904]: I1205 20:34:52.568782 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 05 20:34:52 crc kubenswrapper[4904]: I1205 20:34:52.569257 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a8423b46-6592-4473-a1f6-0d61e3b8d09a" containerName="ceilometer-central-agent" containerID="cri-o://dcadc4266bc27c71b9e81673a0e54344fdc9112e681c933be8264b2f6bd9f3eb" gracePeriod=30
Dec 05 20:34:52 crc kubenswrapper[4904]: I1205 20:34:52.569603 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a8423b46-6592-4473-a1f6-0d61e3b8d09a" containerName="proxy-httpd" containerID="cri-o://462dee468f1bd28adb27d72639305a6b995590afc63968ae477bc89b11362f7b" gracePeriod=30
Dec 05 20:34:52 crc kubenswrapper[4904]: I1205 20:34:52.569720 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a8423b46-6592-4473-a1f6-0d61e3b8d09a" containerName="sg-core" containerID="cri-o://499c3796dadc470ddbf31415463936086e939f1c31d4415f2b5f46f2f21ea239" gracePeriod=30
Dec 05 20:34:52 crc kubenswrapper[4904]: I1205 20:34:52.569740 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a8423b46-6592-4473-a1f6-0d61e3b8d09a" containerName="ceilometer-notification-agent" containerID="cri-o://12c8d3f89306fd2a24d119c9525dcd03ab26755d0fa9b9e6fb43ef390d7e7fb5" gracePeriod=30
Dec 05 20:34:52 crc kubenswrapper[4904]: I1205 20:34:52.582312 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-227q4\" (UniqueName: \"kubernetes.io/projected/e610af6c-e57f-4676-95ec-b8accd64aea9-kube-api-access-227q4\") pod \"watcher-decision-engine-0\" (UID: \"e610af6c-e57f-4676-95ec-b8accd64aea9\") " pod="openstack/watcher-decision-engine-0"
Dec 05 20:34:52 crc kubenswrapper[4904]: I1205 20:34:52.664662 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
Dec 05 20:34:53 crc kubenswrapper[4904]: I1205 20:34:53.123859 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"]
Dec 05 20:34:53 crc kubenswrapper[4904]: I1205 20:34:53.277738 4904 generic.go:334] "Generic (PLEG): container finished" podID="a8423b46-6592-4473-a1f6-0d61e3b8d09a" containerID="462dee468f1bd28adb27d72639305a6b995590afc63968ae477bc89b11362f7b" exitCode=0
Dec 05 20:34:53 crc kubenswrapper[4904]: I1205 20:34:53.278085 4904 generic.go:334] "Generic (PLEG): container finished" podID="a8423b46-6592-4473-a1f6-0d61e3b8d09a" containerID="499c3796dadc470ddbf31415463936086e939f1c31d4415f2b5f46f2f21ea239" exitCode=2
Dec 05 20:34:53 crc kubenswrapper[4904]: I1205 20:34:53.277774 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a8423b46-6592-4473-a1f6-0d61e3b8d09a","Type":"ContainerDied","Data":"462dee468f1bd28adb27d72639305a6b995590afc63968ae477bc89b11362f7b"}
Dec 05 20:34:53 crc kubenswrapper[4904]: I1205 20:34:53.278151 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a8423b46-6592-4473-a1f6-0d61e3b8d09a","Type":"ContainerDied","Data":"499c3796dadc470ddbf31415463936086e939f1c31d4415f2b5f46f2f21ea239"}
Dec 05 20:34:53 crc kubenswrapper[4904]: I1205 20:34:53.279761 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"e610af6c-e57f-4676-95ec-b8accd64aea9","Type":"ContainerStarted","Data":"eeeb9a4ef36b520364be06fdacda655919ef9c61b385ec79a8b731e9e4d855d4"}
Dec 05 20:34:53 crc kubenswrapper[4904]: I1205 20:34:53.711794 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28e509d6-d15b-44e6-9afa-05a347c2a7a5" path="/var/lib/kubelet/pods/28e509d6-d15b-44e6-9afa-05a347c2a7a5/volumes"
Dec 05 20:34:54 crc kubenswrapper[4904]: I1205 20:34:54.296657 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"e610af6c-e57f-4676-95ec-b8accd64aea9","Type":"ContainerStarted","Data":"aa7a4434c5e9aec894693d09184fddf15d0a0722ea6a975aad845d20d332ad72"}
Dec 05 20:34:54 crc kubenswrapper[4904]: I1205 20:34:54.301484 4904 generic.go:334] "Generic (PLEG): container finished" podID="a8423b46-6592-4473-a1f6-0d61e3b8d09a" containerID="12c8d3f89306fd2a24d119c9525dcd03ab26755d0fa9b9e6fb43ef390d7e7fb5" exitCode=0
Dec 05 20:34:54 crc kubenswrapper[4904]: I1205 20:34:54.301768 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a8423b46-6592-4473-a1f6-0d61e3b8d09a","Type":"ContainerDied","Data":"12c8d3f89306fd2a24d119c9525dcd03ab26755d0fa9b9e6fb43ef390d7e7fb5"}
Dec 05 20:34:54 crc kubenswrapper[4904]: I1205 20:34:54.323904 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=2.323886067 podStartE2EDuration="2.323886067s" podCreationTimestamp="2025-12-05 20:34:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:34:54.313571186 +0000 UTC m=+1393.124787325" watchObservedRunningTime="2025-12-05 20:34:54.323886067 +0000 UTC m=+1393.135102176"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.052507 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.064156 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.227721 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8423b46-6592-4473-a1f6-0d61e3b8d09a-scripts\") pod \"a8423b46-6592-4473-a1f6-0d61e3b8d09a\" (UID: \"a8423b46-6592-4473-a1f6-0d61e3b8d09a\") "
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.227837 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8423b46-6592-4473-a1f6-0d61e3b8d09a-combined-ca-bundle\") pod \"a8423b46-6592-4473-a1f6-0d61e3b8d09a\" (UID: \"a8423b46-6592-4473-a1f6-0d61e3b8d09a\") "
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.227890 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8423b46-6592-4473-a1f6-0d61e3b8d09a-config-data\") pod \"a8423b46-6592-4473-a1f6-0d61e3b8d09a\" (UID: \"a8423b46-6592-4473-a1f6-0d61e3b8d09a\") "
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.227995 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8lck\" (UniqueName: \"kubernetes.io/projected/a8423b46-6592-4473-a1f6-0d61e3b8d09a-kube-api-access-m8lck\") pod \"a8423b46-6592-4473-a1f6-0d61e3b8d09a\" (UID: \"a8423b46-6592-4473-a1f6-0d61e3b8d09a\") "
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.228137 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8423b46-6592-4473-a1f6-0d61e3b8d09a-run-httpd\") pod \"a8423b46-6592-4473-a1f6-0d61e3b8d09a\" (UID: \"a8423b46-6592-4473-a1f6-0d61e3b8d09a\") "
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.228169 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8423b46-6592-4473-a1f6-0d61e3b8d09a-log-httpd\") pod \"a8423b46-6592-4473-a1f6-0d61e3b8d09a\" (UID: \"a8423b46-6592-4473-a1f6-0d61e3b8d09a\") "
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.228239 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a8423b46-6592-4473-a1f6-0d61e3b8d09a-sg-core-conf-yaml\") pod \"a8423b46-6592-4473-a1f6-0d61e3b8d09a\" (UID: \"a8423b46-6592-4473-a1f6-0d61e3b8d09a\") "
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.228839 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8423b46-6592-4473-a1f6-0d61e3b8d09a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a8423b46-6592-4473-a1f6-0d61e3b8d09a" (UID: "a8423b46-6592-4473-a1f6-0d61e3b8d09a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.229596 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8423b46-6592-4473-a1f6-0d61e3b8d09a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a8423b46-6592-4473-a1f6-0d61e3b8d09a" (UID: "a8423b46-6592-4473-a1f6-0d61e3b8d09a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.234443 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8423b46-6592-4473-a1f6-0d61e3b8d09a-kube-api-access-m8lck" (OuterVolumeSpecName: "kube-api-access-m8lck") pod "a8423b46-6592-4473-a1f6-0d61e3b8d09a" (UID: "a8423b46-6592-4473-a1f6-0d61e3b8d09a"). InnerVolumeSpecName "kube-api-access-m8lck". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.240472 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8423b46-6592-4473-a1f6-0d61e3b8d09a-scripts" (OuterVolumeSpecName: "scripts") pod "a8423b46-6592-4473-a1f6-0d61e3b8d09a" (UID: "a8423b46-6592-4473-a1f6-0d61e3b8d09a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.256954 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8423b46-6592-4473-a1f6-0d61e3b8d09a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a8423b46-6592-4473-a1f6-0d61e3b8d09a" (UID: "a8423b46-6592-4473-a1f6-0d61e3b8d09a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.320604 4904 generic.go:334] "Generic (PLEG): container finished" podID="a8423b46-6592-4473-a1f6-0d61e3b8d09a" containerID="dcadc4266bc27c71b9e81673a0e54344fdc9112e681c933be8264b2f6bd9f3eb" exitCode=0
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.320661 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a8423b46-6592-4473-a1f6-0d61e3b8d09a","Type":"ContainerDied","Data":"dcadc4266bc27c71b9e81673a0e54344fdc9112e681c933be8264b2f6bd9f3eb"}
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.320693 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a8423b46-6592-4473-a1f6-0d61e3b8d09a","Type":"ContainerDied","Data":"e093092937ef79084976c8270034331cbfdc5a983a89e7070679d4f4827ab0f1"}
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.320714 4904 scope.go:117] "RemoveContainer" containerID="462dee468f1bd28adb27d72639305a6b995590afc63968ae477bc89b11362f7b"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.320887 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.324821 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8423b46-6592-4473-a1f6-0d61e3b8d09a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a8423b46-6592-4473-a1f6-0d61e3b8d09a" (UID: "a8423b46-6592-4473-a1f6-0d61e3b8d09a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.326205 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8423b46-6592-4473-a1f6-0d61e3b8d09a-config-data" (OuterVolumeSpecName: "config-data") pod "a8423b46-6592-4473-a1f6-0d61e3b8d09a" (UID: "a8423b46-6592-4473-a1f6-0d61e3b8d09a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.330457 4904 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8423b46-6592-4473-a1f6-0d61e3b8d09a-run-httpd\") on node \"crc\" DevicePath \"\""
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.330488 4904 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8423b46-6592-4473-a1f6-0d61e3b8d09a-log-httpd\") on node \"crc\" DevicePath \"\""
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.330499 4904 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a8423b46-6592-4473-a1f6-0d61e3b8d09a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.330511 4904 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8423b46-6592-4473-a1f6-0d61e3b8d09a-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.330520 4904 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8423b46-6592-4473-a1f6-0d61e3b8d09a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.330528 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8423b46-6592-4473-a1f6-0d61e3b8d09a-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.330537 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8lck\" (UniqueName: \"kubernetes.io/projected/a8423b46-6592-4473-a1f6-0d61e3b8d09a-kube-api-access-m8lck\") on node \"crc\" DevicePath \"\""
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.355512 4904 scope.go:117] "RemoveContainer" containerID="499c3796dadc470ddbf31415463936086e939f1c31d4415f2b5f46f2f21ea239"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.377786 4904 scope.go:117] "RemoveContainer" containerID="12c8d3f89306fd2a24d119c9525dcd03ab26755d0fa9b9e6fb43ef390d7e7fb5"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.399125 4904 scope.go:117] "RemoveContainer" containerID="dcadc4266bc27c71b9e81673a0e54344fdc9112e681c933be8264b2f6bd9f3eb"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.421869 4904 scope.go:117] "RemoveContainer" containerID="462dee468f1bd28adb27d72639305a6b995590afc63968ae477bc89b11362f7b"
Dec 05 20:34:56 crc kubenswrapper[4904]: E1205 20:34:56.422423 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"462dee468f1bd28adb27d72639305a6b995590afc63968ae477bc89b11362f7b\": container with ID starting with 462dee468f1bd28adb27d72639305a6b995590afc63968ae477bc89b11362f7b not found: ID does not exist" containerID="462dee468f1bd28adb27d72639305a6b995590afc63968ae477bc89b11362f7b"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.422461 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"462dee468f1bd28adb27d72639305a6b995590afc63968ae477bc89b11362f7b"} err="failed to get container status \"462dee468f1bd28adb27d72639305a6b995590afc63968ae477bc89b11362f7b\": rpc error: code = NotFound desc = could not find container \"462dee468f1bd28adb27d72639305a6b995590afc63968ae477bc89b11362f7b\": container with ID starting with 462dee468f1bd28adb27d72639305a6b995590afc63968ae477bc89b11362f7b not found: ID does not exist"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.422518 4904 scope.go:117] "RemoveContainer" containerID="499c3796dadc470ddbf31415463936086e939f1c31d4415f2b5f46f2f21ea239"
Dec 05 20:34:56 crc kubenswrapper[4904]: E1205 20:34:56.422843 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"499c3796dadc470ddbf31415463936086e939f1c31d4415f2b5f46f2f21ea239\": container with ID starting with 499c3796dadc470ddbf31415463936086e939f1c31d4415f2b5f46f2f21ea239 not found: ID does not exist" containerID="499c3796dadc470ddbf31415463936086e939f1c31d4415f2b5f46f2f21ea239"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.422875 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"499c3796dadc470ddbf31415463936086e939f1c31d4415f2b5f46f2f21ea239"} err="failed to get container status \"499c3796dadc470ddbf31415463936086e939f1c31d4415f2b5f46f2f21ea239\": rpc error: code = NotFound desc = could not find container \"499c3796dadc470ddbf31415463936086e939f1c31d4415f2b5f46f2f21ea239\": container with ID starting with 499c3796dadc470ddbf31415463936086e939f1c31d4415f2b5f46f2f21ea239 not found: ID does not exist"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.422920 4904 scope.go:117] "RemoveContainer" containerID="12c8d3f89306fd2a24d119c9525dcd03ab26755d0fa9b9e6fb43ef390d7e7fb5"
Dec 05 20:34:56 crc kubenswrapper[4904]: E1205 20:34:56.423476 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12c8d3f89306fd2a24d119c9525dcd03ab26755d0fa9b9e6fb43ef390d7e7fb5\": container with ID starting with 12c8d3f89306fd2a24d119c9525dcd03ab26755d0fa9b9e6fb43ef390d7e7fb5 not found: ID does not exist" containerID="12c8d3f89306fd2a24d119c9525dcd03ab26755d0fa9b9e6fb43ef390d7e7fb5"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.423534 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12c8d3f89306fd2a24d119c9525dcd03ab26755d0fa9b9e6fb43ef390d7e7fb5"} err="failed to get container status \"12c8d3f89306fd2a24d119c9525dcd03ab26755d0fa9b9e6fb43ef390d7e7fb5\": rpc error: code = NotFound desc = could not find container \"12c8d3f89306fd2a24d119c9525dcd03ab26755d0fa9b9e6fb43ef390d7e7fb5\": container with ID starting with 12c8d3f89306fd2a24d119c9525dcd03ab26755d0fa9b9e6fb43ef390d7e7fb5 not found: ID does not exist"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.423556 4904 scope.go:117] "RemoveContainer" containerID="dcadc4266bc27c71b9e81673a0e54344fdc9112e681c933be8264b2f6bd9f3eb"
Dec 05 20:34:56 crc kubenswrapper[4904]: E1205 20:34:56.423788 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcadc4266bc27c71b9e81673a0e54344fdc9112e681c933be8264b2f6bd9f3eb\": container with ID starting with dcadc4266bc27c71b9e81673a0e54344fdc9112e681c933be8264b2f6bd9f3eb not found: ID does not exist" containerID="dcadc4266bc27c71b9e81673a0e54344fdc9112e681c933be8264b2f6bd9f3eb"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.423826 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcadc4266bc27c71b9e81673a0e54344fdc9112e681c933be8264b2f6bd9f3eb"} err="failed to get container status \"dcadc4266bc27c71b9e81673a0e54344fdc9112e681c933be8264b2f6bd9f3eb\": rpc error: code = NotFound desc = could not find container \"dcadc4266bc27c71b9e81673a0e54344fdc9112e681c933be8264b2f6bd9f3eb\": container with ID starting with dcadc4266bc27c71b9e81673a0e54344fdc9112e681c933be8264b2f6bd9f3eb not found: ID does not exist"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.536397 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-gksp8"]
Dec 05 20:34:56 crc kubenswrapper[4904]: E1205 20:34:56.537228 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8423b46-6592-4473-a1f6-0d61e3b8d09a" containerName="proxy-httpd"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.537253 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8423b46-6592-4473-a1f6-0d61e3b8d09a" containerName="proxy-httpd"
Dec 05 20:34:56 crc kubenswrapper[4904]: E1205 20:34:56.537285 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8423b46-6592-4473-a1f6-0d61e3b8d09a" containerName="ceilometer-notification-agent"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.537295 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8423b46-6592-4473-a1f6-0d61e3b8d09a" containerName="ceilometer-notification-agent"
Dec 05 20:34:56 crc kubenswrapper[4904]: E1205 20:34:56.537317 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8423b46-6592-4473-a1f6-0d61e3b8d09a" containerName="sg-core"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.537325 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8423b46-6592-4473-a1f6-0d61e3b8d09a" containerName="sg-core"
Dec 05 20:34:56 crc kubenswrapper[4904]: E1205 20:34:56.537344 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8423b46-6592-4473-a1f6-0d61e3b8d09a" containerName="ceilometer-central-agent"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.537351 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8423b46-6592-4473-a1f6-0d61e3b8d09a" containerName="ceilometer-central-agent"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.537605 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8423b46-6592-4473-a1f6-0d61e3b8d09a" containerName="ceilometer-notification-agent"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.537649 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="28e509d6-d15b-44e6-9afa-05a347c2a7a5" containerName="watcher-decision-engine"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.537664 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8423b46-6592-4473-a1f6-0d61e3b8d09a" containerName="proxy-httpd"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.537676 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8423b46-6592-4473-a1f6-0d61e3b8d09a" containerName="sg-core"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.537689 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8423b46-6592-4473-a1f6-0d61e3b8d09a" containerName="ceilometer-central-agent"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.538582 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gksp8"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.549595 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-gksp8"]
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.553074 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.553755 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.638094 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9179375b-3935-4856-ae04-76eda9c640e8-config-data\") pod \"nova-cell0-cell-mapping-gksp8\" (UID: \"9179375b-3935-4856-ae04-76eda9c640e8\") " pod="openstack/nova-cell0-cell-mapping-gksp8"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.638183 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9179375b-3935-4856-ae04-76eda9c640e8-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-gksp8\" (UID: \"9179375b-3935-4856-ae04-76eda9c640e8\") " pod="openstack/nova-cell0-cell-mapping-gksp8"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.638224 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9179375b-3935-4856-ae04-76eda9c640e8-scripts\") pod \"nova-cell0-cell-mapping-gksp8\" (UID: \"9179375b-3935-4856-ae04-76eda9c640e8\") " pod="openstack/nova-cell0-cell-mapping-gksp8"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.638257 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s745\" (UniqueName: \"kubernetes.io/projected/9179375b-3935-4856-ae04-76eda9c640e8-kube-api-access-7s745\") pod \"nova-cell0-cell-mapping-gksp8\" (UID: \"9179375b-3935-4856-ae04-76eda9c640e8\") " pod="openstack/nova-cell0-cell-mapping-gksp8"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.678517 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.680409 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.694562 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.711399 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.734607 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.742156 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9179375b-3935-4856-ae04-76eda9c640e8-config-data\") pod \"nova-cell0-cell-mapping-gksp8\" (UID: \"9179375b-3935-4856-ae04-76eda9c640e8\") " pod="openstack/nova-cell0-cell-mapping-gksp8"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.742239 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9179375b-3935-4856-ae04-76eda9c640e8-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-gksp8\" (UID: \"9179375b-3935-4856-ae04-76eda9c640e8\") " pod="openstack/nova-cell0-cell-mapping-gksp8"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.742268 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9179375b-3935-4856-ae04-76eda9c640e8-scripts\") pod \"nova-cell0-cell-mapping-gksp8\" (UID: \"9179375b-3935-4856-ae04-76eda9c640e8\") " pod="openstack/nova-cell0-cell-mapping-gksp8"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.742303 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s745\" (UniqueName: \"kubernetes.io/projected/9179375b-3935-4856-ae04-76eda9c640e8-kube-api-access-7s745\") pod \"nova-cell0-cell-mapping-gksp8\" (UID: \"9179375b-3935-4856-ae04-76eda9c640e8\") " pod="openstack/nova-cell0-cell-mapping-gksp8"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.749220 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.753506 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.776270 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9179375b-3935-4856-ae04-76eda9c640e8-config-data\") pod \"nova-cell0-cell-mapping-gksp8\" (UID: \"9179375b-3935-4856-ae04-76eda9c640e8\") " pod="openstack/nova-cell0-cell-mapping-gksp8"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.776890 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9179375b-3935-4856-ae04-76eda9c640e8-scripts\") pod \"nova-cell0-cell-mapping-gksp8\" (UID: \"9179375b-3935-4856-ae04-76eda9c640e8\") " pod="openstack/nova-cell0-cell-mapping-gksp8"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.777816 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9179375b-3935-4856-ae04-76eda9c640e8-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-gksp8\" (UID: \"9179375b-3935-4856-ae04-76eda9c640e8\") " pod="openstack/nova-cell0-cell-mapping-gksp8"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.845359 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea849d11-eeaa-47ec-8110-b796daf77157-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea849d11-eeaa-47ec-8110-b796daf77157\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.845443 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbrbn\" (UniqueName: \"kubernetes.io/projected/ea849d11-eeaa-47ec-8110-b796daf77157-kube-api-access-xbrbn\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea849d11-eeaa-47ec-8110-b796daf77157\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.845537 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea849d11-eeaa-47ec-8110-b796daf77157-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea849d11-eeaa-47ec-8110-b796daf77157\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.881107 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.894115 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.894510 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.909275 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s745\" (UniqueName: \"kubernetes.io/projected/9179375b-3935-4856-ae04-76eda9c640e8-kube-api-access-7s745\") pod \"nova-cell0-cell-mapping-gksp8\" (UID: \"9179375b-3935-4856-ae04-76eda9c640e8\") " pod="openstack/nova-cell0-cell-mapping-gksp8"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.948130 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63304fc4-9767-4efa-9002-47411daa12bd-config-data\") pod \"ceilometer-0\" (UID: \"63304fc4-9767-4efa-9002-47411daa12bd\") " pod="openstack/ceilometer-0"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.948214 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63304fc4-9767-4efa-9002-47411daa12bd-log-httpd\") pod \"ceilometer-0\" (UID: \"63304fc4-9767-4efa-9002-47411daa12bd\") " pod="openstack/ceilometer-0"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.948274 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea849d11-eeaa-47ec-8110-b796daf77157-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea849d11-eeaa-47ec-8110-b796daf77157\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.948305 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8z8f\" (UniqueName: \"kubernetes.io/projected/63304fc4-9767-4efa-9002-47411daa12bd-kube-api-access-b8z8f\") pod \"ceilometer-0\" (UID: \"63304fc4-9767-4efa-9002-47411daa12bd\") " pod="openstack/ceilometer-0"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.948341 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbrbn\" (UniqueName: \"kubernetes.io/projected/ea849d11-eeaa-47ec-8110-b796daf77157-kube-api-access-xbrbn\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea849d11-eeaa-47ec-8110-b796daf77157\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.948361 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63304fc4-9767-4efa-9002-47411daa12bd-scripts\") pod \"ceilometer-0\" (UID: \"63304fc4-9767-4efa-9002-47411daa12bd\") " pod="openstack/ceilometer-0"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.948385 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/63304fc4-9767-4efa-9002-47411daa12bd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"63304fc4-9767-4efa-9002-47411daa12bd\") " pod="openstack/ceilometer-0"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.948425 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63304fc4-9767-4efa-9002-47411daa12bd-run-httpd\") pod \"ceilometer-0\" (UID: \"63304fc4-9767-4efa-9002-47411daa12bd\") " pod="openstack/ceilometer-0"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.948443 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63304fc4-9767-4efa-9002-47411daa12bd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"63304fc4-9767-4efa-9002-47411daa12bd\") " pod="openstack/ceilometer-0"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.948469 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea849d11-eeaa-47ec-8110-b796daf77157-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea849d11-eeaa-47ec-8110-b796daf77157\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.953531 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea849d11-eeaa-47ec-8110-b796daf77157-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea849d11-eeaa-47ec-8110-b796daf77157\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 05 20:34:56 crc kubenswrapper[4904]: I1205 20:34:56.989167 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea849d11-eeaa-47ec-8110-b796daf77157-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea849d11-eeaa-47ec-8110-b796daf77157\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.000838 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbrbn\" (UniqueName: \"kubernetes.io/projected/ea849d11-eeaa-47ec-8110-b796daf77157-kube-api-access-xbrbn\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea849d11-eeaa-47ec-8110-b796daf77157\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.039783 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.064559 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8z8f\" (UniqueName: \"kubernetes.io/projected/63304fc4-9767-4efa-9002-47411daa12bd-kube-api-access-b8z8f\") pod \"ceilometer-0\" (UID: \"63304fc4-9767-4efa-9002-47411daa12bd\") " pod="openstack/ceilometer-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.064925 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63304fc4-9767-4efa-9002-47411daa12bd-scripts\") pod \"ceilometer-0\" (UID: \"63304fc4-9767-4efa-9002-47411daa12bd\") " pod="openstack/ceilometer-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.064954 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/63304fc4-9767-4efa-9002-47411daa12bd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"63304fc4-9767-4efa-9002-47411daa12bd\") " pod="openstack/ceilometer-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.065050 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63304fc4-9767-4efa-9002-47411daa12bd-run-httpd\") pod \"ceilometer-0\" (UID: \"63304fc4-9767-4efa-9002-47411daa12bd\") " pod="openstack/ceilometer-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.065083 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63304fc4-9767-4efa-9002-47411daa12bd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"63304fc4-9767-4efa-9002-47411daa12bd\") " pod="openstack/ceilometer-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.065218 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63304fc4-9767-4efa-9002-47411daa12bd-config-data\") pod \"ceilometer-0\" (UID: \"63304fc4-9767-4efa-9002-47411daa12bd\") " pod="openstack/ceilometer-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.065319 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63304fc4-9767-4efa-9002-47411daa12bd-log-httpd\") pod \"ceilometer-0\" (UID: \"63304fc4-9767-4efa-9002-47411daa12bd\") " pod="openstack/ceilometer-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.065732 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63304fc4-9767-4efa-9002-47411daa12bd-log-httpd\") pod \"ceilometer-0\" (UID: \"63304fc4-9767-4efa-9002-47411daa12bd\") " pod="openstack/ceilometer-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.065998 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63304fc4-9767-4efa-9002-47411daa12bd-run-httpd\") pod \"ceilometer-0\" (UID: \"63304fc4-9767-4efa-9002-47411daa12bd\") " pod="openstack/ceilometer-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.069363 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63304fc4-9767-4efa-9002-47411daa12bd-scripts\") pod \"ceilometer-0\" (UID: \"63304fc4-9767-4efa-9002-47411daa12bd\") " pod="openstack/ceilometer-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.072154 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63304fc4-9767-4efa-9002-47411daa12bd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"63304fc4-9767-4efa-9002-47411daa12bd\") " pod="openstack/ceilometer-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.072832 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/63304fc4-9767-4efa-9002-47411daa12bd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"63304fc4-9767-4efa-9002-47411daa12bd\") " pod="openstack/ceilometer-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.076774 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.078350 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.079358 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63304fc4-9767-4efa-9002-47411daa12bd-config-data\") pod \"ceilometer-0\" (UID: \"63304fc4-9767-4efa-9002-47411daa12bd\") " pod="openstack/ceilometer-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.087424 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.091742 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8z8f\" (UniqueName: \"kubernetes.io/projected/63304fc4-9767-4efa-9002-47411daa12bd-kube-api-access-b8z8f\") pod \"ceilometer-0\" (UID: \"63304fc4-9767-4efa-9002-47411daa12bd\") " pod="openstack/ceilometer-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.092135 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.102116 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.103448 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.108289 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.115214 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.154577 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.161025 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gksp8"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.175653 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w77tp\" (UniqueName: \"kubernetes.io/projected/5296b4f6-0c56-45d2-9133-42352ca964fc-kube-api-access-w77tp\") pod \"nova-scheduler-0\" (UID: \"5296b4f6-0c56-45d2-9133-42352ca964fc\") " pod="openstack/nova-scheduler-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.175697 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5296b4f6-0c56-45d2-9133-42352ca964fc-config-data\") pod \"nova-scheduler-0\" (UID: \"5296b4f6-0c56-45d2-9133-42352ca964fc\") " pod="openstack/nova-scheduler-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.175789 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nff7v\" (UniqueName: \"kubernetes.io/projected/aa8f2889-4d7d-42a9-b0c8-18de92b7013b-kube-api-access-nff7v\") pod \"nova-metadata-0\" (UID: \"aa8f2889-4d7d-42a9-b0c8-18de92b7013b\") " pod="openstack/nova-metadata-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.175811 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa8f2889-4d7d-42a9-b0c8-18de92b7013b-config-data\") pod \"nova-metadata-0\" (UID: \"aa8f2889-4d7d-42a9-b0c8-18de92b7013b\") " pod="openstack/nova-metadata-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.175868 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa8f2889-4d7d-42a9-b0c8-18de92b7013b-logs\") pod \"nova-metadata-0\" (UID: \"aa8f2889-4d7d-42a9-b0c8-18de92b7013b\") " pod="openstack/nova-metadata-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.175918 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa8f2889-4d7d-42a9-b0c8-18de92b7013b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"aa8f2889-4d7d-42a9-b0c8-18de92b7013b\") " pod="openstack/nova-metadata-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.175942 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5296b4f6-0c56-45d2-9133-42352ca964fc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5296b4f6-0c56-45d2-9133-42352ca964fc\") " pod="openstack/nova-scheduler-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.214130 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.216017 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.221172 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.277082 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58bd8c6ff9-v2kck"]
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.278717 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58bd8c6ff9-v2kck"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.283423 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nff7v\" (UniqueName: \"kubernetes.io/projected/aa8f2889-4d7d-42a9-b0c8-18de92b7013b-kube-api-access-nff7v\") pod \"nova-metadata-0\" (UID: \"aa8f2889-4d7d-42a9-b0c8-18de92b7013b\") " pod="openstack/nova-metadata-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.283465 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa8f2889-4d7d-42a9-b0c8-18de92b7013b-config-data\") pod \"nova-metadata-0\" (UID: \"aa8f2889-4d7d-42a9-b0c8-18de92b7013b\") " pod="openstack/nova-metadata-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.283509 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa8f2889-4d7d-42a9-b0c8-18de92b7013b-logs\") pod \"nova-metadata-0\" (UID: \"aa8f2889-4d7d-42a9-b0c8-18de92b7013b\") " pod="openstack/nova-metadata-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.283550 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa8f2889-4d7d-42a9-b0c8-18de92b7013b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"aa8f2889-4d7d-42a9-b0c8-18de92b7013b\") " pod="openstack/nova-metadata-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.283571 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5296b4f6-0c56-45d2-9133-42352ca964fc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5296b4f6-0c56-45d2-9133-42352ca964fc\") " pod="openstack/nova-scheduler-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.283611 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w77tp\" (UniqueName: \"kubernetes.io/projected/5296b4f6-0c56-45d2-9133-42352ca964fc-kube-api-access-w77tp\") pod \"nova-scheduler-0\" (UID: \"5296b4f6-0c56-45d2-9133-42352ca964fc\") " pod="openstack/nova-scheduler-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.283633 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3db3410b-30ab-44a8-9dc3-924b2c13dc77-logs\") pod \"nova-api-0\" (UID: \"3db3410b-30ab-44a8-9dc3-924b2c13dc77\") " pod="openstack/nova-api-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.283651 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5296b4f6-0c56-45d2-9133-42352ca964fc-config-data\") pod \"nova-scheduler-0\" (UID: \"5296b4f6-0c56-45d2-9133-42352ca964fc\") " pod="openstack/nova-scheduler-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.283676 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w99m\" (UniqueName: \"kubernetes.io/projected/3db3410b-30ab-44a8-9dc3-924b2c13dc77-kube-api-access-4w99m\") pod \"nova-api-0\" (UID: \"3db3410b-30ab-44a8-9dc3-924b2c13dc77\") " pod="openstack/nova-api-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.283708 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3db3410b-30ab-44a8-9dc3-924b2c13dc77-config-data\") pod \"nova-api-0\" (UID: \"3db3410b-30ab-44a8-9dc3-924b2c13dc77\") " pod="openstack/nova-api-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.283725 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db3410b-30ab-44a8-9dc3-924b2c13dc77-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3db3410b-30ab-44a8-9dc3-924b2c13dc77\") " pod="openstack/nova-api-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.284864 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa8f2889-4d7d-42a9-b0c8-18de92b7013b-logs\") pod \"nova-metadata-0\" (UID: \"aa8f2889-4d7d-42a9-b0c8-18de92b7013b\") " pod="openstack/nova-metadata-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.294116 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa8f2889-4d7d-42a9-b0c8-18de92b7013b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"aa8f2889-4d7d-42a9-b0c8-18de92b7013b\") " pod="openstack/nova-metadata-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.299278 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5296b4f6-0c56-45d2-9133-42352ca964fc-config-data\") pod \"nova-scheduler-0\" (UID: \"5296b4f6-0c56-45d2-9133-42352ca964fc\") " pod="openstack/nova-scheduler-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.300547 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa8f2889-4d7d-42a9-b0c8-18de92b7013b-config-data\") pod \"nova-metadata-0\" (UID: \"aa8f2889-4d7d-42a9-b0c8-18de92b7013b\") " pod="openstack/nova-metadata-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.308619 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nff7v\" (UniqueName: \"kubernetes.io/projected/aa8f2889-4d7d-42a9-b0c8-18de92b7013b-kube-api-access-nff7v\") pod \"nova-metadata-0\" (UID: \"aa8f2889-4d7d-42a9-b0c8-18de92b7013b\") " pod="openstack/nova-metadata-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.314960 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5296b4f6-0c56-45d2-9133-42352ca964fc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5296b4f6-0c56-45d2-9133-42352ca964fc\") " pod="openstack/nova-scheduler-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.317589 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w77tp\" (UniqueName: \"kubernetes.io/projected/5296b4f6-0c56-45d2-9133-42352ca964fc-kube-api-access-w77tp\") pod \"nova-scheduler-0\" (UID: \"5296b4f6-0c56-45d2-9133-42352ca964fc\") " pod="openstack/nova-scheduler-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.351562 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.381825 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.389221 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81d2dd5d-de80-4136-879d-faf2a6c4af16-config\") pod \"dnsmasq-dns-58bd8c6ff9-v2kck\" (UID: \"81d2dd5d-de80-4136-879d-faf2a6c4af16\") " pod="openstack/dnsmasq-dns-58bd8c6ff9-v2kck"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.389327 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/81d2dd5d-de80-4136-879d-faf2a6c4af16-dns-swift-storage-0\") pod \"dnsmasq-dns-58bd8c6ff9-v2kck\" (UID: \"81d2dd5d-de80-4136-879d-faf2a6c4af16\") " pod="openstack/dnsmasq-dns-58bd8c6ff9-v2kck"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.389400 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mwgz\" (UniqueName: \"kubernetes.io/projected/81d2dd5d-de80-4136-879d-faf2a6c4af16-kube-api-access-7mwgz\") pod \"dnsmasq-dns-58bd8c6ff9-v2kck\" (UID: \"81d2dd5d-de80-4136-879d-faf2a6c4af16\") " pod="openstack/dnsmasq-dns-58bd8c6ff9-v2kck"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.389432 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81d2dd5d-de80-4136-879d-faf2a6c4af16-dns-svc\") pod \"dnsmasq-dns-58bd8c6ff9-v2kck\" (UID: \"81d2dd5d-de80-4136-879d-faf2a6c4af16\") " pod="openstack/dnsmasq-dns-58bd8c6ff9-v2kck"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.389478 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3db3410b-30ab-44a8-9dc3-924b2c13dc77-logs\") pod \"nova-api-0\" (UID: \"3db3410b-30ab-44a8-9dc3-924b2c13dc77\") " pod="openstack/nova-api-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.389504 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/81d2dd5d-de80-4136-879d-faf2a6c4af16-ovsdbserver-nb\") pod \"dnsmasq-dns-58bd8c6ff9-v2kck\" (UID: \"81d2dd5d-de80-4136-879d-faf2a6c4af16\") " pod="openstack/dnsmasq-dns-58bd8c6ff9-v2kck"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.389534 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w99m\" (UniqueName: \"kubernetes.io/projected/3db3410b-30ab-44a8-9dc3-924b2c13dc77-kube-api-access-4w99m\") pod \"nova-api-0\" (UID: \"3db3410b-30ab-44a8-9dc3-924b2c13dc77\") " pod="openstack/nova-api-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.389578 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3db3410b-30ab-44a8-9dc3-924b2c13dc77-config-data\") pod \"nova-api-0\" (UID: \"3db3410b-30ab-44a8-9dc3-924b2c13dc77\") " pod="openstack/nova-api-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.389600 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db3410b-30ab-44a8-9dc3-924b2c13dc77-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3db3410b-30ab-44a8-9dc3-924b2c13dc77\") " pod="openstack/nova-api-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.389624 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/81d2dd5d-de80-4136-879d-faf2a6c4af16-ovsdbserver-sb\") pod \"dnsmasq-dns-58bd8c6ff9-v2kck\" (UID: \"81d2dd5d-de80-4136-879d-faf2a6c4af16\") " pod="openstack/dnsmasq-dns-58bd8c6ff9-v2kck"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.390186 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3db3410b-30ab-44a8-9dc3-924b2c13dc77-logs\") pod \"nova-api-0\" (UID: \"3db3410b-30ab-44a8-9dc3-924b2c13dc77\") " pod="openstack/nova-api-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.400846 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3db3410b-30ab-44a8-9dc3-924b2c13dc77-config-data\") pod \"nova-api-0\" (UID: \"3db3410b-30ab-44a8-9dc3-924b2c13dc77\") " pod="openstack/nova-api-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.414145 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58bd8c6ff9-v2kck"]
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.414642 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.430508 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.431125 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w99m\" (UniqueName: \"kubernetes.io/projected/3db3410b-30ab-44a8-9dc3-924b2c13dc77-kube-api-access-4w99m\") pod \"nova-api-0\" (UID: \"3db3410b-30ab-44a8-9dc3-924b2c13dc77\") " pod="openstack/nova-api-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.431641 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db3410b-30ab-44a8-9dc3-924b2c13dc77-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3db3410b-30ab-44a8-9dc3-924b2c13dc77\") " pod="openstack/nova-api-0"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.496169 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81d2dd5d-de80-4136-879d-faf2a6c4af16-dns-svc\") pod \"dnsmasq-dns-58bd8c6ff9-v2kck\" (UID: \"81d2dd5d-de80-4136-879d-faf2a6c4af16\") " pod="openstack/dnsmasq-dns-58bd8c6ff9-v2kck"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.496353 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/81d2dd5d-de80-4136-879d-faf2a6c4af16-ovsdbserver-nb\") pod \"dnsmasq-dns-58bd8c6ff9-v2kck\" (UID: \"81d2dd5d-de80-4136-879d-faf2a6c4af16\") " pod="openstack/dnsmasq-dns-58bd8c6ff9-v2kck"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.496564 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/81d2dd5d-de80-4136-879d-faf2a6c4af16-ovsdbserver-sb\") pod \"dnsmasq-dns-58bd8c6ff9-v2kck\" (UID: \"81d2dd5d-de80-4136-879d-faf2a6c4af16\") " pod="openstack/dnsmasq-dns-58bd8c6ff9-v2kck"
Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.496779 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName:
\"kubernetes.io/configmap/81d2dd5d-de80-4136-879d-faf2a6c4af16-config\") pod \"dnsmasq-dns-58bd8c6ff9-v2kck\" (UID: \"81d2dd5d-de80-4136-879d-faf2a6c4af16\") " pod="openstack/dnsmasq-dns-58bd8c6ff9-v2kck" Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.496913 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/81d2dd5d-de80-4136-879d-faf2a6c4af16-dns-swift-storage-0\") pod \"dnsmasq-dns-58bd8c6ff9-v2kck\" (UID: \"81d2dd5d-de80-4136-879d-faf2a6c4af16\") " pod="openstack/dnsmasq-dns-58bd8c6ff9-v2kck" Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.497049 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mwgz\" (UniqueName: \"kubernetes.io/projected/81d2dd5d-de80-4136-879d-faf2a6c4af16-kube-api-access-7mwgz\") pod \"dnsmasq-dns-58bd8c6ff9-v2kck\" (UID: \"81d2dd5d-de80-4136-879d-faf2a6c4af16\") " pod="openstack/dnsmasq-dns-58bd8c6ff9-v2kck" Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.497130 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81d2dd5d-de80-4136-879d-faf2a6c4af16-dns-svc\") pod \"dnsmasq-dns-58bd8c6ff9-v2kck\" (UID: \"81d2dd5d-de80-4136-879d-faf2a6c4af16\") " pod="openstack/dnsmasq-dns-58bd8c6ff9-v2kck" Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.498310 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/81d2dd5d-de80-4136-879d-faf2a6c4af16-ovsdbserver-nb\") pod \"dnsmasq-dns-58bd8c6ff9-v2kck\" (UID: \"81d2dd5d-de80-4136-879d-faf2a6c4af16\") " pod="openstack/dnsmasq-dns-58bd8c6ff9-v2kck" Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.498921 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/81d2dd5d-de80-4136-879d-faf2a6c4af16-ovsdbserver-sb\") pod \"dnsmasq-dns-58bd8c6ff9-v2kck\" (UID: \"81d2dd5d-de80-4136-879d-faf2a6c4af16\") " pod="openstack/dnsmasq-dns-58bd8c6ff9-v2kck" Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.499413 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/81d2dd5d-de80-4136-879d-faf2a6c4af16-dns-swift-storage-0\") pod \"dnsmasq-dns-58bd8c6ff9-v2kck\" (UID: \"81d2dd5d-de80-4136-879d-faf2a6c4af16\") " pod="openstack/dnsmasq-dns-58bd8c6ff9-v2kck" Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.499985 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81d2dd5d-de80-4136-879d-faf2a6c4af16-config\") pod \"dnsmasq-dns-58bd8c6ff9-v2kck\" (UID: \"81d2dd5d-de80-4136-879d-faf2a6c4af16\") " pod="openstack/dnsmasq-dns-58bd8c6ff9-v2kck" Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.526470 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mwgz\" (UniqueName: \"kubernetes.io/projected/81d2dd5d-de80-4136-879d-faf2a6c4af16-kube-api-access-7mwgz\") pod \"dnsmasq-dns-58bd8c6ff9-v2kck\" (UID: \"81d2dd5d-de80-4136-879d-faf2a6c4af16\") " pod="openstack/dnsmasq-dns-58bd8c6ff9-v2kck" Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.610824 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.670836 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58bd8c6ff9-v2kck" Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.696718 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8423b46-6592-4473-a1f6-0d61e3b8d09a" path="/var/lib/kubelet/pods/a8423b46-6592-4473-a1f6-0d61e3b8d09a/volumes" Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.962858 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-gksp8"] Dec 05 20:34:57 crc kubenswrapper[4904]: I1205 20:34:57.975041 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 20:34:58 crc kubenswrapper[4904]: I1205 20:34:58.264597 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:34:58 crc kubenswrapper[4904]: I1205 20:34:58.279483 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 20:34:58 crc kubenswrapper[4904]: I1205 20:34:58.290631 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 20:34:58 crc kubenswrapper[4904]: W1205 20:34:58.299319 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa8f2889_4d7d_42a9_b0c8_18de92b7013b.slice/crio-b57a10c455dd0025b3c4535361f7acdb460c09fcf39a82b699c3177687519ce1 WatchSource:0}: Error finding container b57a10c455dd0025b3c4535361f7acdb460c09fcf39a82b699c3177687519ce1: Status 404 returned error can't find the container with id b57a10c455dd0025b3c4535361f7acdb460c09fcf39a82b699c3177687519ce1 Dec 05 20:34:58 crc kubenswrapper[4904]: I1205 20:34:58.299562 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 20:34:58 crc kubenswrapper[4904]: I1205 20:34:58.362140 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-gtcmj"] Dec 05 20:34:58 crc kubenswrapper[4904]: I1205 20:34:58.363559 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-gtcmj" Dec 05 20:34:58 crc kubenswrapper[4904]: I1205 20:34:58.365793 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 05 20:34:58 crc kubenswrapper[4904]: I1205 20:34:58.365979 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 05 20:34:58 crc kubenswrapper[4904]: I1205 20:34:58.383957 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-gtcmj"] Dec 05 20:34:58 crc kubenswrapper[4904]: I1205 20:34:58.419869 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3db3410b-30ab-44a8-9dc3-924b2c13dc77","Type":"ContainerStarted","Data":"bb96c601aebe32308a0726220834b4e6868db72213ae0fa72745d751fb86848d"} Dec 05 20:34:58 crc kubenswrapper[4904]: I1205 20:34:58.421447 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gksp8" event={"ID":"9179375b-3935-4856-ae04-76eda9c640e8","Type":"ContainerStarted","Data":"e340709f8fedd38a304878a78a010292189f0207c0d346ba8f16ce657c950188"} Dec 05 20:34:58 crc kubenswrapper[4904]: I1205 20:34:58.421474 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gksp8" event={"ID":"9179375b-3935-4856-ae04-76eda9c640e8","Type":"ContainerStarted","Data":"f96ad8acdbb6d72b5999dc98e5a7eb101af1130249d73df03138a491ba78b6b1"} Dec 05 20:34:58 crc kubenswrapper[4904]: I1205 20:34:58.430329 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aa8f2889-4d7d-42a9-b0c8-18de92b7013b","Type":"ContainerStarted","Data":"b57a10c455dd0025b3c4535361f7acdb460c09fcf39a82b699c3177687519ce1"} Dec 05 20:34:58 crc kubenswrapper[4904]: I1205 20:34:58.446538 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63304fc4-9767-4efa-9002-47411daa12bd","Type":"ContainerStarted","Data":"912b2b42c13047f3df2954502536ca7c3adc090216b1a37a1dd35ab189c4e399"} Dec 05 20:34:58 crc kubenswrapper[4904]: I1205 20:34:58.449110 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-gksp8" podStartSLOduration=2.4490911730000002 podStartE2EDuration="2.449091173s" podCreationTimestamp="2025-12-05 20:34:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:34:58.446601094 +0000 UTC m=+1397.257817203" watchObservedRunningTime="2025-12-05 20:34:58.449091173 +0000 UTC m=+1397.260307282" Dec 05 20:34:58 crc kubenswrapper[4904]: I1205 20:34:58.457807 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ea849d11-eeaa-47ec-8110-b796daf77157","Type":"ContainerStarted","Data":"a5444140aba7d83cb4d0e96ccdb9cf9e21f7678894c5e8e68feea76e2b07e427"} Dec 05 20:34:58 crc kubenswrapper[4904]: I1205 20:34:58.462636 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5296b4f6-0c56-45d2-9133-42352ca964fc","Type":"ContainerStarted","Data":"c1eb04857839ff24f0df0222ddc0efe7deefb5c3376d87ec4127dcecd610ad6e"} Dec 05 20:34:58 crc kubenswrapper[4904]: I1205 20:34:58.467070 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/9eb22604-2972-49be-8650-7d10a049f6a1-scripts\") pod \"nova-cell1-conductor-db-sync-gtcmj\" (UID: \"9eb22604-2972-49be-8650-7d10a049f6a1\") " pod="openstack/nova-cell1-conductor-db-sync-gtcmj" Dec 05 20:34:58 crc kubenswrapper[4904]: I1205 20:34:58.467288 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9eb22604-2972-49be-8650-7d10a049f6a1-config-data\") pod \"nova-cell1-conductor-db-sync-gtcmj\" (UID: \"9eb22604-2972-49be-8650-7d10a049f6a1\") " pod="openstack/nova-cell1-conductor-db-sync-gtcmj" Dec 05 20:34:58 crc kubenswrapper[4904]: I1205 20:34:58.467372 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9eb22604-2972-49be-8650-7d10a049f6a1-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-gtcmj\" (UID: \"9eb22604-2972-49be-8650-7d10a049f6a1\") " pod="openstack/nova-cell1-conductor-db-sync-gtcmj" Dec 05 20:34:58 crc kubenswrapper[4904]: I1205 20:34:58.467541 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htrnb\" (UniqueName: \"kubernetes.io/projected/9eb22604-2972-49be-8650-7d10a049f6a1-kube-api-access-htrnb\") pod \"nova-cell1-conductor-db-sync-gtcmj\" (UID: \"9eb22604-2972-49be-8650-7d10a049f6a1\") " pod="openstack/nova-cell1-conductor-db-sync-gtcmj" Dec 05 20:34:58 crc kubenswrapper[4904]: I1205 20:34:58.504194 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58bd8c6ff9-v2kck"] Dec 05 20:34:58 crc kubenswrapper[4904]: I1205 20:34:58.569883 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htrnb\" (UniqueName: \"kubernetes.io/projected/9eb22604-2972-49be-8650-7d10a049f6a1-kube-api-access-htrnb\") pod \"nova-cell1-conductor-db-sync-gtcmj\" (UID: \"9eb22604-2972-49be-8650-7d10a049f6a1\") " pod="openstack/nova-cell1-conductor-db-sync-gtcmj" Dec 05 20:34:58 crc kubenswrapper[4904]: I1205 20:34:58.569986 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9eb22604-2972-49be-8650-7d10a049f6a1-scripts\") pod \"nova-cell1-conductor-db-sync-gtcmj\" (UID: \"9eb22604-2972-49be-8650-7d10a049f6a1\") " pod="openstack/nova-cell1-conductor-db-sync-gtcmj" Dec 05 20:34:58 crc kubenswrapper[4904]: I1205 20:34:58.570152 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9eb22604-2972-49be-8650-7d10a049f6a1-config-data\") pod \"nova-cell1-conductor-db-sync-gtcmj\" (UID: \"9eb22604-2972-49be-8650-7d10a049f6a1\") " pod="openstack/nova-cell1-conductor-db-sync-gtcmj" Dec 05 20:34:58 crc kubenswrapper[4904]: I1205 20:34:58.570243 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9eb22604-2972-49be-8650-7d10a049f6a1-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-gtcmj\" (UID: \"9eb22604-2972-49be-8650-7d10a049f6a1\") " pod="openstack/nova-cell1-conductor-db-sync-gtcmj" Dec 05 20:34:58 crc kubenswrapper[4904]: I1205 20:34:58.593782 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9eb22604-2972-49be-8650-7d10a049f6a1-scripts\") pod \"nova-cell1-conductor-db-sync-gtcmj\" (UID: 
\"9eb22604-2972-49be-8650-7d10a049f6a1\") " pod="openstack/nova-cell1-conductor-db-sync-gtcmj" Dec 05 20:34:58 crc kubenswrapper[4904]: I1205 20:34:58.594288 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9eb22604-2972-49be-8650-7d10a049f6a1-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-gtcmj\" (UID: \"9eb22604-2972-49be-8650-7d10a049f6a1\") " pod="openstack/nova-cell1-conductor-db-sync-gtcmj" Dec 05 20:34:58 crc kubenswrapper[4904]: I1205 20:34:58.601875 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9eb22604-2972-49be-8650-7d10a049f6a1-config-data\") pod \"nova-cell1-conductor-db-sync-gtcmj\" (UID: \"9eb22604-2972-49be-8650-7d10a049f6a1\") " pod="openstack/nova-cell1-conductor-db-sync-gtcmj" Dec 05 20:34:58 crc kubenswrapper[4904]: I1205 20:34:58.605710 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htrnb\" (UniqueName: \"kubernetes.io/projected/9eb22604-2972-49be-8650-7d10a049f6a1-kube-api-access-htrnb\") pod \"nova-cell1-conductor-db-sync-gtcmj\" (UID: \"9eb22604-2972-49be-8650-7d10a049f6a1\") " pod="openstack/nova-cell1-conductor-db-sync-gtcmj" Dec 05 20:34:58 crc kubenswrapper[4904]: I1205 20:34:58.705196 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-gtcmj" Dec 05 20:34:59 crc kubenswrapper[4904]: I1205 20:34:59.267791 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-gtcmj"] Dec 05 20:34:59 crc kubenswrapper[4904]: W1205 20:34:59.268387 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9eb22604_2972_49be_8650_7d10a049f6a1.slice/crio-84da7cf9a6d688f28269421af9fdc5161954e087f27412961db8919b5281ae1d WatchSource:0}: Error finding container 84da7cf9a6d688f28269421af9fdc5161954e087f27412961db8919b5281ae1d: Status 404 returned error can't find the container with id 84da7cf9a6d688f28269421af9fdc5161954e087f27412961db8919b5281ae1d Dec 05 20:34:59 crc kubenswrapper[4904]: I1205 20:34:59.483019 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63304fc4-9767-4efa-9002-47411daa12bd","Type":"ContainerStarted","Data":"631d4e7aeb6d777251fa6228610a1bfc779dce8db0b8a71d7b9a455ef5aad145"} Dec 05 20:34:59 crc kubenswrapper[4904]: I1205 20:34:59.483405 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63304fc4-9767-4efa-9002-47411daa12bd","Type":"ContainerStarted","Data":"f96a3e12ac921544aa396c27abc113e3e88a8d7eccf35601b6d15a751b7ef25e"} Dec 05 20:34:59 crc kubenswrapper[4904]: I1205 20:34:59.500169 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-gtcmj" event={"ID":"9eb22604-2972-49be-8650-7d10a049f6a1","Type":"ContainerStarted","Data":"84da7cf9a6d688f28269421af9fdc5161954e087f27412961db8919b5281ae1d"} Dec 05 20:34:59 crc kubenswrapper[4904]: I1205 20:34:59.511575 4904 generic.go:334] "Generic (PLEG): container finished" podID="81d2dd5d-de80-4136-879d-faf2a6c4af16" containerID="d2f232c93cb3e4e4268e4b753b952516e49808cd8f34e03bdb66945b5d2f0dd6" exitCode=0 Dec 05 20:34:59 crc kubenswrapper[4904]: I1205 20:34:59.512224 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58bd8c6ff9-v2kck" 
event={"ID":"81d2dd5d-de80-4136-879d-faf2a6c4af16","Type":"ContainerDied","Data":"d2f232c93cb3e4e4268e4b753b952516e49808cd8f34e03bdb66945b5d2f0dd6"} Dec 05 20:34:59 crc kubenswrapper[4904]: I1205 20:34:59.512364 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58bd8c6ff9-v2kck" event={"ID":"81d2dd5d-de80-4136-879d-faf2a6c4af16","Type":"ContainerStarted","Data":"2b284fc5bb19faa329706fe95acc7d4785a7475229d6aa64c76241267caa1608"} Dec 05 20:35:00 crc kubenswrapper[4904]: I1205 20:35:00.484718 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 20:35:00 crc kubenswrapper[4904]: I1205 20:35:00.493454 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 20:35:00 crc kubenswrapper[4904]: I1205 20:35:00.531018 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-gtcmj" event={"ID":"9eb22604-2972-49be-8650-7d10a049f6a1","Type":"ContainerStarted","Data":"a0846ba4670e5481c10e3461c932d9f5e3d5a6f8df5d8bc05a0058c74f0e1cc1"} Dec 05 20:35:00 crc kubenswrapper[4904]: I1205 20:35:00.552518 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-gtcmj" podStartSLOduration=2.5525016000000003 podStartE2EDuration="2.5525016s" podCreationTimestamp="2025-12-05 20:34:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:35:00.549018558 +0000 UTC m=+1399.360234667" watchObservedRunningTime="2025-12-05 20:35:00.5525016 +0000 UTC m=+1399.363717709" Dec 05 20:35:01 crc kubenswrapper[4904]: I1205 20:35:01.545764 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58bd8c6ff9-v2kck" event={"ID":"81d2dd5d-de80-4136-879d-faf2a6c4af16","Type":"ContainerStarted","Data":"e20b1c5635af73ad3f022b2250a6d02a0f2beaac421118e9022ea3e53ce02e30"} Dec 05 20:35:01 crc kubenswrapper[4904]: I1205 20:35:01.578144 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58bd8c6ff9-v2kck" podStartSLOduration=5.578119448 podStartE2EDuration="5.578119448s" podCreationTimestamp="2025-12-05 20:34:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:35:01.566339073 +0000 UTC m=+1400.377555192" watchObservedRunningTime="2025-12-05 20:35:01.578119448 +0000 UTC m=+1400.389335567" Dec 05 20:35:02 crc kubenswrapper[4904]: I1205 20:35:02.578743 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58bd8c6ff9-v2kck" Dec 05 20:35:02 crc kubenswrapper[4904]: I1205 20:35:02.664940 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Dec 05 20:35:02 crc kubenswrapper[4904]: I1205 20:35:02.727415 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Dec 05 20:35:03 crc kubenswrapper[4904]: I1205 20:35:03.589222 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aa8f2889-4d7d-42a9-b0c8-18de92b7013b","Type":"ContainerStarted","Data":"948adba2663b1f764347d3193c0b98b45aa5dbc8b31268510ccdb9e3a7e3c65e"} Dec 05 20:35:03 crc kubenswrapper[4904]: I1205 20:35:03.589541 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"aa8f2889-4d7d-42a9-b0c8-18de92b7013b","Type":"ContainerStarted","Data":"e1f3722d9c798695dfe422872831353caa725acd8632c767f2dbc0d9acd96a2d"} Dec 05 20:35:03 crc kubenswrapper[4904]: I1205 20:35:03.589564 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="aa8f2889-4d7d-42a9-b0c8-18de92b7013b" containerName="nova-metadata-log" containerID="cri-o://e1f3722d9c798695dfe422872831353caa725acd8632c767f2dbc0d9acd96a2d" gracePeriod=30 Dec 05 20:35:03 crc kubenswrapper[4904]: I1205 20:35:03.589697 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="aa8f2889-4d7d-42a9-b0c8-18de92b7013b" containerName="nova-metadata-metadata" containerID="cri-o://948adba2663b1f764347d3193c0b98b45aa5dbc8b31268510ccdb9e3a7e3c65e" gracePeriod=30 Dec 05 20:35:03 crc kubenswrapper[4904]: I1205 20:35:03.592719 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63304fc4-9767-4efa-9002-47411daa12bd","Type":"ContainerStarted","Data":"8406e73ff67ca3784af13fccc2f488ac9cbc5f0fcb50a9dcb2dc0c90545a7a9e"} Dec 05 20:35:03 crc kubenswrapper[4904]: I1205 20:35:03.595267 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ea849d11-eeaa-47ec-8110-b796daf77157","Type":"ContainerStarted","Data":"a8a39402970cedec8b2f9e2e0cb4c89d6905421e8c40a99e7e7b5caf851c784d"} Dec 05 20:35:03 crc kubenswrapper[4904]: I1205 20:35:03.595326 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="ea849d11-eeaa-47ec-8110-b796daf77157" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://a8a39402970cedec8b2f9e2e0cb4c89d6905421e8c40a99e7e7b5caf851c784d" gracePeriod=30 Dec 05 20:35:03 crc kubenswrapper[4904]: I1205 20:35:03.597236 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5296b4f6-0c56-45d2-9133-42352ca964fc","Type":"ContainerStarted","Data":"f0f9ae5c03aec2962370e8f71363d77b279f1c0ebd2e18773180fb5f37d9361d"} Dec 05 20:35:03 crc kubenswrapper[4904]: I1205 20:35:03.600995 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3db3410b-30ab-44a8-9dc3-924b2c13dc77","Type":"ContainerStarted","Data":"0bb1a2bd0e5331883b5085f14049875d0afb99092c05dca515f17959223e1564"} Dec 05 20:35:03 crc kubenswrapper[4904]: I1205 20:35:03.601039 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3db3410b-30ab-44a8-9dc3-924b2c13dc77","Type":"ContainerStarted","Data":"82ab3f46917ee4f38f68ebf635b3155ed84f30b9595c38e3ef756e1ac61c0b37"} Dec 05 20:35:03 crc kubenswrapper[4904]: I1205 20:35:03.601631 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Dec 05 20:35:03 crc kubenswrapper[4904]: I1205 20:35:03.617492 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.57181605 podStartE2EDuration="7.617463607s" podCreationTimestamp="2025-12-05 20:34:56 +0000 UTC" firstStartedPulling="2025-12-05 20:34:58.305169027 +0000 UTC m=+1397.116385136" lastFinishedPulling="2025-12-05 20:35:02.350816584 +0000 UTC m=+1401.162032693" observedRunningTime="2025-12-05 20:35:03.616986126 +0000 UTC m=+1402.428202245" watchObservedRunningTime="2025-12-05 20:35:03.617463607 +0000 UTC m=+1402.428679716" Dec 05 20:35:03 crc 
kubenswrapper[4904]: I1205 20:35:03.649590 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Dec 05 20:35:03 crc kubenswrapper[4904]: I1205 20:35:03.659640 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.6102943720000003 podStartE2EDuration="7.659619043s" podCreationTimestamp="2025-12-05 20:34:56 +0000 UTC" firstStartedPulling="2025-12-05 20:34:58.286131843 +0000 UTC m=+1397.097347952" lastFinishedPulling="2025-12-05 20:35:02.335456514 +0000 UTC m=+1401.146672623" observedRunningTime="2025-12-05 20:35:03.638358205 +0000 UTC m=+1402.449574314" watchObservedRunningTime="2025-12-05 20:35:03.659619043 +0000 UTC m=+1402.470835152" Dec 05 20:35:03 crc kubenswrapper[4904]: I1205 20:35:03.660145 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.268173702 podStartE2EDuration="7.660139605s" podCreationTimestamp="2025-12-05 20:34:56 +0000 UTC" firstStartedPulling="2025-12-05 20:34:57.959421544 +0000 UTC m=+1396.770637653" lastFinishedPulling="2025-12-05 20:35:02.351387447 +0000 UTC m=+1401.162603556" observedRunningTime="2025-12-05 20:35:03.656008239 +0000 UTC m=+1402.467224358" watchObservedRunningTime="2025-12-05 20:35:03.660139605 +0000 UTC m=+1402.471355714" Dec 05 20:35:03 crc kubenswrapper[4904]: I1205 20:35:03.695848 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.681877355 podStartE2EDuration="7.695805899s" podCreationTimestamp="2025-12-05 20:34:56 +0000 UTC" firstStartedPulling="2025-12-05 20:34:58.343628747 +0000 UTC m=+1397.154844856" lastFinishedPulling="2025-12-05 20:35:02.357557281 +0000 UTC m=+1401.168773400" observedRunningTime="2025-12-05 20:35:03.680126092 +0000 UTC m=+1402.491342211" watchObservedRunningTime="2025-12-05 20:35:03.695805899 +0000 UTC m=+1402.507022008" Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.221989 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.323428 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa8f2889-4d7d-42a9-b0c8-18de92b7013b-config-data\") pod \"aa8f2889-4d7d-42a9-b0c8-18de92b7013b\" (UID: \"aa8f2889-4d7d-42a9-b0c8-18de92b7013b\") " Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.323489 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa8f2889-4d7d-42a9-b0c8-18de92b7013b-combined-ca-bundle\") pod \"aa8f2889-4d7d-42a9-b0c8-18de92b7013b\" (UID: \"aa8f2889-4d7d-42a9-b0c8-18de92b7013b\") " Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.323510 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nff7v\" (UniqueName: \"kubernetes.io/projected/aa8f2889-4d7d-42a9-b0c8-18de92b7013b-kube-api-access-nff7v\") pod \"aa8f2889-4d7d-42a9-b0c8-18de92b7013b\" (UID: \"aa8f2889-4d7d-42a9-b0c8-18de92b7013b\") " Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.323605 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa8f2889-4d7d-42a9-b0c8-18de92b7013b-logs\") pod \"aa8f2889-4d7d-42a9-b0c8-18de92b7013b\" (UID: \"aa8f2889-4d7d-42a9-b0c8-18de92b7013b\") " Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.327367 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa8f2889-4d7d-42a9-b0c8-18de92b7013b-logs" (OuterVolumeSpecName: "logs") pod "aa8f2889-4d7d-42a9-b0c8-18de92b7013b" (UID: "aa8f2889-4d7d-42a9-b0c8-18de92b7013b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.333264 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa8f2889-4d7d-42a9-b0c8-18de92b7013b-kube-api-access-nff7v" (OuterVolumeSpecName: "kube-api-access-nff7v") pod "aa8f2889-4d7d-42a9-b0c8-18de92b7013b" (UID: "aa8f2889-4d7d-42a9-b0c8-18de92b7013b"). InnerVolumeSpecName "kube-api-access-nff7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.368096 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa8f2889-4d7d-42a9-b0c8-18de92b7013b-config-data" (OuterVolumeSpecName: "config-data") pod "aa8f2889-4d7d-42a9-b0c8-18de92b7013b" (UID: "aa8f2889-4d7d-42a9-b0c8-18de92b7013b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.370150 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa8f2889-4d7d-42a9-b0c8-18de92b7013b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa8f2889-4d7d-42a9-b0c8-18de92b7013b" (UID: "aa8f2889-4d7d-42a9-b0c8-18de92b7013b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.425929 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa8f2889-4d7d-42a9-b0c8-18de92b7013b-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.425967 4904 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa8f2889-4d7d-42a9-b0c8-18de92b7013b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.425980 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nff7v\" (UniqueName: \"kubernetes.io/projected/aa8f2889-4d7d-42a9-b0c8-18de92b7013b-kube-api-access-nff7v\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.425989 4904 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa8f2889-4d7d-42a9-b0c8-18de92b7013b-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.612600 4904 generic.go:334] "Generic (PLEG): container finished" podID="aa8f2889-4d7d-42a9-b0c8-18de92b7013b" containerID="948adba2663b1f764347d3193c0b98b45aa5dbc8b31268510ccdb9e3a7e3c65e" exitCode=0 Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.612848 4904 generic.go:334] "Generic (PLEG): container finished" podID="aa8f2889-4d7d-42a9-b0c8-18de92b7013b" containerID="e1f3722d9c798695dfe422872831353caa725acd8632c767f2dbc0d9acd96a2d" exitCode=143 Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.613834 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.627742 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aa8f2889-4d7d-42a9-b0c8-18de92b7013b","Type":"ContainerDied","Data":"948adba2663b1f764347d3193c0b98b45aa5dbc8b31268510ccdb9e3a7e3c65e"} Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.627810 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aa8f2889-4d7d-42a9-b0c8-18de92b7013b","Type":"ContainerDied","Data":"e1f3722d9c798695dfe422872831353caa725acd8632c767f2dbc0d9acd96a2d"} Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.627837 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aa8f2889-4d7d-42a9-b0c8-18de92b7013b","Type":"ContainerDied","Data":"b57a10c455dd0025b3c4535361f7acdb460c09fcf39a82b699c3177687519ce1"} Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.627866 4904 scope.go:117] "RemoveContainer" containerID="948adba2663b1f764347d3193c0b98b45aa5dbc8b31268510ccdb9e3a7e3c65e" Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.659462 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.664816 4904 scope.go:117] "RemoveContainer" containerID="e1f3722d9c798695dfe422872831353caa725acd8632c767f2dbc0d9acd96a2d" Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.671607 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.681752 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 05 20:35:04 crc kubenswrapper[4904]: 
E1205 20:35:04.682290 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa8f2889-4d7d-42a9-b0c8-18de92b7013b" containerName="nova-metadata-metadata" Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.682316 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa8f2889-4d7d-42a9-b0c8-18de92b7013b" containerName="nova-metadata-metadata" Dec 05 20:35:04 crc kubenswrapper[4904]: E1205 20:35:04.682362 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa8f2889-4d7d-42a9-b0c8-18de92b7013b" containerName="nova-metadata-log" Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.682373 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa8f2889-4d7d-42a9-b0c8-18de92b7013b" containerName="nova-metadata-log" Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.682725 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa8f2889-4d7d-42a9-b0c8-18de92b7013b" containerName="nova-metadata-metadata" Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.682763 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa8f2889-4d7d-42a9-b0c8-18de92b7013b" containerName="nova-metadata-log" Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.683981 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.689699 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.689979 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.692556 4904 scope.go:117] "RemoveContainer" containerID="948adba2663b1f764347d3193c0b98b45aa5dbc8b31268510ccdb9e3a7e3c65e" Dec 05 20:35:04 crc kubenswrapper[4904]: E1205 20:35:04.692999 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"948adba2663b1f764347d3193c0b98b45aa5dbc8b31268510ccdb9e3a7e3c65e\": container with ID starting with 948adba2663b1f764347d3193c0b98b45aa5dbc8b31268510ccdb9e3a7e3c65e not found: ID does not exist" containerID="948adba2663b1f764347d3193c0b98b45aa5dbc8b31268510ccdb9e3a7e3c65e" Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.693035 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"948adba2663b1f764347d3193c0b98b45aa5dbc8b31268510ccdb9e3a7e3c65e"} err="failed to get container status \"948adba2663b1f764347d3193c0b98b45aa5dbc8b31268510ccdb9e3a7e3c65e\": rpc error: code = NotFound desc = could not find container \"948adba2663b1f764347d3193c0b98b45aa5dbc8b31268510ccdb9e3a7e3c65e\": container with ID starting with 948adba2663b1f764347d3193c0b98b45aa5dbc8b31268510ccdb9e3a7e3c65e not found: ID does not exist" Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.693076 4904 scope.go:117] "RemoveContainer" containerID="e1f3722d9c798695dfe422872831353caa725acd8632c767f2dbc0d9acd96a2d" Dec 05 20:35:04 crc kubenswrapper[4904]: E1205 20:35:04.693544 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1f3722d9c798695dfe422872831353caa725acd8632c767f2dbc0d9acd96a2d\": container with ID starting with e1f3722d9c798695dfe422872831353caa725acd8632c767f2dbc0d9acd96a2d not found: ID does not exist" 
containerID="e1f3722d9c798695dfe422872831353caa725acd8632c767f2dbc0d9acd96a2d" Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.693575 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1f3722d9c798695dfe422872831353caa725acd8632c767f2dbc0d9acd96a2d"} err="failed to get container status \"e1f3722d9c798695dfe422872831353caa725acd8632c767f2dbc0d9acd96a2d\": rpc error: code = NotFound desc = could not find container \"e1f3722d9c798695dfe422872831353caa725acd8632c767f2dbc0d9acd96a2d\": container with ID starting with e1f3722d9c798695dfe422872831353caa725acd8632c767f2dbc0d9acd96a2d not found: ID does not exist" Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.693592 4904 scope.go:117] "RemoveContainer" containerID="948adba2663b1f764347d3193c0b98b45aa5dbc8b31268510ccdb9e3a7e3c65e" Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.693774 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"948adba2663b1f764347d3193c0b98b45aa5dbc8b31268510ccdb9e3a7e3c65e"} err="failed to get container status \"948adba2663b1f764347d3193c0b98b45aa5dbc8b31268510ccdb9e3a7e3c65e\": rpc error: code = NotFound desc = could not find container \"948adba2663b1f764347d3193c0b98b45aa5dbc8b31268510ccdb9e3a7e3c65e\": container with ID starting with 948adba2663b1f764347d3193c0b98b45aa5dbc8b31268510ccdb9e3a7e3c65e not found: ID does not exist" Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.693798 4904 scope.go:117] "RemoveContainer" containerID="e1f3722d9c798695dfe422872831353caa725acd8632c767f2dbc0d9acd96a2d" Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.694401 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1f3722d9c798695dfe422872831353caa725acd8632c767f2dbc0d9acd96a2d"} err="failed to get container status \"e1f3722d9c798695dfe422872831353caa725acd8632c767f2dbc0d9acd96a2d\": rpc error: code = NotFound desc = could not find container \"e1f3722d9c798695dfe422872831353caa725acd8632c767f2dbc0d9acd96a2d\": container with ID starting with e1f3722d9c798695dfe422872831353caa725acd8632c767f2dbc0d9acd96a2d not found: ID does not exist" Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.713968 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.833138 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e371d7d-9f4e-4666-90cc-629d73d8b606-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6e371d7d-9f4e-4666-90cc-629d73d8b606\") " pod="openstack/nova-metadata-0" Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.833231 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e371d7d-9f4e-4666-90cc-629d73d8b606-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6e371d7d-9f4e-4666-90cc-629d73d8b606\") " pod="openstack/nova-metadata-0" Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.833281 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e371d7d-9f4e-4666-90cc-629d73d8b606-logs\") pod \"nova-metadata-0\" (UID: \"6e371d7d-9f4e-4666-90cc-629d73d8b606\") " pod="openstack/nova-metadata-0" Dec 05 20:35:04 crc kubenswrapper[4904]: 
I1205 20:35:04.833316 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6bhj\" (UniqueName: \"kubernetes.io/projected/6e371d7d-9f4e-4666-90cc-629d73d8b606-kube-api-access-x6bhj\") pod \"nova-metadata-0\" (UID: \"6e371d7d-9f4e-4666-90cc-629d73d8b606\") " pod="openstack/nova-metadata-0" Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.833371 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e371d7d-9f4e-4666-90cc-629d73d8b606-config-data\") pod \"nova-metadata-0\" (UID: \"6e371d7d-9f4e-4666-90cc-629d73d8b606\") " pod="openstack/nova-metadata-0" Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.935302 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e371d7d-9f4e-4666-90cc-629d73d8b606-config-data\") pod \"nova-metadata-0\" (UID: \"6e371d7d-9f4e-4666-90cc-629d73d8b606\") " pod="openstack/nova-metadata-0" Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.935402 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e371d7d-9f4e-4666-90cc-629d73d8b606-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6e371d7d-9f4e-4666-90cc-629d73d8b606\") " pod="openstack/nova-metadata-0" Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.935477 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e371d7d-9f4e-4666-90cc-629d73d8b606-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6e371d7d-9f4e-4666-90cc-629d73d8b606\") " pod="openstack/nova-metadata-0" Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.935541 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e371d7d-9f4e-4666-90cc-629d73d8b606-logs\") pod \"nova-metadata-0\" (UID: \"6e371d7d-9f4e-4666-90cc-629d73d8b606\") " pod="openstack/nova-metadata-0" Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.935589 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6bhj\" (UniqueName: \"kubernetes.io/projected/6e371d7d-9f4e-4666-90cc-629d73d8b606-kube-api-access-x6bhj\") pod \"nova-metadata-0\" (UID: \"6e371d7d-9f4e-4666-90cc-629d73d8b606\") " pod="openstack/nova-metadata-0" Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.936197 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e371d7d-9f4e-4666-90cc-629d73d8b606-logs\") pod \"nova-metadata-0\" (UID: \"6e371d7d-9f4e-4666-90cc-629d73d8b606\") " pod="openstack/nova-metadata-0" Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.944497 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e371d7d-9f4e-4666-90cc-629d73d8b606-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6e371d7d-9f4e-4666-90cc-629d73d8b606\") " pod="openstack/nova-metadata-0" Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.944697 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e371d7d-9f4e-4666-90cc-629d73d8b606-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"6e371d7d-9f4e-4666-90cc-629d73d8b606\") " pod="openstack/nova-metadata-0" Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.956040 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e371d7d-9f4e-4666-90cc-629d73d8b606-config-data\") pod \"nova-metadata-0\" (UID: \"6e371d7d-9f4e-4666-90cc-629d73d8b606\") " pod="openstack/nova-metadata-0" Dec 05 20:35:04 crc kubenswrapper[4904]: I1205 20:35:04.958566 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6bhj\" (UniqueName: \"kubernetes.io/projected/6e371d7d-9f4e-4666-90cc-629d73d8b606-kube-api-access-x6bhj\") pod \"nova-metadata-0\" (UID: \"6e371d7d-9f4e-4666-90cc-629d73d8b606\") " pod="openstack/nova-metadata-0" Dec 05 20:35:05 crc kubenswrapper[4904]: I1205 20:35:05.007150 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 20:35:05 crc kubenswrapper[4904]: I1205 20:35:05.290395 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 20:35:05 crc kubenswrapper[4904]: W1205 20:35:05.298270 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e371d7d_9f4e_4666_90cc_629d73d8b606.slice/crio-1b03e645715f7646c8021eadfccee0d093ad6696c75f7457c8c02843b0e2f819 WatchSource:0}: Error finding container 1b03e645715f7646c8021eadfccee0d093ad6696c75f7457c8c02843b0e2f819: Status 404 returned error can't find the container with id 1b03e645715f7646c8021eadfccee0d093ad6696c75f7457c8c02843b0e2f819 Dec 05 20:35:05 crc kubenswrapper[4904]: I1205 20:35:05.642684 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63304fc4-9767-4efa-9002-47411daa12bd","Type":"ContainerStarted","Data":"c81738e2f532d2f5a2cce99108a71b291982e5029858bf10ee747ca334f4303a"} Dec 05 20:35:05 crc kubenswrapper[4904]: I1205 20:35:05.643751 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 20:35:05 crc kubenswrapper[4904]: I1205 20:35:05.645053 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6e371d7d-9f4e-4666-90cc-629d73d8b606","Type":"ContainerStarted","Data":"1b03e645715f7646c8021eadfccee0d093ad6696c75f7457c8c02843b0e2f819"} Dec 05 20:35:05 crc kubenswrapper[4904]: I1205 20:35:05.664813 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.18342347 podStartE2EDuration="9.664795583s" podCreationTimestamp="2025-12-05 20:34:56 +0000 UTC" firstStartedPulling="2025-12-05 20:34:58.293793201 +0000 UTC m=+1397.105009310" lastFinishedPulling="2025-12-05 20:35:04.775165314 +0000 UTC m=+1403.586381423" observedRunningTime="2025-12-05 20:35:05.663790249 +0000 UTC m=+1404.475006368" watchObservedRunningTime="2025-12-05 20:35:05.664795583 +0000 UTC m=+1404.476011692" Dec 05 20:35:05 crc kubenswrapper[4904]: I1205 20:35:05.699724 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa8f2889-4d7d-42a9-b0c8-18de92b7013b" path="/var/lib/kubelet/pods/aa8f2889-4d7d-42a9-b0c8-18de92b7013b/volumes" Dec 05 20:35:06 crc kubenswrapper[4904]: I1205 20:35:06.667209 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"6e371d7d-9f4e-4666-90cc-629d73d8b606","Type":"ContainerStarted","Data":"bab73146f19798332952795b613a6d25d2c2b670cda58b66afc65fe01cbb0669"} Dec 05 20:35:06 crc kubenswrapper[4904]: I1205 20:35:06.667601 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6e371d7d-9f4e-4666-90cc-629d73d8b606","Type":"ContainerStarted","Data":"9a1a0103e68ee89b787f7da0a8efb1e3c7cc8f21cc095802d075b152604c171b"} Dec 05 20:35:06 crc kubenswrapper[4904]: I1205 20:35:06.693323 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.693304199 podStartE2EDuration="2.693304199s" podCreationTimestamp="2025-12-05 20:35:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:35:06.690758069 +0000 UTC m=+1405.501974188" watchObservedRunningTime="2025-12-05 20:35:06.693304199 +0000 UTC m=+1405.504520308" Dec 05 20:35:07 crc kubenswrapper[4904]: I1205 20:35:07.155873 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 05 20:35:07 crc kubenswrapper[4904]: I1205 20:35:07.432100 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 05 20:35:07 crc kubenswrapper[4904]: I1205 20:35:07.432160 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 05 20:35:07 crc kubenswrapper[4904]: I1205 20:35:07.467450 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 05 20:35:07 crc kubenswrapper[4904]: I1205 20:35:07.612270 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 20:35:07 crc kubenswrapper[4904]: I1205 20:35:07.612323 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 20:35:07 crc kubenswrapper[4904]: I1205 20:35:07.672318 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58bd8c6ff9-v2kck" Dec 05 20:35:07 crc kubenswrapper[4904]: I1205 20:35:07.676134 4904 generic.go:334] "Generic (PLEG): container finished" podID="9179375b-3935-4856-ae04-76eda9c640e8" containerID="e340709f8fedd38a304878a78a010292189f0207c0d346ba8f16ce657c950188" exitCode=0 Dec 05 20:35:07 crc kubenswrapper[4904]: I1205 20:35:07.676219 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gksp8" event={"ID":"9179375b-3935-4856-ae04-76eda9c640e8","Type":"ContainerDied","Data":"e340709f8fedd38a304878a78a010292189f0207c0d346ba8f16ce657c950188"} Dec 05 20:35:07 crc kubenswrapper[4904]: I1205 20:35:07.719042 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 05 20:35:07 crc kubenswrapper[4904]: I1205 20:35:07.760111 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cdd789cc5-m8v2q"] Dec 05 20:35:07 crc kubenswrapper[4904]: I1205 20:35:07.760348 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5cdd789cc5-m8v2q" podUID="baa0cf87-8601-4807-8453-0809354b472e" containerName="dnsmasq-dns" containerID="cri-o://aa8165a73b967e8674568805fc2e398dd9ddc0cd390c0b7611b50b0b2b7d41b7" gracePeriod=10 Dec 05 20:35:08 crc kubenswrapper[4904]: I1205 20:35:08.315687 4904 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cdd789cc5-m8v2q" Dec 05 20:35:08 crc kubenswrapper[4904]: I1205 20:35:08.414508 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dns8\" (UniqueName: \"kubernetes.io/projected/baa0cf87-8601-4807-8453-0809354b472e-kube-api-access-5dns8\") pod \"baa0cf87-8601-4807-8453-0809354b472e\" (UID: \"baa0cf87-8601-4807-8453-0809354b472e\") " Dec 05 20:35:08 crc kubenswrapper[4904]: I1205 20:35:08.414578 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/baa0cf87-8601-4807-8453-0809354b472e-config\") pod \"baa0cf87-8601-4807-8453-0809354b472e\" (UID: \"baa0cf87-8601-4807-8453-0809354b472e\") " Dec 05 20:35:08 crc kubenswrapper[4904]: I1205 20:35:08.414621 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/baa0cf87-8601-4807-8453-0809354b472e-dns-swift-storage-0\") pod \"baa0cf87-8601-4807-8453-0809354b472e\" (UID: \"baa0cf87-8601-4807-8453-0809354b472e\") " Dec 05 20:35:08 crc kubenswrapper[4904]: I1205 20:35:08.414673 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/baa0cf87-8601-4807-8453-0809354b472e-ovsdbserver-nb\") pod \"baa0cf87-8601-4807-8453-0809354b472e\" (UID: \"baa0cf87-8601-4807-8453-0809354b472e\") " Dec 05 20:35:08 crc kubenswrapper[4904]: I1205 20:35:08.414789 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/baa0cf87-8601-4807-8453-0809354b472e-ovsdbserver-sb\") pod \"baa0cf87-8601-4807-8453-0809354b472e\" (UID: \"baa0cf87-8601-4807-8453-0809354b472e\") " Dec 05 20:35:08 crc kubenswrapper[4904]: I1205 20:35:08.414897 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/baa0cf87-8601-4807-8453-0809354b472e-dns-svc\") pod \"baa0cf87-8601-4807-8453-0809354b472e\" (UID: \"baa0cf87-8601-4807-8453-0809354b472e\") " Dec 05 20:35:08 crc kubenswrapper[4904]: I1205 20:35:08.447835 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/baa0cf87-8601-4807-8453-0809354b472e-kube-api-access-5dns8" (OuterVolumeSpecName: "kube-api-access-5dns8") pod "baa0cf87-8601-4807-8453-0809354b472e" (UID: "baa0cf87-8601-4807-8453-0809354b472e"). InnerVolumeSpecName "kube-api-access-5dns8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:35:08 crc kubenswrapper[4904]: I1205 20:35:08.473630 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/baa0cf87-8601-4807-8453-0809354b472e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "baa0cf87-8601-4807-8453-0809354b472e" (UID: "baa0cf87-8601-4807-8453-0809354b472e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:35:08 crc kubenswrapper[4904]: I1205 20:35:08.473727 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/baa0cf87-8601-4807-8453-0809354b472e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "baa0cf87-8601-4807-8453-0809354b472e" (UID: "baa0cf87-8601-4807-8453-0809354b472e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:35:08 crc kubenswrapper[4904]: I1205 20:35:08.478797 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/baa0cf87-8601-4807-8453-0809354b472e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "baa0cf87-8601-4807-8453-0809354b472e" (UID: "baa0cf87-8601-4807-8453-0809354b472e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:35:08 crc kubenswrapper[4904]: I1205 20:35:08.486586 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/baa0cf87-8601-4807-8453-0809354b472e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "baa0cf87-8601-4807-8453-0809354b472e" (UID: "baa0cf87-8601-4807-8453-0809354b472e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:35:08 crc kubenswrapper[4904]: I1205 20:35:08.511181 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/baa0cf87-8601-4807-8453-0809354b472e-config" (OuterVolumeSpecName: "config") pod "baa0cf87-8601-4807-8453-0809354b472e" (UID: "baa0cf87-8601-4807-8453-0809354b472e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:35:08 crc kubenswrapper[4904]: I1205 20:35:08.517008 4904 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/baa0cf87-8601-4807-8453-0809354b472e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:08 crc kubenswrapper[4904]: I1205 20:35:08.517040 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dns8\" (UniqueName: \"kubernetes.io/projected/baa0cf87-8601-4807-8453-0809354b472e-kube-api-access-5dns8\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:08 crc kubenswrapper[4904]: I1205 20:35:08.517154 4904 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/baa0cf87-8601-4807-8453-0809354b472e-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:08 crc kubenswrapper[4904]: I1205 20:35:08.517172 4904 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/baa0cf87-8601-4807-8453-0809354b472e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:08 crc kubenswrapper[4904]: I1205 20:35:08.517185 4904 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/baa0cf87-8601-4807-8453-0809354b472e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:08 crc kubenswrapper[4904]: I1205 20:35:08.517198 4904 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/baa0cf87-8601-4807-8453-0809354b472e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:08 crc kubenswrapper[4904]: I1205 20:35:08.692893 4904 generic.go:334] "Generic (PLEG): container finished" podID="baa0cf87-8601-4807-8453-0809354b472e" containerID="aa8165a73b967e8674568805fc2e398dd9ddc0cd390c0b7611b50b0b2b7d41b7" exitCode=0 Dec 05 20:35:08 crc kubenswrapper[4904]: I1205 20:35:08.692998 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cdd789cc5-m8v2q" event={"ID":"baa0cf87-8601-4807-8453-0809354b472e","Type":"ContainerDied","Data":"aa8165a73b967e8674568805fc2e398dd9ddc0cd390c0b7611b50b0b2b7d41b7"} Dec 05 
20:35:08 crc kubenswrapper[4904]: I1205 20:35:08.693047 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cdd789cc5-m8v2q" event={"ID":"baa0cf87-8601-4807-8453-0809354b472e","Type":"ContainerDied","Data":"a32e393240a4619023145d5ded3324ab74b02bab14aa78754acd6cd290fc7d9b"} Dec 05 20:35:08 crc kubenswrapper[4904]: I1205 20:35:08.693119 4904 scope.go:117] "RemoveContainer" containerID="aa8165a73b967e8674568805fc2e398dd9ddc0cd390c0b7611b50b0b2b7d41b7" Dec 05 20:35:08 crc kubenswrapper[4904]: I1205 20:35:08.693267 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cdd789cc5-m8v2q" Dec 05 20:35:08 crc kubenswrapper[4904]: I1205 20:35:08.694249 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3db3410b-30ab-44a8-9dc3-924b2c13dc77" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.208:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 20:35:08 crc kubenswrapper[4904]: I1205 20:35:08.694289 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3db3410b-30ab-44a8-9dc3-924b2c13dc77" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.208:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 20:35:08 crc kubenswrapper[4904]: I1205 20:35:08.720149 4904 scope.go:117] "RemoveContainer" containerID="f0900cc2be95ce3522e2d02bb399413d38cc119ba1ff1ae61acd36cd48e059d5" Dec 05 20:35:08 crc kubenswrapper[4904]: I1205 20:35:08.748278 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cdd789cc5-m8v2q"] Dec 05 20:35:08 crc kubenswrapper[4904]: I1205 20:35:08.758017 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5cdd789cc5-m8v2q"] Dec 05 20:35:08 crc kubenswrapper[4904]: I1205 20:35:08.760769 4904 scope.go:117] "RemoveContainer" containerID="aa8165a73b967e8674568805fc2e398dd9ddc0cd390c0b7611b50b0b2b7d41b7" Dec 05 20:35:08 crc kubenswrapper[4904]: E1205 20:35:08.761351 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa8165a73b967e8674568805fc2e398dd9ddc0cd390c0b7611b50b0b2b7d41b7\": container with ID starting with aa8165a73b967e8674568805fc2e398dd9ddc0cd390c0b7611b50b0b2b7d41b7 not found: ID does not exist" containerID="aa8165a73b967e8674568805fc2e398dd9ddc0cd390c0b7611b50b0b2b7d41b7" Dec 05 20:35:08 crc kubenswrapper[4904]: I1205 20:35:08.761389 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa8165a73b967e8674568805fc2e398dd9ddc0cd390c0b7611b50b0b2b7d41b7"} err="failed to get container status \"aa8165a73b967e8674568805fc2e398dd9ddc0cd390c0b7611b50b0b2b7d41b7\": rpc error: code = NotFound desc = could not find container \"aa8165a73b967e8674568805fc2e398dd9ddc0cd390c0b7611b50b0b2b7d41b7\": container with ID starting with aa8165a73b967e8674568805fc2e398dd9ddc0cd390c0b7611b50b0b2b7d41b7 not found: ID does not exist" Dec 05 20:35:08 crc kubenswrapper[4904]: I1205 20:35:08.761416 4904 scope.go:117] "RemoveContainer" containerID="f0900cc2be95ce3522e2d02bb399413d38cc119ba1ff1ae61acd36cd48e059d5" Dec 05 20:35:08 crc kubenswrapper[4904]: E1205 20:35:08.761875 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f0900cc2be95ce3522e2d02bb399413d38cc119ba1ff1ae61acd36cd48e059d5\": container with ID starting with f0900cc2be95ce3522e2d02bb399413d38cc119ba1ff1ae61acd36cd48e059d5 not found: ID does not exist" containerID="f0900cc2be95ce3522e2d02bb399413d38cc119ba1ff1ae61acd36cd48e059d5" Dec 05 20:35:08 crc kubenswrapper[4904]: I1205 20:35:08.761912 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0900cc2be95ce3522e2d02bb399413d38cc119ba1ff1ae61acd36cd48e059d5"} err="failed to get container status \"f0900cc2be95ce3522e2d02bb399413d38cc119ba1ff1ae61acd36cd48e059d5\": rpc error: code = NotFound desc = could not find container \"f0900cc2be95ce3522e2d02bb399413d38cc119ba1ff1ae61acd36cd48e059d5\": container with ID starting with f0900cc2be95ce3522e2d02bb399413d38cc119ba1ff1ae61acd36cd48e059d5 not found: ID does not exist" Dec 05 20:35:09 crc kubenswrapper[4904]: I1205 20:35:09.102186 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gksp8" Dec 05 20:35:09 crc kubenswrapper[4904]: I1205 20:35:09.245039 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9179375b-3935-4856-ae04-76eda9c640e8-scripts\") pod \"9179375b-3935-4856-ae04-76eda9c640e8\" (UID: \"9179375b-3935-4856-ae04-76eda9c640e8\") " Dec 05 20:35:09 crc kubenswrapper[4904]: I1205 20:35:09.245162 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9179375b-3935-4856-ae04-76eda9c640e8-config-data\") pod \"9179375b-3935-4856-ae04-76eda9c640e8\" (UID: \"9179375b-3935-4856-ae04-76eda9c640e8\") " Dec 05 20:35:09 crc kubenswrapper[4904]: I1205 20:35:09.245240 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7s745\" (UniqueName: \"kubernetes.io/projected/9179375b-3935-4856-ae04-76eda9c640e8-kube-api-access-7s745\") pod \"9179375b-3935-4856-ae04-76eda9c640e8\" (UID: \"9179375b-3935-4856-ae04-76eda9c640e8\") " Dec 05 20:35:09 crc kubenswrapper[4904]: I1205 20:35:09.245473 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9179375b-3935-4856-ae04-76eda9c640e8-combined-ca-bundle\") pod \"9179375b-3935-4856-ae04-76eda9c640e8\" (UID: \"9179375b-3935-4856-ae04-76eda9c640e8\") " Dec 05 20:35:09 crc kubenswrapper[4904]: I1205 20:35:09.257213 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9179375b-3935-4856-ae04-76eda9c640e8-scripts" (OuterVolumeSpecName: "scripts") pod "9179375b-3935-4856-ae04-76eda9c640e8" (UID: "9179375b-3935-4856-ae04-76eda9c640e8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:09 crc kubenswrapper[4904]: I1205 20:35:09.257308 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9179375b-3935-4856-ae04-76eda9c640e8-kube-api-access-7s745" (OuterVolumeSpecName: "kube-api-access-7s745") pod "9179375b-3935-4856-ae04-76eda9c640e8" (UID: "9179375b-3935-4856-ae04-76eda9c640e8"). InnerVolumeSpecName "kube-api-access-7s745". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:35:09 crc kubenswrapper[4904]: I1205 20:35:09.272246 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9179375b-3935-4856-ae04-76eda9c640e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9179375b-3935-4856-ae04-76eda9c640e8" (UID: "9179375b-3935-4856-ae04-76eda9c640e8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:09 crc kubenswrapper[4904]: I1205 20:35:09.272644 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9179375b-3935-4856-ae04-76eda9c640e8-config-data" (OuterVolumeSpecName: "config-data") pod "9179375b-3935-4856-ae04-76eda9c640e8" (UID: "9179375b-3935-4856-ae04-76eda9c640e8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:09 crc kubenswrapper[4904]: I1205 20:35:09.347818 4904 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9179375b-3935-4856-ae04-76eda9c640e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:09 crc kubenswrapper[4904]: I1205 20:35:09.347855 4904 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9179375b-3935-4856-ae04-76eda9c640e8-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:09 crc kubenswrapper[4904]: I1205 20:35:09.347867 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9179375b-3935-4856-ae04-76eda9c640e8-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:09 crc kubenswrapper[4904]: I1205 20:35:09.347878 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7s745\" (UniqueName: \"kubernetes.io/projected/9179375b-3935-4856-ae04-76eda9c640e8-kube-api-access-7s745\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:09 crc kubenswrapper[4904]: I1205 20:35:09.695863 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="baa0cf87-8601-4807-8453-0809354b472e" path="/var/lib/kubelet/pods/baa0cf87-8601-4807-8453-0809354b472e/volumes" Dec 05 20:35:09 crc kubenswrapper[4904]: I1205 20:35:09.734338 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gksp8" event={"ID":"9179375b-3935-4856-ae04-76eda9c640e8","Type":"ContainerDied","Data":"f96ad8acdbb6d72b5999dc98e5a7eb101af1130249d73df03138a491ba78b6b1"} Dec 05 20:35:09 crc kubenswrapper[4904]: I1205 20:35:09.734366 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gksp8" Dec 05 20:35:09 crc kubenswrapper[4904]: I1205 20:35:09.734373 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f96ad8acdbb6d72b5999dc98e5a7eb101af1130249d73df03138a491ba78b6b1" Dec 05 20:35:09 crc kubenswrapper[4904]: I1205 20:35:09.905971 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 20:35:09 crc kubenswrapper[4904]: I1205 20:35:09.906280 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="5296b4f6-0c56-45d2-9133-42352ca964fc" containerName="nova-scheduler-scheduler" containerID="cri-o://f0f9ae5c03aec2962370e8f71363d77b279f1c0ebd2e18773180fb5f37d9361d" gracePeriod=30 Dec 05 20:35:09 crc kubenswrapper[4904]: I1205 20:35:09.936594 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 20:35:09 crc kubenswrapper[4904]: I1205 20:35:09.936891 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3db3410b-30ab-44a8-9dc3-924b2c13dc77" containerName="nova-api-log" containerID="cri-o://82ab3f46917ee4f38f68ebf635b3155ed84f30b9595c38e3ef756e1ac61c0b37" gracePeriod=30 Dec 05 20:35:09 crc kubenswrapper[4904]: I1205 20:35:09.937453 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3db3410b-30ab-44a8-9dc3-924b2c13dc77" containerName="nova-api-api" containerID="cri-o://0bb1a2bd0e5331883b5085f14049875d0afb99092c05dca515f17959223e1564" gracePeriod=30 Dec 05 20:35:09 crc kubenswrapper[4904]: I1205 20:35:09.956426 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 20:35:09 crc kubenswrapper[4904]: I1205 20:35:09.956694 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6e371d7d-9f4e-4666-90cc-629d73d8b606" containerName="nova-metadata-log" containerID="cri-o://9a1a0103e68ee89b787f7da0a8efb1e3c7cc8f21cc095802d075b152604c171b" gracePeriod=30 Dec 05 20:35:09 crc kubenswrapper[4904]: I1205 20:35:09.956856 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6e371d7d-9f4e-4666-90cc-629d73d8b606" containerName="nova-metadata-metadata" containerID="cri-o://bab73146f19798332952795b613a6d25d2c2b670cda58b66afc65fe01cbb0669" gracePeriod=30 Dec 05 20:35:10 crc kubenswrapper[4904]: I1205 20:35:10.008636 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 20:35:10 crc kubenswrapper[4904]: I1205 20:35:10.008683 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 20:35:10 crc kubenswrapper[4904]: I1205 20:35:10.626961 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 20:35:10 crc kubenswrapper[4904]: I1205 20:35:10.756199 4904 generic.go:334] "Generic (PLEG): container finished" podID="6e371d7d-9f4e-4666-90cc-629d73d8b606" containerID="bab73146f19798332952795b613a6d25d2c2b670cda58b66afc65fe01cbb0669" exitCode=0 Dec 05 20:35:10 crc kubenswrapper[4904]: I1205 20:35:10.756229 4904 generic.go:334] "Generic (PLEG): container finished" podID="6e371d7d-9f4e-4666-90cc-629d73d8b606" containerID="9a1a0103e68ee89b787f7da0a8efb1e3c7cc8f21cc095802d075b152604c171b" exitCode=143 Dec 05 20:35:10 crc kubenswrapper[4904]: I1205 20:35:10.756267 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6e371d7d-9f4e-4666-90cc-629d73d8b606","Type":"ContainerDied","Data":"bab73146f19798332952795b613a6d25d2c2b670cda58b66afc65fe01cbb0669"} Dec 05 20:35:10 crc kubenswrapper[4904]: I1205 20:35:10.756293 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6e371d7d-9f4e-4666-90cc-629d73d8b606","Type":"ContainerDied","Data":"9a1a0103e68ee89b787f7da0a8efb1e3c7cc8f21cc095802d075b152604c171b"} Dec 05 20:35:10 crc kubenswrapper[4904]: I1205 20:35:10.756303 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6e371d7d-9f4e-4666-90cc-629d73d8b606","Type":"ContainerDied","Data":"1b03e645715f7646c8021eadfccee0d093ad6696c75f7457c8c02843b0e2f819"} Dec 05 20:35:10 crc kubenswrapper[4904]: I1205 20:35:10.756317 4904 scope.go:117] "RemoveContainer" containerID="bab73146f19798332952795b613a6d25d2c2b670cda58b66afc65fe01cbb0669" Dec 05 20:35:10 crc kubenswrapper[4904]: I1205 20:35:10.756437 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 20:35:10 crc kubenswrapper[4904]: I1205 20:35:10.760255 4904 generic.go:334] "Generic (PLEG): container finished" podID="3db3410b-30ab-44a8-9dc3-924b2c13dc77" containerID="82ab3f46917ee4f38f68ebf635b3155ed84f30b9595c38e3ef756e1ac61c0b37" exitCode=143 Dec 05 20:35:10 crc kubenswrapper[4904]: I1205 20:35:10.760336 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3db3410b-30ab-44a8-9dc3-924b2c13dc77","Type":"ContainerDied","Data":"82ab3f46917ee4f38f68ebf635b3155ed84f30b9595c38e3ef756e1ac61c0b37"} Dec 05 20:35:10 crc kubenswrapper[4904]: I1205 20:35:10.765575 4904 generic.go:334] "Generic (PLEG): container finished" podID="9eb22604-2972-49be-8650-7d10a049f6a1" containerID="a0846ba4670e5481c10e3461c932d9f5e3d5a6f8df5d8bc05a0058c74f0e1cc1" exitCode=0 Dec 05 20:35:10 crc kubenswrapper[4904]: I1205 20:35:10.765627 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-gtcmj" event={"ID":"9eb22604-2972-49be-8650-7d10a049f6a1","Type":"ContainerDied","Data":"a0846ba4670e5481c10e3461c932d9f5e3d5a6f8df5d8bc05a0058c74f0e1cc1"} Dec 05 20:35:10 crc kubenswrapper[4904]: I1205 20:35:10.773422 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e371d7d-9f4e-4666-90cc-629d73d8b606-combined-ca-bundle\") pod \"6e371d7d-9f4e-4666-90cc-629d73d8b606\" (UID: \"6e371d7d-9f4e-4666-90cc-629d73d8b606\") " Dec 05 20:35:10 crc kubenswrapper[4904]: I1205 20:35:10.773590 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6e371d7d-9f4e-4666-90cc-629d73d8b606-nova-metadata-tls-certs\") pod \"6e371d7d-9f4e-4666-90cc-629d73d8b606\" (UID: \"6e371d7d-9f4e-4666-90cc-629d73d8b606\") " Dec 05 20:35:10 crc kubenswrapper[4904]: I1205 20:35:10.773629 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e371d7d-9f4e-4666-90cc-629d73d8b606-logs\") pod \"6e371d7d-9f4e-4666-90cc-629d73d8b606\" (UID: \"6e371d7d-9f4e-4666-90cc-629d73d8b606\") " Dec 05 20:35:10 crc kubenswrapper[4904]: I1205 20:35:10.773665 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e371d7d-9f4e-4666-90cc-629d73d8b606-config-data\") pod \"6e371d7d-9f4e-4666-90cc-629d73d8b606\" (UID: \"6e371d7d-9f4e-4666-90cc-629d73d8b606\") " Dec 05 20:35:10 crc kubenswrapper[4904]: I1205 20:35:10.773741 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6bhj\" (UniqueName: \"kubernetes.io/projected/6e371d7d-9f4e-4666-90cc-629d73d8b606-kube-api-access-x6bhj\") pod \"6e371d7d-9f4e-4666-90cc-629d73d8b606\" (UID: \"6e371d7d-9f4e-4666-90cc-629d73d8b606\") " Dec 05 20:35:10 crc kubenswrapper[4904]: I1205 20:35:10.775233 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e371d7d-9f4e-4666-90cc-629d73d8b606-logs" (OuterVolumeSpecName: "logs") pod "6e371d7d-9f4e-4666-90cc-629d73d8b606" (UID: "6e371d7d-9f4e-4666-90cc-629d73d8b606"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:35:10 crc kubenswrapper[4904]: I1205 20:35:10.781509 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e371d7d-9f4e-4666-90cc-629d73d8b606-kube-api-access-x6bhj" (OuterVolumeSpecName: "kube-api-access-x6bhj") pod "6e371d7d-9f4e-4666-90cc-629d73d8b606" (UID: "6e371d7d-9f4e-4666-90cc-629d73d8b606"). InnerVolumeSpecName "kube-api-access-x6bhj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:35:10 crc kubenswrapper[4904]: I1205 20:35:10.789329 4904 scope.go:117] "RemoveContainer" containerID="9a1a0103e68ee89b787f7da0a8efb1e3c7cc8f21cc095802d075b152604c171b" Dec 05 20:35:10 crc kubenswrapper[4904]: I1205 20:35:10.810403 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e371d7d-9f4e-4666-90cc-629d73d8b606-config-data" (OuterVolumeSpecName: "config-data") pod "6e371d7d-9f4e-4666-90cc-629d73d8b606" (UID: "6e371d7d-9f4e-4666-90cc-629d73d8b606"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:10 crc kubenswrapper[4904]: I1205 20:35:10.814965 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e371d7d-9f4e-4666-90cc-629d73d8b606-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e371d7d-9f4e-4666-90cc-629d73d8b606" (UID: "6e371d7d-9f4e-4666-90cc-629d73d8b606"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:10 crc kubenswrapper[4904]: I1205 20:35:10.821156 4904 scope.go:117] "RemoveContainer" containerID="bab73146f19798332952795b613a6d25d2c2b670cda58b66afc65fe01cbb0669" Dec 05 20:35:10 crc kubenswrapper[4904]: E1205 20:35:10.821674 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bab73146f19798332952795b613a6d25d2c2b670cda58b66afc65fe01cbb0669\": container with ID starting with bab73146f19798332952795b613a6d25d2c2b670cda58b66afc65fe01cbb0669 not found: ID does not exist" containerID="bab73146f19798332952795b613a6d25d2c2b670cda58b66afc65fe01cbb0669" Dec 05 20:35:10 crc kubenswrapper[4904]: I1205 20:35:10.821724 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bab73146f19798332952795b613a6d25d2c2b670cda58b66afc65fe01cbb0669"} err="failed to get container status \"bab73146f19798332952795b613a6d25d2c2b670cda58b66afc65fe01cbb0669\": rpc error: code = NotFound desc = could not find container \"bab73146f19798332952795b613a6d25d2c2b670cda58b66afc65fe01cbb0669\": container with ID starting with bab73146f19798332952795b613a6d25d2c2b670cda58b66afc65fe01cbb0669 not found: ID does not exist" Dec 05 20:35:10 crc kubenswrapper[4904]: I1205 20:35:10.821755 4904 scope.go:117] "RemoveContainer" containerID="9a1a0103e68ee89b787f7da0a8efb1e3c7cc8f21cc095802d075b152604c171b" Dec 05 20:35:10 crc kubenswrapper[4904]: E1205 20:35:10.822360 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a1a0103e68ee89b787f7da0a8efb1e3c7cc8f21cc095802d075b152604c171b\": container with ID starting with 9a1a0103e68ee89b787f7da0a8efb1e3c7cc8f21cc095802d075b152604c171b not found: ID does not exist" containerID="9a1a0103e68ee89b787f7da0a8efb1e3c7cc8f21cc095802d075b152604c171b" Dec 05 20:35:10 crc kubenswrapper[4904]: I1205 20:35:10.822409 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a1a0103e68ee89b787f7da0a8efb1e3c7cc8f21cc095802d075b152604c171b"} err="failed to get container status \"9a1a0103e68ee89b787f7da0a8efb1e3c7cc8f21cc095802d075b152604c171b\": rpc error: code = NotFound desc = could not find container \"9a1a0103e68ee89b787f7da0a8efb1e3c7cc8f21cc095802d075b152604c171b\": container with ID starting with 9a1a0103e68ee89b787f7da0a8efb1e3c7cc8f21cc095802d075b152604c171b not found: ID does not exist" Dec 05 20:35:10 crc kubenswrapper[4904]: I1205 20:35:10.822437 4904 scope.go:117] "RemoveContainer" containerID="bab73146f19798332952795b613a6d25d2c2b670cda58b66afc65fe01cbb0669" Dec 05 20:35:10 crc kubenswrapper[4904]: I1205 20:35:10.822743 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bab73146f19798332952795b613a6d25d2c2b670cda58b66afc65fe01cbb0669"} err="failed to get container status \"bab73146f19798332952795b613a6d25d2c2b670cda58b66afc65fe01cbb0669\": rpc error: code = NotFound desc = could not find container \"bab73146f19798332952795b613a6d25d2c2b670cda58b66afc65fe01cbb0669\": container with ID starting with bab73146f19798332952795b613a6d25d2c2b670cda58b66afc65fe01cbb0669 not found: ID does not exist" Dec 05 20:35:10 crc kubenswrapper[4904]: I1205 20:35:10.822769 4904 scope.go:117] "RemoveContainer" containerID="9a1a0103e68ee89b787f7da0a8efb1e3c7cc8f21cc095802d075b152604c171b" Dec 05 20:35:10 crc kubenswrapper[4904]: I1205 20:35:10.823091 4904 
Dec 05 20:35:10 crc kubenswrapper[4904]: I1205 20:35:10.823091 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a1a0103e68ee89b787f7da0a8efb1e3c7cc8f21cc095802d075b152604c171b"} err="failed to get container status \"9a1a0103e68ee89b787f7da0a8efb1e3c7cc8f21cc095802d075b152604c171b\": rpc error: code = NotFound desc = could not find container \"9a1a0103e68ee89b787f7da0a8efb1e3c7cc8f21cc095802d075b152604c171b\": container with ID starting with 9a1a0103e68ee89b787f7da0a8efb1e3c7cc8f21cc095802d075b152604c171b not found: ID does not exist"
Dec 05 20:35:10 crc kubenswrapper[4904]: I1205 20:35:10.851427 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e371d7d-9f4e-4666-90cc-629d73d8b606-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "6e371d7d-9f4e-4666-90cc-629d73d8b606" (UID: "6e371d7d-9f4e-4666-90cc-629d73d8b606"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:35:10 crc kubenswrapper[4904]: I1205 20:35:10.876276 4904 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e371d7d-9f4e-4666-90cc-629d73d8b606-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 05 20:35:10 crc kubenswrapper[4904]: I1205 20:35:10.876307 4904 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e371d7d-9f4e-4666-90cc-629d73d8b606-logs\") on node \"crc\" DevicePath \"\""
Dec 05 20:35:10 crc kubenswrapper[4904]: I1205 20:35:10.876317 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e371d7d-9f4e-4666-90cc-629d73d8b606-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 20:35:10 crc kubenswrapper[4904]: I1205 20:35:10.876325 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6bhj\" (UniqueName: \"kubernetes.io/projected/6e371d7d-9f4e-4666-90cc-629d73d8b606-kube-api-access-x6bhj\") on node \"crc\" DevicePath \"\""
Dec 05 20:35:10 crc kubenswrapper[4904]: I1205 20:35:10.876333 4904 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e371d7d-9f4e-4666-90cc-629d73d8b606-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 20:35:11 crc kubenswrapper[4904]: I1205 20:35:11.113585 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Dec 05 20:35:11 crc kubenswrapper[4904]: I1205 20:35:11.133840 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Dec 05 20:35:11 crc kubenswrapper[4904]: I1205 20:35:11.147016 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Dec 05 20:35:11 crc kubenswrapper[4904]: E1205 20:35:11.147507 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baa0cf87-8601-4807-8453-0809354b472e" containerName="dnsmasq-dns"
Dec 05 20:35:11 crc kubenswrapper[4904]: I1205 20:35:11.147525 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="baa0cf87-8601-4807-8453-0809354b472e" containerName="dnsmasq-dns"
Dec 05 20:35:11 crc kubenswrapper[4904]: E1205 20:35:11.147543 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e371d7d-9f4e-4666-90cc-629d73d8b606" containerName="nova-metadata-metadata"
Dec 05 20:35:11 crc kubenswrapper[4904]: I1205 20:35:11.147551 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e371d7d-9f4e-4666-90cc-629d73d8b606" containerName="nova-metadata-metadata"
Dec 05 20:35:11 crc kubenswrapper[4904]: E1205 20:35:11.147566 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baa0cf87-8601-4807-8453-0809354b472e" containerName="init"
Dec 05 20:35:11 crc kubenswrapper[4904]: I1205 20:35:11.147574 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="baa0cf87-8601-4807-8453-0809354b472e" containerName="init"
Dec 05 20:35:11 crc kubenswrapper[4904]: E1205 20:35:11.147615 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e371d7d-9f4e-4666-90cc-629d73d8b606" containerName="nova-metadata-log"
Dec 05 20:35:11 crc kubenswrapper[4904]: I1205 20:35:11.147622 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e371d7d-9f4e-4666-90cc-629d73d8b606" containerName="nova-metadata-log"
Dec 05 20:35:11 crc kubenswrapper[4904]: E1205 20:35:11.147631 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9179375b-3935-4856-ae04-76eda9c640e8" containerName="nova-manage"
Dec 05 20:35:11 crc kubenswrapper[4904]: I1205 20:35:11.147639 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="9179375b-3935-4856-ae04-76eda9c640e8" containerName="nova-manage"
Dec 05 20:35:11 crc kubenswrapper[4904]: I1205 20:35:11.147853 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="9179375b-3935-4856-ae04-76eda9c640e8" containerName="nova-manage"
Dec 05 20:35:11 crc kubenswrapper[4904]: I1205 20:35:11.147874 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e371d7d-9f4e-4666-90cc-629d73d8b606" containerName="nova-metadata-log"
Dec 05 20:35:11 crc kubenswrapper[4904]: I1205 20:35:11.147886 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e371d7d-9f4e-4666-90cc-629d73d8b606" containerName="nova-metadata-metadata"
Dec 05 20:35:11 crc kubenswrapper[4904]: I1205 20:35:11.147904 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="baa0cf87-8601-4807-8453-0809354b472e" containerName="dnsmasq-dns"
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 20:35:11 crc kubenswrapper[4904]: I1205 20:35:11.151584 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 05 20:35:11 crc kubenswrapper[4904]: I1205 20:35:11.159852 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 20:35:11 crc kubenswrapper[4904]: I1205 20:35:11.165075 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 05 20:35:11 crc kubenswrapper[4904]: I1205 20:35:11.283364 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c381934-7d5f-429f-bf7c-294241e82ae3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6c381934-7d5f-429f-bf7c-294241e82ae3\") " pod="openstack/nova-metadata-0" Dec 05 20:35:11 crc kubenswrapper[4904]: I1205 20:35:11.283468 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c381934-7d5f-429f-bf7c-294241e82ae3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6c381934-7d5f-429f-bf7c-294241e82ae3\") " pod="openstack/nova-metadata-0" Dec 05 20:35:11 crc kubenswrapper[4904]: I1205 20:35:11.283536 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c381934-7d5f-429f-bf7c-294241e82ae3-config-data\") pod \"nova-metadata-0\" (UID: \"6c381934-7d5f-429f-bf7c-294241e82ae3\") " pod="openstack/nova-metadata-0" Dec 05 20:35:11 crc kubenswrapper[4904]: I1205 20:35:11.283591 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq96w\" (UniqueName: \"kubernetes.io/projected/6c381934-7d5f-429f-bf7c-294241e82ae3-kube-api-access-nq96w\") pod \"nova-metadata-0\" (UID: \"6c381934-7d5f-429f-bf7c-294241e82ae3\") " pod="openstack/nova-metadata-0" Dec 05 20:35:11 crc kubenswrapper[4904]: I1205 20:35:11.283610 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c381934-7d5f-429f-bf7c-294241e82ae3-logs\") pod \"nova-metadata-0\" (UID: \"6c381934-7d5f-429f-bf7c-294241e82ae3\") " pod="openstack/nova-metadata-0" Dec 05 20:35:11 crc kubenswrapper[4904]: I1205 20:35:11.385795 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq96w\" (UniqueName: \"kubernetes.io/projected/6c381934-7d5f-429f-bf7c-294241e82ae3-kube-api-access-nq96w\") pod \"nova-metadata-0\" (UID: \"6c381934-7d5f-429f-bf7c-294241e82ae3\") " pod="openstack/nova-metadata-0" Dec 05 20:35:11 crc kubenswrapper[4904]: I1205 20:35:11.385853 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c381934-7d5f-429f-bf7c-294241e82ae3-logs\") pod \"nova-metadata-0\" (UID: \"6c381934-7d5f-429f-bf7c-294241e82ae3\") " pod="openstack/nova-metadata-0" Dec 05 20:35:11 crc kubenswrapper[4904]: I1205 20:35:11.385973 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c381934-7d5f-429f-bf7c-294241e82ae3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6c381934-7d5f-429f-bf7c-294241e82ae3\") " 
pod="openstack/nova-metadata-0" Dec 05 20:35:11 crc kubenswrapper[4904]: I1205 20:35:11.386091 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c381934-7d5f-429f-bf7c-294241e82ae3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6c381934-7d5f-429f-bf7c-294241e82ae3\") " pod="openstack/nova-metadata-0" Dec 05 20:35:11 crc kubenswrapper[4904]: I1205 20:35:11.386170 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c381934-7d5f-429f-bf7c-294241e82ae3-config-data\") pod \"nova-metadata-0\" (UID: \"6c381934-7d5f-429f-bf7c-294241e82ae3\") " pod="openstack/nova-metadata-0" Dec 05 20:35:11 crc kubenswrapper[4904]: I1205 20:35:11.387510 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c381934-7d5f-429f-bf7c-294241e82ae3-logs\") pod \"nova-metadata-0\" (UID: \"6c381934-7d5f-429f-bf7c-294241e82ae3\") " pod="openstack/nova-metadata-0" Dec 05 20:35:11 crc kubenswrapper[4904]: I1205 20:35:11.391197 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c381934-7d5f-429f-bf7c-294241e82ae3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6c381934-7d5f-429f-bf7c-294241e82ae3\") " pod="openstack/nova-metadata-0" Dec 05 20:35:11 crc kubenswrapper[4904]: I1205 20:35:11.391425 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c381934-7d5f-429f-bf7c-294241e82ae3-config-data\") pod \"nova-metadata-0\" (UID: \"6c381934-7d5f-429f-bf7c-294241e82ae3\") " pod="openstack/nova-metadata-0" Dec 05 20:35:11 crc kubenswrapper[4904]: I1205 20:35:11.393566 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c381934-7d5f-429f-bf7c-294241e82ae3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6c381934-7d5f-429f-bf7c-294241e82ae3\") " pod="openstack/nova-metadata-0" Dec 05 20:35:11 crc kubenswrapper[4904]: I1205 20:35:11.411897 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq96w\" (UniqueName: \"kubernetes.io/projected/6c381934-7d5f-429f-bf7c-294241e82ae3-kube-api-access-nq96w\") pod \"nova-metadata-0\" (UID: \"6c381934-7d5f-429f-bf7c-294241e82ae3\") " pod="openstack/nova-metadata-0" Dec 05 20:35:11 crc kubenswrapper[4904]: I1205 20:35:11.487983 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 20:35:11 crc kubenswrapper[4904]: I1205 20:35:11.701266 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e371d7d-9f4e-4666-90cc-629d73d8b606" path="/var/lib/kubelet/pods/6e371d7d-9f4e-4666-90cc-629d73d8b606/volumes" Dec 05 20:35:11 crc kubenswrapper[4904]: I1205 20:35:11.778162 4904 generic.go:334] "Generic (PLEG): container finished" podID="5296b4f6-0c56-45d2-9133-42352ca964fc" containerID="f0f9ae5c03aec2962370e8f71363d77b279f1c0ebd2e18773180fb5f37d9361d" exitCode=0 Dec 05 20:35:11 crc kubenswrapper[4904]: I1205 20:35:11.778212 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5296b4f6-0c56-45d2-9133-42352ca964fc","Type":"ContainerDied","Data":"f0f9ae5c03aec2962370e8f71363d77b279f1c0ebd2e18773180fb5f37d9361d"} Dec 05 20:35:11 crc kubenswrapper[4904]: I1205 20:35:11.778238 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5296b4f6-0c56-45d2-9133-42352ca964fc","Type":"ContainerDied","Data":"c1eb04857839ff24f0df0222ddc0efe7deefb5c3376d87ec4127dcecd610ad6e"} Dec 05 20:35:11 crc kubenswrapper[4904]: I1205 20:35:11.778248 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1eb04857839ff24f0df0222ddc0efe7deefb5c3376d87ec4127dcecd610ad6e" Dec 05 20:35:11 crc kubenswrapper[4904]: I1205 20:35:11.778686 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 20:35:11 crc kubenswrapper[4904]: I1205 20:35:11.895694 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w77tp\" (UniqueName: \"kubernetes.io/projected/5296b4f6-0c56-45d2-9133-42352ca964fc-kube-api-access-w77tp\") pod \"5296b4f6-0c56-45d2-9133-42352ca964fc\" (UID: \"5296b4f6-0c56-45d2-9133-42352ca964fc\") " Dec 05 20:35:11 crc kubenswrapper[4904]: I1205 20:35:11.895771 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5296b4f6-0c56-45d2-9133-42352ca964fc-combined-ca-bundle\") pod \"5296b4f6-0c56-45d2-9133-42352ca964fc\" (UID: \"5296b4f6-0c56-45d2-9133-42352ca964fc\") " Dec 05 20:35:11 crc kubenswrapper[4904]: I1205 20:35:11.895819 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5296b4f6-0c56-45d2-9133-42352ca964fc-config-data\") pod \"5296b4f6-0c56-45d2-9133-42352ca964fc\" (UID: \"5296b4f6-0c56-45d2-9133-42352ca964fc\") " Dec 05 20:35:11 crc kubenswrapper[4904]: I1205 20:35:11.900835 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5296b4f6-0c56-45d2-9133-42352ca964fc-kube-api-access-w77tp" (OuterVolumeSpecName: "kube-api-access-w77tp") pod "5296b4f6-0c56-45d2-9133-42352ca964fc" (UID: "5296b4f6-0c56-45d2-9133-42352ca964fc"). InnerVolumeSpecName "kube-api-access-w77tp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:35:11 crc kubenswrapper[4904]: I1205 20:35:11.927246 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5296b4f6-0c56-45d2-9133-42352ca964fc-config-data" (OuterVolumeSpecName: "config-data") pod "5296b4f6-0c56-45d2-9133-42352ca964fc" (UID: "5296b4f6-0c56-45d2-9133-42352ca964fc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:11 crc kubenswrapper[4904]: I1205 20:35:11.945458 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5296b4f6-0c56-45d2-9133-42352ca964fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5296b4f6-0c56-45d2-9133-42352ca964fc" (UID: "5296b4f6-0c56-45d2-9133-42352ca964fc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:11 crc kubenswrapper[4904]: I1205 20:35:11.986576 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.006490 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w77tp\" (UniqueName: \"kubernetes.io/projected/5296b4f6-0c56-45d2-9133-42352ca964fc-kube-api-access-w77tp\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.006523 4904 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5296b4f6-0c56-45d2-9133-42352ca964fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.006534 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5296b4f6-0c56-45d2-9133-42352ca964fc-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.297457 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-gtcmj" Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.415417 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htrnb\" (UniqueName: \"kubernetes.io/projected/9eb22604-2972-49be-8650-7d10a049f6a1-kube-api-access-htrnb\") pod \"9eb22604-2972-49be-8650-7d10a049f6a1\" (UID: \"9eb22604-2972-49be-8650-7d10a049f6a1\") " Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.415711 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9eb22604-2972-49be-8650-7d10a049f6a1-config-data\") pod \"9eb22604-2972-49be-8650-7d10a049f6a1\" (UID: \"9eb22604-2972-49be-8650-7d10a049f6a1\") " Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.415915 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9eb22604-2972-49be-8650-7d10a049f6a1-scripts\") pod \"9eb22604-2972-49be-8650-7d10a049f6a1\" (UID: \"9eb22604-2972-49be-8650-7d10a049f6a1\") " Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.416030 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9eb22604-2972-49be-8650-7d10a049f6a1-combined-ca-bundle\") pod \"9eb22604-2972-49be-8650-7d10a049f6a1\" (UID: \"9eb22604-2972-49be-8650-7d10a049f6a1\") " Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.423687 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9eb22604-2972-49be-8650-7d10a049f6a1-kube-api-access-htrnb" (OuterVolumeSpecName: "kube-api-access-htrnb") pod "9eb22604-2972-49be-8650-7d10a049f6a1" (UID: "9eb22604-2972-49be-8650-7d10a049f6a1"). InnerVolumeSpecName "kube-api-access-htrnb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.423882 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9eb22604-2972-49be-8650-7d10a049f6a1-scripts" (OuterVolumeSpecName: "scripts") pod "9eb22604-2972-49be-8650-7d10a049f6a1" (UID: "9eb22604-2972-49be-8650-7d10a049f6a1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.450105 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9eb22604-2972-49be-8650-7d10a049f6a1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9eb22604-2972-49be-8650-7d10a049f6a1" (UID: "9eb22604-2972-49be-8650-7d10a049f6a1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.459262 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9eb22604-2972-49be-8650-7d10a049f6a1-config-data" (OuterVolumeSpecName: "config-data") pod "9eb22604-2972-49be-8650-7d10a049f6a1" (UID: "9eb22604-2972-49be-8650-7d10a049f6a1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.517747 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htrnb\" (UniqueName: \"kubernetes.io/projected/9eb22604-2972-49be-8650-7d10a049f6a1-kube-api-access-htrnb\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.518354 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9eb22604-2972-49be-8650-7d10a049f6a1-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.518471 4904 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9eb22604-2972-49be-8650-7d10a049f6a1-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.518555 4904 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9eb22604-2972-49be-8650-7d10a049f6a1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.532863 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.620735 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4w99m\" (UniqueName: \"kubernetes.io/projected/3db3410b-30ab-44a8-9dc3-924b2c13dc77-kube-api-access-4w99m\") pod \"3db3410b-30ab-44a8-9dc3-924b2c13dc77\" (UID: \"3db3410b-30ab-44a8-9dc3-924b2c13dc77\") " Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.621400 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3db3410b-30ab-44a8-9dc3-924b2c13dc77-config-data\") pod \"3db3410b-30ab-44a8-9dc3-924b2c13dc77\" (UID: \"3db3410b-30ab-44a8-9dc3-924b2c13dc77\") " Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.621436 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db3410b-30ab-44a8-9dc3-924b2c13dc77-combined-ca-bundle\") pod \"3db3410b-30ab-44a8-9dc3-924b2c13dc77\" (UID: \"3db3410b-30ab-44a8-9dc3-924b2c13dc77\") " Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.621476 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3db3410b-30ab-44a8-9dc3-924b2c13dc77-logs\") pod \"3db3410b-30ab-44a8-9dc3-924b2c13dc77\" (UID: \"3db3410b-30ab-44a8-9dc3-924b2c13dc77\") " Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.622367 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3db3410b-30ab-44a8-9dc3-924b2c13dc77-logs" (OuterVolumeSpecName: "logs") pod "3db3410b-30ab-44a8-9dc3-924b2c13dc77" (UID: "3db3410b-30ab-44a8-9dc3-924b2c13dc77"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.622663 4904 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3db3410b-30ab-44a8-9dc3-924b2c13dc77-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.623300 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3db3410b-30ab-44a8-9dc3-924b2c13dc77-kube-api-access-4w99m" (OuterVolumeSpecName: "kube-api-access-4w99m") pod "3db3410b-30ab-44a8-9dc3-924b2c13dc77" (UID: "3db3410b-30ab-44a8-9dc3-924b2c13dc77"). InnerVolumeSpecName "kube-api-access-4w99m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.649052 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db3410b-30ab-44a8-9dc3-924b2c13dc77-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3db3410b-30ab-44a8-9dc3-924b2c13dc77" (UID: "3db3410b-30ab-44a8-9dc3-924b2c13dc77"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.650731 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db3410b-30ab-44a8-9dc3-924b2c13dc77-config-data" (OuterVolumeSpecName: "config-data") pod "3db3410b-30ab-44a8-9dc3-924b2c13dc77" (UID: "3db3410b-30ab-44a8-9dc3-924b2c13dc77"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.725662 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3db3410b-30ab-44a8-9dc3-924b2c13dc77-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.725700 4904 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db3410b-30ab-44a8-9dc3-924b2c13dc77-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.725715 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4w99m\" (UniqueName: \"kubernetes.io/projected/3db3410b-30ab-44a8-9dc3-924b2c13dc77-kube-api-access-4w99m\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.802407 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-gtcmj" event={"ID":"9eb22604-2972-49be-8650-7d10a049f6a1","Type":"ContainerDied","Data":"84da7cf9a6d688f28269421af9fdc5161954e087f27412961db8919b5281ae1d"} Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.802456 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84da7cf9a6d688f28269421af9fdc5161954e087f27412961db8919b5281ae1d" Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.802528 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-gtcmj" Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.826114 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6c381934-7d5f-429f-bf7c-294241e82ae3","Type":"ContainerStarted","Data":"a7ee02175d8dd9b1ac2fd3b614fb0f3602487d1388e3a6b141f8a670b12429cd"} Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.826255 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6c381934-7d5f-429f-bf7c-294241e82ae3","Type":"ContainerStarted","Data":"dc7bbc8220797ffa8faf28635768dc41fddf0aac4d2a1edd6a35fa1247ad6296"} Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.826324 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6c381934-7d5f-429f-bf7c-294241e82ae3","Type":"ContainerStarted","Data":"0f88a2107e3ea8b328114c45cea6b873b83da9985fb2de07f1d507ea4cf38f89"} Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.830004 4904 generic.go:334] "Generic (PLEG): container finished" podID="3db3410b-30ab-44a8-9dc3-924b2c13dc77" containerID="0bb1a2bd0e5331883b5085f14049875d0afb99092c05dca515f17959223e1564" exitCode=0 Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.830123 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3db3410b-30ab-44a8-9dc3-924b2c13dc77","Type":"ContainerDied","Data":"0bb1a2bd0e5331883b5085f14049875d0afb99092c05dca515f17959223e1564"} Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.830169 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3db3410b-30ab-44a8-9dc3-924b2c13dc77","Type":"ContainerDied","Data":"bb96c601aebe32308a0726220834b4e6868db72213ae0fa72745d751fb86848d"} Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.830191 4904 scope.go:117] "RemoveContainer" containerID="0bb1a2bd0e5331883b5085f14049875d0afb99092c05dca515f17959223e1564" Dec 05 20:35:12 crc 
kubenswrapper[4904]: I1205 20:35:12.830098 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.830334 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.888803 4904 scope.go:117] "RemoveContainer" containerID="82ab3f46917ee4f38f68ebf635b3155ed84f30b9595c38e3ef756e1ac61c0b37" Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.898513 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.898494104 podStartE2EDuration="1.898494104s" podCreationTimestamp="2025-12-05 20:35:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:35:12.852955729 +0000 UTC m=+1411.664171848" watchObservedRunningTime="2025-12-05 20:35:12.898494104 +0000 UTC m=+1411.709710213" Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.929119 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 20:35:12 crc kubenswrapper[4904]: E1205 20:35:12.929646 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3db3410b-30ab-44a8-9dc3-924b2c13dc77" containerName="nova-api-log" Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.929667 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="3db3410b-30ab-44a8-9dc3-924b2c13dc77" containerName="nova-api-log" Dec 05 20:35:12 crc kubenswrapper[4904]: E1205 20:35:12.929694 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3db3410b-30ab-44a8-9dc3-924b2c13dc77" containerName="nova-api-api" Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.929703 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="3db3410b-30ab-44a8-9dc3-924b2c13dc77" containerName="nova-api-api" Dec 05 20:35:12 crc kubenswrapper[4904]: E1205 20:35:12.929743 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5296b4f6-0c56-45d2-9133-42352ca964fc" containerName="nova-scheduler-scheduler" Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.929753 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="5296b4f6-0c56-45d2-9133-42352ca964fc" containerName="nova-scheduler-scheduler" Dec 05 20:35:12 crc kubenswrapper[4904]: E1205 20:35:12.929772 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9eb22604-2972-49be-8650-7d10a049f6a1" containerName="nova-cell1-conductor-db-sync" Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.929780 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="9eb22604-2972-49be-8650-7d10a049f6a1" containerName="nova-cell1-conductor-db-sync" Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.930027 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="3db3410b-30ab-44a8-9dc3-924b2c13dc77" containerName="nova-api-log" Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.930051 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="9eb22604-2972-49be-8650-7d10a049f6a1" containerName="nova-cell1-conductor-db-sync" Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.930085 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="5296b4f6-0c56-45d2-9133-42352ca964fc" containerName="nova-scheduler-scheduler" Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.930118 4904 
memory_manager.go:354] "RemoveStaleState removing state" podUID="3db3410b-30ab-44a8-9dc3-924b2c13dc77" containerName="nova-api-api" Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.930986 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.933561 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.933895 4904 scope.go:117] "RemoveContainer" containerID="0bb1a2bd0e5331883b5085f14049875d0afb99092c05dca515f17959223e1564" Dec 05 20:35:12 crc kubenswrapper[4904]: E1205 20:35:12.936166 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bb1a2bd0e5331883b5085f14049875d0afb99092c05dca515f17959223e1564\": container with ID starting with 0bb1a2bd0e5331883b5085f14049875d0afb99092c05dca515f17959223e1564 not found: ID does not exist" containerID="0bb1a2bd0e5331883b5085f14049875d0afb99092c05dca515f17959223e1564" Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.936212 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bb1a2bd0e5331883b5085f14049875d0afb99092c05dca515f17959223e1564"} err="failed to get container status \"0bb1a2bd0e5331883b5085f14049875d0afb99092c05dca515f17959223e1564\": rpc error: code = NotFound desc = could not find container \"0bb1a2bd0e5331883b5085f14049875d0afb99092c05dca515f17959223e1564\": container with ID starting with 0bb1a2bd0e5331883b5085f14049875d0afb99092c05dca515f17959223e1564 not found: ID does not exist" Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.936238 4904 scope.go:117] "RemoveContainer" containerID="82ab3f46917ee4f38f68ebf635b3155ed84f30b9595c38e3ef756e1ac61c0b37" Dec 05 20:35:12 crc kubenswrapper[4904]: E1205 20:35:12.936585 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82ab3f46917ee4f38f68ebf635b3155ed84f30b9595c38e3ef756e1ac61c0b37\": container with ID starting with 82ab3f46917ee4f38f68ebf635b3155ed84f30b9595c38e3ef756e1ac61c0b37 not found: ID does not exist" containerID="82ab3f46917ee4f38f68ebf635b3155ed84f30b9595c38e3ef756e1ac61c0b37" Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.936599 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82ab3f46917ee4f38f68ebf635b3155ed84f30b9595c38e3ef756e1ac61c0b37"} err="failed to get container status \"82ab3f46917ee4f38f68ebf635b3155ed84f30b9595c38e3ef756e1ac61c0b37\": rpc error: code = NotFound desc = could not find container \"82ab3f46917ee4f38f68ebf635b3155ed84f30b9595c38e3ef756e1ac61c0b37\": container with ID starting with 82ab3f46917ee4f38f68ebf635b3155ed84f30b9595c38e3ef756e1ac61c0b37 not found: ID does not exist" Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.950763 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.962031 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.972657 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.985304 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-api-0"] Dec 05 20:35:12 crc kubenswrapper[4904]: I1205 20:35:12.993301 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 05 20:35:13 crc kubenswrapper[4904]: I1205 20:35:13.013527 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 20:35:13 crc kubenswrapper[4904]: I1205 20:35:13.016086 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 20:35:13 crc kubenswrapper[4904]: I1205 20:35:13.018994 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 05 20:35:13 crc kubenswrapper[4904]: I1205 20:35:13.031457 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 20:35:13 crc kubenswrapper[4904]: I1205 20:35:13.037891 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3f6c470-722e-4a05-a450-e65791498b79-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a3f6c470-722e-4a05-a450-e65791498b79\") " pod="openstack/nova-cell1-conductor-0" Dec 05 20:35:13 crc kubenswrapper[4904]: I1205 20:35:13.037961 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x264\" (UniqueName: \"kubernetes.io/projected/a3f6c470-722e-4a05-a450-e65791498b79-kube-api-access-7x264\") pod \"nova-cell1-conductor-0\" (UID: \"a3f6c470-722e-4a05-a450-e65791498b79\") " pod="openstack/nova-cell1-conductor-0" Dec 05 20:35:13 crc kubenswrapper[4904]: I1205 20:35:13.038034 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3f6c470-722e-4a05-a450-e65791498b79-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a3f6c470-722e-4a05-a450-e65791498b79\") " pod="openstack/nova-cell1-conductor-0" Dec 05 20:35:13 crc kubenswrapper[4904]: I1205 20:35:13.043552 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 05 20:35:13 crc kubenswrapper[4904]: I1205 20:35:13.045817 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 20:35:13 crc kubenswrapper[4904]: I1205 20:35:13.052817 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 05 20:35:13 crc kubenswrapper[4904]: I1205 20:35:13.055951 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 20:35:13 crc kubenswrapper[4904]: I1205 20:35:13.139583 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3f6c470-722e-4a05-a450-e65791498b79-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a3f6c470-722e-4a05-a450-e65791498b79\") " pod="openstack/nova-cell1-conductor-0" Dec 05 20:35:13 crc kubenswrapper[4904]: I1205 20:35:13.139630 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkrbz\" (UniqueName: \"kubernetes.io/projected/d7f7b746-1f4b-4cd0-83aa-501b0c187c0f-kube-api-access-jkrbz\") pod \"nova-api-0\" (UID: \"d7f7b746-1f4b-4cd0-83aa-501b0c187c0f\") " pod="openstack/nova-api-0" Dec 05 20:35:13 crc kubenswrapper[4904]: I1205 20:35:13.139657 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7f7b746-1f4b-4cd0-83aa-501b0c187c0f-config-data\") pod \"nova-api-0\" (UID: \"d7f7b746-1f4b-4cd0-83aa-501b0c187c0f\") " pod="openstack/nova-api-0" Dec 05 20:35:13 crc kubenswrapper[4904]: I1205 20:35:13.139678 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7x264\" (UniqueName: \"kubernetes.io/projected/a3f6c470-722e-4a05-a450-e65791498b79-kube-api-access-7x264\") pod \"nova-cell1-conductor-0\" (UID: \"a3f6c470-722e-4a05-a450-e65791498b79\") " pod="openstack/nova-cell1-conductor-0" Dec 05 20:35:13 crc kubenswrapper[4904]: I1205 20:35:13.139725 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7f7b746-1f4b-4cd0-83aa-501b0c187c0f-logs\") pod \"nova-api-0\" (UID: \"d7f7b746-1f4b-4cd0-83aa-501b0c187c0f\") " pod="openstack/nova-api-0" Dec 05 20:35:13 crc kubenswrapper[4904]: I1205 20:35:13.139762 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1af764d9-33ef-4e99-8757-db2cd6705952-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1af764d9-33ef-4e99-8757-db2cd6705952\") " pod="openstack/nova-scheduler-0" Dec 05 20:35:13 crc kubenswrapper[4904]: I1205 20:35:13.139821 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3f6c470-722e-4a05-a450-e65791498b79-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a3f6c470-722e-4a05-a450-e65791498b79\") " pod="openstack/nova-cell1-conductor-0" Dec 05 20:35:13 crc kubenswrapper[4904]: I1205 20:35:13.139837 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpkhb\" (UniqueName: \"kubernetes.io/projected/1af764d9-33ef-4e99-8757-db2cd6705952-kube-api-access-zpkhb\") pod \"nova-scheduler-0\" (UID: \"1af764d9-33ef-4e99-8757-db2cd6705952\") " pod="openstack/nova-scheduler-0" Dec 05 20:35:13 crc kubenswrapper[4904]: I1205 20:35:13.139861 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/1af764d9-33ef-4e99-8757-db2cd6705952-config-data\") pod \"nova-scheduler-0\" (UID: \"1af764d9-33ef-4e99-8757-db2cd6705952\") " pod="openstack/nova-scheduler-0" Dec 05 20:35:13 crc kubenswrapper[4904]: I1205 20:35:13.139904 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7f7b746-1f4b-4cd0-83aa-501b0c187c0f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d7f7b746-1f4b-4cd0-83aa-501b0c187c0f\") " pod="openstack/nova-api-0" Dec 05 20:35:13 crc kubenswrapper[4904]: I1205 20:35:13.144425 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3f6c470-722e-4a05-a450-e65791498b79-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a3f6c470-722e-4a05-a450-e65791498b79\") " pod="openstack/nova-cell1-conductor-0" Dec 05 20:35:13 crc kubenswrapper[4904]: I1205 20:35:13.144623 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3f6c470-722e-4a05-a450-e65791498b79-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a3f6c470-722e-4a05-a450-e65791498b79\") " pod="openstack/nova-cell1-conductor-0" Dec 05 20:35:13 crc kubenswrapper[4904]: I1205 20:35:13.158299 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x264\" (UniqueName: \"kubernetes.io/projected/a3f6c470-722e-4a05-a450-e65791498b79-kube-api-access-7x264\") pod \"nova-cell1-conductor-0\" (UID: \"a3f6c470-722e-4a05-a450-e65791498b79\") " pod="openstack/nova-cell1-conductor-0" Dec 05 20:35:13 crc kubenswrapper[4904]: I1205 20:35:13.241744 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkrbz\" (UniqueName: \"kubernetes.io/projected/d7f7b746-1f4b-4cd0-83aa-501b0c187c0f-kube-api-access-jkrbz\") pod \"nova-api-0\" (UID: \"d7f7b746-1f4b-4cd0-83aa-501b0c187c0f\") " pod="openstack/nova-api-0" Dec 05 20:35:13 crc kubenswrapper[4904]: I1205 20:35:13.242122 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7f7b746-1f4b-4cd0-83aa-501b0c187c0f-config-data\") pod \"nova-api-0\" (UID: \"d7f7b746-1f4b-4cd0-83aa-501b0c187c0f\") " pod="openstack/nova-api-0" Dec 05 20:35:13 crc kubenswrapper[4904]: I1205 20:35:13.242261 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7f7b746-1f4b-4cd0-83aa-501b0c187c0f-logs\") pod \"nova-api-0\" (UID: \"d7f7b746-1f4b-4cd0-83aa-501b0c187c0f\") " pod="openstack/nova-api-0" Dec 05 20:35:13 crc kubenswrapper[4904]: I1205 20:35:13.242394 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1af764d9-33ef-4e99-8757-db2cd6705952-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1af764d9-33ef-4e99-8757-db2cd6705952\") " pod="openstack/nova-scheduler-0" Dec 05 20:35:13 crc kubenswrapper[4904]: I1205 20:35:13.242503 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpkhb\" (UniqueName: \"kubernetes.io/projected/1af764d9-33ef-4e99-8757-db2cd6705952-kube-api-access-zpkhb\") pod \"nova-scheduler-0\" (UID: \"1af764d9-33ef-4e99-8757-db2cd6705952\") " pod="openstack/nova-scheduler-0" Dec 05 20:35:13 crc kubenswrapper[4904]: I1205 
20:35:13.242664 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1af764d9-33ef-4e99-8757-db2cd6705952-config-data\") pod \"nova-scheduler-0\" (UID: \"1af764d9-33ef-4e99-8757-db2cd6705952\") " pod="openstack/nova-scheduler-0" Dec 05 20:35:13 crc kubenswrapper[4904]: I1205 20:35:13.242937 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7f7b746-1f4b-4cd0-83aa-501b0c187c0f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d7f7b746-1f4b-4cd0-83aa-501b0c187c0f\") " pod="openstack/nova-api-0" Dec 05 20:35:13 crc kubenswrapper[4904]: I1205 20:35:13.243299 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7f7b746-1f4b-4cd0-83aa-501b0c187c0f-logs\") pod \"nova-api-0\" (UID: \"d7f7b746-1f4b-4cd0-83aa-501b0c187c0f\") " pod="openstack/nova-api-0" Dec 05 20:35:13 crc kubenswrapper[4904]: I1205 20:35:13.246341 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1af764d9-33ef-4e99-8757-db2cd6705952-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1af764d9-33ef-4e99-8757-db2cd6705952\") " pod="openstack/nova-scheduler-0" Dec 05 20:35:13 crc kubenswrapper[4904]: I1205 20:35:13.247094 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1af764d9-33ef-4e99-8757-db2cd6705952-config-data\") pod \"nova-scheduler-0\" (UID: \"1af764d9-33ef-4e99-8757-db2cd6705952\") " pod="openstack/nova-scheduler-0" Dec 05 20:35:13 crc kubenswrapper[4904]: I1205 20:35:13.247442 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7f7b746-1f4b-4cd0-83aa-501b0c187c0f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d7f7b746-1f4b-4cd0-83aa-501b0c187c0f\") " pod="openstack/nova-api-0" Dec 05 20:35:13 crc kubenswrapper[4904]: I1205 20:35:13.250022 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7f7b746-1f4b-4cd0-83aa-501b0c187c0f-config-data\") pod \"nova-api-0\" (UID: \"d7f7b746-1f4b-4cd0-83aa-501b0c187c0f\") " pod="openstack/nova-api-0" Dec 05 20:35:13 crc kubenswrapper[4904]: I1205 20:35:13.258322 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 05 20:35:13 crc kubenswrapper[4904]: I1205 20:35:13.259236 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpkhb\" (UniqueName: \"kubernetes.io/projected/1af764d9-33ef-4e99-8757-db2cd6705952-kube-api-access-zpkhb\") pod \"nova-scheduler-0\" (UID: \"1af764d9-33ef-4e99-8757-db2cd6705952\") " pod="openstack/nova-scheduler-0" Dec 05 20:35:13 crc kubenswrapper[4904]: I1205 20:35:13.261608 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkrbz\" (UniqueName: \"kubernetes.io/projected/d7f7b746-1f4b-4cd0-83aa-501b0c187c0f-kube-api-access-jkrbz\") pod \"nova-api-0\" (UID: \"d7f7b746-1f4b-4cd0-83aa-501b0c187c0f\") " pod="openstack/nova-api-0" Dec 05 20:35:13 crc kubenswrapper[4904]: I1205 20:35:13.337705 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 20:35:13 crc kubenswrapper[4904]: I1205 20:35:13.428316 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 20:35:13 crc kubenswrapper[4904]: I1205 20:35:13.761244 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3db3410b-30ab-44a8-9dc3-924b2c13dc77" path="/var/lib/kubelet/pods/3db3410b-30ab-44a8-9dc3-924b2c13dc77/volumes" Dec 05 20:35:13 crc kubenswrapper[4904]: I1205 20:35:13.761943 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5296b4f6-0c56-45d2-9133-42352ca964fc" path="/var/lib/kubelet/pods/5296b4f6-0c56-45d2-9133-42352ca964fc/volumes" Dec 05 20:35:13 crc kubenswrapper[4904]: I1205 20:35:13.762840 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 20:35:13 crc kubenswrapper[4904]: I1205 20:35:13.843450 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 20:35:13 crc kubenswrapper[4904]: I1205 20:35:13.858198 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 20:35:13 crc kubenswrapper[4904]: I1205 20:35:13.859058 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"a3f6c470-722e-4a05-a450-e65791498b79","Type":"ContainerStarted","Data":"4596dad018049df0c4209957d805156c52a618f86de46f35ab744c75b1c13a9e"} Dec 05 20:35:14 crc kubenswrapper[4904]: I1205 20:35:14.929205 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1af764d9-33ef-4e99-8757-db2cd6705952","Type":"ContainerStarted","Data":"852547a7522a9fa284e99346464609049e45d8e535361e86d81c7129a9031d93"} Dec 05 20:35:14 crc kubenswrapper[4904]: I1205 20:35:14.929806 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1af764d9-33ef-4e99-8757-db2cd6705952","Type":"ContainerStarted","Data":"8e8b8e4c7794ef5b786faf3306292b7ab6a98a63c4e79ba57713b98be20a1ce9"} Dec 05 20:35:14 crc kubenswrapper[4904]: I1205 20:35:14.945696 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"a3f6c470-722e-4a05-a450-e65791498b79","Type":"ContainerStarted","Data":"191045eb44d1ae839ece2cd0ee077472afd3133aa3692b213fb6d6ecd7082d1a"} Dec 05 20:35:14 crc kubenswrapper[4904]: I1205 20:35:14.947068 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 05 20:35:14 crc kubenswrapper[4904]: I1205 20:35:14.955271 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d7f7b746-1f4b-4cd0-83aa-501b0c187c0f","Type":"ContainerStarted","Data":"f8a82556a70c4cad4fdc4319693ce3a2556cbb7c04202d9a8851572ed5fde6d0"} Dec 05 20:35:14 crc kubenswrapper[4904]: I1205 20:35:14.955320 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d7f7b746-1f4b-4cd0-83aa-501b0c187c0f","Type":"ContainerStarted","Data":"26da1ace0721a44d16a24ce375ed0c83d57c566aaae84e2a53b0ec2d3785ea79"} Dec 05 20:35:14 crc kubenswrapper[4904]: I1205 20:35:14.955338 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d7f7b746-1f4b-4cd0-83aa-501b0c187c0f","Type":"ContainerStarted","Data":"404229fce7fd5d06999ed94bf8b4cbdf9af32534a8ddba544e275b512724a0ad"} Dec 05 20:35:14 crc kubenswrapper[4904]: I1205 20:35:14.973773 4904 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.973749723 podStartE2EDuration="2.973749723s" podCreationTimestamp="2025-12-05 20:35:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:35:14.949144168 +0000 UTC m=+1413.760360297" watchObservedRunningTime="2025-12-05 20:35:14.973749723 +0000 UTC m=+1413.784965832" Dec 05 20:35:14 crc kubenswrapper[4904]: I1205 20:35:14.978080 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.978050103 podStartE2EDuration="2.978050103s" podCreationTimestamp="2025-12-05 20:35:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:35:14.965345557 +0000 UTC m=+1413.776561686" watchObservedRunningTime="2025-12-05 20:35:14.978050103 +0000 UTC m=+1413.789266222" Dec 05 20:35:15 crc kubenswrapper[4904]: I1205 20:35:15.000392 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.000372345 podStartE2EDuration="3.000372345s" podCreationTimestamp="2025-12-05 20:35:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:35:14.985210601 +0000 UTC m=+1413.796426760" watchObservedRunningTime="2025-12-05 20:35:15.000372345 +0000 UTC m=+1413.811588464" Dec 05 20:35:16 crc kubenswrapper[4904]: I1205 20:35:16.488935 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 20:35:16 crc kubenswrapper[4904]: I1205 20:35:16.491275 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 20:35:18 crc kubenswrapper[4904]: I1205 20:35:18.291361 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 05 20:35:18 crc kubenswrapper[4904]: I1205 20:35:18.339482 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 05 20:35:21 crc kubenswrapper[4904]: I1205 20:35:21.488796 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 20:35:21 crc kubenswrapper[4904]: I1205 20:35:21.489088 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 20:35:22 crc kubenswrapper[4904]: I1205 20:35:22.500583 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6c381934-7d5f-429f-bf7c-294241e82ae3" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.212:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 20:35:22 crc kubenswrapper[4904]: I1205 20:35:22.500582 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6c381934-7d5f-429f-bf7c-294241e82ae3" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.212:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 20:35:23 crc kubenswrapper[4904]: I1205 20:35:23.339461 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 05 
20:35:23 crc kubenswrapper[4904]: I1205 20:35:23.368239 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 05 20:35:23 crc kubenswrapper[4904]: I1205 20:35:23.429549 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 20:35:23 crc kubenswrapper[4904]: I1205 20:35:23.431108 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 20:35:24 crc kubenswrapper[4904]: I1205 20:35:24.130439 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 05 20:35:24 crc kubenswrapper[4904]: I1205 20:35:24.511440 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d7f7b746-1f4b-4cd0-83aa-501b0c187c0f" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.215:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 20:35:24 crc kubenswrapper[4904]: I1205 20:35:24.511542 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d7f7b746-1f4b-4cd0-83aa-501b0c187c0f" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.215:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 20:35:27 crc kubenswrapper[4904]: I1205 20:35:27.399669 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 05 20:35:29 crc kubenswrapper[4904]: I1205 20:35:29.956309 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:35:29 crc kubenswrapper[4904]: I1205 20:35:29.956383 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:35:31 crc kubenswrapper[4904]: I1205 20:35:31.075310 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 20:35:31 crc kubenswrapper[4904]: I1205 20:35:31.075827 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="1893f1e0-90ea-4bd5-9275-c0266042485d" containerName="kube-state-metrics" containerID="cri-o://25fcb1cef4e3e78341234772b01e9d5502266566c89583dc061afe3493a4eabf" gracePeriod=30 Dec 05 20:35:31 crc kubenswrapper[4904]: I1205 20:35:31.493330 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 05 20:35:31 crc kubenswrapper[4904]: I1205 20:35:31.501931 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 05 20:35:31 crc kubenswrapper[4904]: I1205 20:35:31.502043 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 05 20:35:31 crc kubenswrapper[4904]: I1205 20:35:31.700637 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 20:35:31 crc kubenswrapper[4904]: I1205 20:35:31.881793 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bq9d\" (UniqueName: \"kubernetes.io/projected/1893f1e0-90ea-4bd5-9275-c0266042485d-kube-api-access-8bq9d\") pod \"1893f1e0-90ea-4bd5-9275-c0266042485d\" (UID: \"1893f1e0-90ea-4bd5-9275-c0266042485d\") " Dec 05 20:35:31 crc kubenswrapper[4904]: I1205 20:35:31.895858 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1893f1e0-90ea-4bd5-9275-c0266042485d-kube-api-access-8bq9d" (OuterVolumeSpecName: "kube-api-access-8bq9d") pod "1893f1e0-90ea-4bd5-9275-c0266042485d" (UID: "1893f1e0-90ea-4bd5-9275-c0266042485d"). InnerVolumeSpecName "kube-api-access-8bq9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:35:31 crc kubenswrapper[4904]: I1205 20:35:31.984490 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bq9d\" (UniqueName: \"kubernetes.io/projected/1893f1e0-90ea-4bd5-9275-c0266042485d-kube-api-access-8bq9d\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:32 crc kubenswrapper[4904]: I1205 20:35:32.173475 4904 generic.go:334] "Generic (PLEG): container finished" podID="1893f1e0-90ea-4bd5-9275-c0266042485d" containerID="25fcb1cef4e3e78341234772b01e9d5502266566c89583dc061afe3493a4eabf" exitCode=2 Dec 05 20:35:32 crc kubenswrapper[4904]: I1205 20:35:32.173523 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 20:35:32 crc kubenswrapper[4904]: I1205 20:35:32.173532 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1893f1e0-90ea-4bd5-9275-c0266042485d","Type":"ContainerDied","Data":"25fcb1cef4e3e78341234772b01e9d5502266566c89583dc061afe3493a4eabf"} Dec 05 20:35:32 crc kubenswrapper[4904]: I1205 20:35:32.173580 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1893f1e0-90ea-4bd5-9275-c0266042485d","Type":"ContainerDied","Data":"816934fa82933f2e9a09a6f7f889cb675265239e0364d97cb696e05bf7de0c8d"} Dec 05 20:35:32 crc kubenswrapper[4904]: I1205 20:35:32.173598 4904 scope.go:117] "RemoveContainer" containerID="25fcb1cef4e3e78341234772b01e9d5502266566c89583dc061afe3493a4eabf" Dec 05 20:35:32 crc kubenswrapper[4904]: I1205 20:35:32.182576 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 05 20:35:32 crc kubenswrapper[4904]: I1205 20:35:32.196717 4904 scope.go:117] "RemoveContainer" containerID="25fcb1cef4e3e78341234772b01e9d5502266566c89583dc061afe3493a4eabf" Dec 05 20:35:32 crc kubenswrapper[4904]: E1205 20:35:32.197319 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25fcb1cef4e3e78341234772b01e9d5502266566c89583dc061afe3493a4eabf\": container with ID starting with 25fcb1cef4e3e78341234772b01e9d5502266566c89583dc061afe3493a4eabf not found: ID does not exist" containerID="25fcb1cef4e3e78341234772b01e9d5502266566c89583dc061afe3493a4eabf" Dec 05 20:35:32 crc kubenswrapper[4904]: I1205 20:35:32.197367 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25fcb1cef4e3e78341234772b01e9d5502266566c89583dc061afe3493a4eabf"} err="failed to get container status \"25fcb1cef4e3e78341234772b01e9d5502266566c89583dc061afe3493a4eabf\": rpc 
error: code = NotFound desc = could not find container \"25fcb1cef4e3e78341234772b01e9d5502266566c89583dc061afe3493a4eabf\": container with ID starting with 25fcb1cef4e3e78341234772b01e9d5502266566c89583dc061afe3493a4eabf not found: ID does not exist" Dec 05 20:35:32 crc kubenswrapper[4904]: I1205 20:35:32.222222 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 20:35:32 crc kubenswrapper[4904]: I1205 20:35:32.232443 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 20:35:32 crc kubenswrapper[4904]: I1205 20:35:32.254568 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 20:35:32 crc kubenswrapper[4904]: E1205 20:35:32.255089 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1893f1e0-90ea-4bd5-9275-c0266042485d" containerName="kube-state-metrics" Dec 05 20:35:32 crc kubenswrapper[4904]: I1205 20:35:32.255108 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="1893f1e0-90ea-4bd5-9275-c0266042485d" containerName="kube-state-metrics" Dec 05 20:35:32 crc kubenswrapper[4904]: I1205 20:35:32.255312 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="1893f1e0-90ea-4bd5-9275-c0266042485d" containerName="kube-state-metrics" Dec 05 20:35:32 crc kubenswrapper[4904]: I1205 20:35:32.255992 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 20:35:32 crc kubenswrapper[4904]: I1205 20:35:32.258503 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 05 20:35:32 crc kubenswrapper[4904]: I1205 20:35:32.259802 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 05 20:35:32 crc kubenswrapper[4904]: I1205 20:35:32.280915 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 20:35:32 crc kubenswrapper[4904]: I1205 20:35:32.391487 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd4e564c-e066-405d-92e5-1d312bfd1f57-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"cd4e564c-e066-405d-92e5-1d312bfd1f57\") " pod="openstack/kube-state-metrics-0" Dec 05 20:35:32 crc kubenswrapper[4904]: I1205 20:35:32.391545 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd4e564c-e066-405d-92e5-1d312bfd1f57-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"cd4e564c-e066-405d-92e5-1d312bfd1f57\") " pod="openstack/kube-state-metrics-0" Dec 05 20:35:32 crc kubenswrapper[4904]: I1205 20:35:32.391586 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbrfq\" (UniqueName: \"kubernetes.io/projected/cd4e564c-e066-405d-92e5-1d312bfd1f57-kube-api-access-cbrfq\") pod \"kube-state-metrics-0\" (UID: \"cd4e564c-e066-405d-92e5-1d312bfd1f57\") " pod="openstack/kube-state-metrics-0" Dec 05 20:35:32 crc kubenswrapper[4904]: I1205 20:35:32.391626 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/cd4e564c-e066-405d-92e5-1d312bfd1f57-kube-state-metrics-tls-config\") pod 
\"kube-state-metrics-0\" (UID: \"cd4e564c-e066-405d-92e5-1d312bfd1f57\") " pod="openstack/kube-state-metrics-0" Dec 05 20:35:32 crc kubenswrapper[4904]: I1205 20:35:32.492778 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/cd4e564c-e066-405d-92e5-1d312bfd1f57-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"cd4e564c-e066-405d-92e5-1d312bfd1f57\") " pod="openstack/kube-state-metrics-0" Dec 05 20:35:32 crc kubenswrapper[4904]: I1205 20:35:32.492934 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd4e564c-e066-405d-92e5-1d312bfd1f57-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"cd4e564c-e066-405d-92e5-1d312bfd1f57\") " pod="openstack/kube-state-metrics-0" Dec 05 20:35:32 crc kubenswrapper[4904]: I1205 20:35:32.492988 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd4e564c-e066-405d-92e5-1d312bfd1f57-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"cd4e564c-e066-405d-92e5-1d312bfd1f57\") " pod="openstack/kube-state-metrics-0" Dec 05 20:35:32 crc kubenswrapper[4904]: I1205 20:35:32.493037 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbrfq\" (UniqueName: \"kubernetes.io/projected/cd4e564c-e066-405d-92e5-1d312bfd1f57-kube-api-access-cbrfq\") pod \"kube-state-metrics-0\" (UID: \"cd4e564c-e066-405d-92e5-1d312bfd1f57\") " pod="openstack/kube-state-metrics-0" Dec 05 20:35:32 crc kubenswrapper[4904]: I1205 20:35:32.498176 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd4e564c-e066-405d-92e5-1d312bfd1f57-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"cd4e564c-e066-405d-92e5-1d312bfd1f57\") " pod="openstack/kube-state-metrics-0" Dec 05 20:35:32 crc kubenswrapper[4904]: I1205 20:35:32.500229 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd4e564c-e066-405d-92e5-1d312bfd1f57-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"cd4e564c-e066-405d-92e5-1d312bfd1f57\") " pod="openstack/kube-state-metrics-0" Dec 05 20:35:32 crc kubenswrapper[4904]: I1205 20:35:32.508741 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/cd4e564c-e066-405d-92e5-1d312bfd1f57-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"cd4e564c-e066-405d-92e5-1d312bfd1f57\") " pod="openstack/kube-state-metrics-0" Dec 05 20:35:32 crc kubenswrapper[4904]: I1205 20:35:32.513469 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbrfq\" (UniqueName: \"kubernetes.io/projected/cd4e564c-e066-405d-92e5-1d312bfd1f57-kube-api-access-cbrfq\") pod \"kube-state-metrics-0\" (UID: \"cd4e564c-e066-405d-92e5-1d312bfd1f57\") " pod="openstack/kube-state-metrics-0" Dec 05 20:35:32 crc kubenswrapper[4904]: I1205 20:35:32.578736 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 20:35:33 crc kubenswrapper[4904]: I1205 20:35:33.017258 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 20:35:33 crc kubenswrapper[4904]: W1205 20:35:33.021322 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd4e564c_e066_405d_92e5_1d312bfd1f57.slice/crio-92e23630e293af475a942acd03196fe58fbb20c23374bf61e4428cd609378aa6 WatchSource:0}: Error finding container 92e23630e293af475a942acd03196fe58fbb20c23374bf61e4428cd609378aa6: Status 404 returned error can't find the container with id 92e23630e293af475a942acd03196fe58fbb20c23374bf61e4428cd609378aa6 Dec 05 20:35:33 crc kubenswrapper[4904]: I1205 20:35:33.043241 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:35:33 crc kubenswrapper[4904]: I1205 20:35:33.043483 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="63304fc4-9767-4efa-9002-47411daa12bd" containerName="ceilometer-central-agent" containerID="cri-o://f96a3e12ac921544aa396c27abc113e3e88a8d7eccf35601b6d15a751b7ef25e" gracePeriod=30 Dec 05 20:35:33 crc kubenswrapper[4904]: I1205 20:35:33.043653 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="63304fc4-9767-4efa-9002-47411daa12bd" containerName="sg-core" containerID="cri-o://8406e73ff67ca3784af13fccc2f488ac9cbc5f0fcb50a9dcb2dc0c90545a7a9e" gracePeriod=30 Dec 05 20:35:33 crc kubenswrapper[4904]: I1205 20:35:33.043748 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="63304fc4-9767-4efa-9002-47411daa12bd" containerName="proxy-httpd" containerID="cri-o://c81738e2f532d2f5a2cce99108a71b291982e5029858bf10ee747ca334f4303a" gracePeriod=30 Dec 05 20:35:33 crc kubenswrapper[4904]: I1205 20:35:33.043826 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="63304fc4-9767-4efa-9002-47411daa12bd" containerName="ceilometer-notification-agent" containerID="cri-o://631d4e7aeb6d777251fa6228610a1bfc779dce8db0b8a71d7b9a455ef5aad145" gracePeriod=30 Dec 05 20:35:33 crc kubenswrapper[4904]: I1205 20:35:33.188564 4904 generic.go:334] "Generic (PLEG): container finished" podID="63304fc4-9767-4efa-9002-47411daa12bd" containerID="c81738e2f532d2f5a2cce99108a71b291982e5029858bf10ee747ca334f4303a" exitCode=0 Dec 05 20:35:33 crc kubenswrapper[4904]: I1205 20:35:33.188842 4904 generic.go:334] "Generic (PLEG): container finished" podID="63304fc4-9767-4efa-9002-47411daa12bd" containerID="8406e73ff67ca3784af13fccc2f488ac9cbc5f0fcb50a9dcb2dc0c90545a7a9e" exitCode=2 Dec 05 20:35:33 crc kubenswrapper[4904]: I1205 20:35:33.188641 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63304fc4-9767-4efa-9002-47411daa12bd","Type":"ContainerDied","Data":"c81738e2f532d2f5a2cce99108a71b291982e5029858bf10ee747ca334f4303a"} Dec 05 20:35:33 crc kubenswrapper[4904]: I1205 20:35:33.188919 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63304fc4-9767-4efa-9002-47411daa12bd","Type":"ContainerDied","Data":"8406e73ff67ca3784af13fccc2f488ac9cbc5f0fcb50a9dcb2dc0c90545a7a9e"} Dec 05 20:35:33 crc kubenswrapper[4904]: I1205 20:35:33.190458 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/kube-state-metrics-0" event={"ID":"cd4e564c-e066-405d-92e5-1d312bfd1f57","Type":"ContainerStarted","Data":"92e23630e293af475a942acd03196fe58fbb20c23374bf61e4428cd609378aa6"} Dec 05 20:35:33 crc kubenswrapper[4904]: I1205 20:35:33.437026 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 05 20:35:33 crc kubenswrapper[4904]: I1205 20:35:33.437544 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 05 20:35:33 crc kubenswrapper[4904]: I1205 20:35:33.443879 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 05 20:35:33 crc kubenswrapper[4904]: I1205 20:35:33.444009 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 05 20:35:33 crc kubenswrapper[4904]: I1205 20:35:33.694301 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1893f1e0-90ea-4bd5-9275-c0266042485d" path="/var/lib/kubelet/pods/1893f1e0-90ea-4bd5-9275-c0266042485d/volumes" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.078920 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.210238 4904 generic.go:334] "Generic (PLEG): container finished" podID="63304fc4-9767-4efa-9002-47411daa12bd" containerID="f96a3e12ac921544aa396c27abc113e3e88a8d7eccf35601b6d15a751b7ef25e" exitCode=0 Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.210307 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63304fc4-9767-4efa-9002-47411daa12bd","Type":"ContainerDied","Data":"f96a3e12ac921544aa396c27abc113e3e88a8d7eccf35601b6d15a751b7ef25e"} Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.211915 4904 generic.go:334] "Generic (PLEG): container finished" podID="ea849d11-eeaa-47ec-8110-b796daf77157" containerID="a8a39402970cedec8b2f9e2e0cb4c89d6905421e8c40a99e7e7b5caf851c784d" exitCode=137 Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.211958 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.211980 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ea849d11-eeaa-47ec-8110-b796daf77157","Type":"ContainerDied","Data":"a8a39402970cedec8b2f9e2e0cb4c89d6905421e8c40a99e7e7b5caf851c784d"} Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.212008 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ea849d11-eeaa-47ec-8110-b796daf77157","Type":"ContainerDied","Data":"a5444140aba7d83cb4d0e96ccdb9cf9e21f7678894c5e8e68feea76e2b07e427"} Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.212024 4904 scope.go:117] "RemoveContainer" containerID="a8a39402970cedec8b2f9e2e0cb4c89d6905421e8c40a99e7e7b5caf851c784d" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.217212 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cd4e564c-e066-405d-92e5-1d312bfd1f57","Type":"ContainerStarted","Data":"749129f7e6a3d8ac4dc76cff8afab0d5104bcc35dc1b2951ccb0dfcc11a0ca1f"} Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.217252 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.218179 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.224848 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea849d11-eeaa-47ec-8110-b796daf77157-config-data\") pod \"ea849d11-eeaa-47ec-8110-b796daf77157\" (UID: \"ea849d11-eeaa-47ec-8110-b796daf77157\") " Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.225099 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea849d11-eeaa-47ec-8110-b796daf77157-combined-ca-bundle\") pod \"ea849d11-eeaa-47ec-8110-b796daf77157\" (UID: \"ea849d11-eeaa-47ec-8110-b796daf77157\") " Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.225143 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbrbn\" (UniqueName: \"kubernetes.io/projected/ea849d11-eeaa-47ec-8110-b796daf77157-kube-api-access-xbrbn\") pod \"ea849d11-eeaa-47ec-8110-b796daf77157\" (UID: \"ea849d11-eeaa-47ec-8110-b796daf77157\") " Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.225242 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.230447 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea849d11-eeaa-47ec-8110-b796daf77157-kube-api-access-xbrbn" (OuterVolumeSpecName: "kube-api-access-xbrbn") pod "ea849d11-eeaa-47ec-8110-b796daf77157" (UID: "ea849d11-eeaa-47ec-8110-b796daf77157"). InnerVolumeSpecName "kube-api-access-xbrbn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.268577 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.911719325 podStartE2EDuration="2.268559358s" podCreationTimestamp="2025-12-05 20:35:32 +0000 UTC" firstStartedPulling="2025-12-05 20:35:33.023959679 +0000 UTC m=+1431.835175788" lastFinishedPulling="2025-12-05 20:35:33.380799712 +0000 UTC m=+1432.192015821" observedRunningTime="2025-12-05 20:35:34.236730794 +0000 UTC m=+1433.047946903" watchObservedRunningTime="2025-12-05 20:35:34.268559358 +0000 UTC m=+1433.079775467" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.276890 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea849d11-eeaa-47ec-8110-b796daf77157-config-data" (OuterVolumeSpecName: "config-data") pod "ea849d11-eeaa-47ec-8110-b796daf77157" (UID: "ea849d11-eeaa-47ec-8110-b796daf77157"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.280736 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea849d11-eeaa-47ec-8110-b796daf77157-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea849d11-eeaa-47ec-8110-b796daf77157" (UID: "ea849d11-eeaa-47ec-8110-b796daf77157"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.292219 4904 scope.go:117] "RemoveContainer" containerID="a8a39402970cedec8b2f9e2e0cb4c89d6905421e8c40a99e7e7b5caf851c784d" Dec 05 20:35:34 crc kubenswrapper[4904]: E1205 20:35:34.293570 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8a39402970cedec8b2f9e2e0cb4c89d6905421e8c40a99e7e7b5caf851c784d\": container with ID starting with a8a39402970cedec8b2f9e2e0cb4c89d6905421e8c40a99e7e7b5caf851c784d not found: ID does not exist" containerID="a8a39402970cedec8b2f9e2e0cb4c89d6905421e8c40a99e7e7b5caf851c784d" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.293602 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8a39402970cedec8b2f9e2e0cb4c89d6905421e8c40a99e7e7b5caf851c784d"} err="failed to get container status \"a8a39402970cedec8b2f9e2e0cb4c89d6905421e8c40a99e7e7b5caf851c784d\": rpc error: code = NotFound desc = could not find container \"a8a39402970cedec8b2f9e2e0cb4c89d6905421e8c40a99e7e7b5caf851c784d\": container with ID starting with a8a39402970cedec8b2f9e2e0cb4c89d6905421e8c40a99e7e7b5caf851c784d not found: ID does not exist" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.330647 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea849d11-eeaa-47ec-8110-b796daf77157-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.330689 4904 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea849d11-eeaa-47ec-8110-b796daf77157-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.330713 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbrbn\" (UniqueName: 
\"kubernetes.io/projected/ea849d11-eeaa-47ec-8110-b796daf77157-kube-api-access-xbrbn\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.422176 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7774fc8c79-ms4p7"] Dec 05 20:35:34 crc kubenswrapper[4904]: E1205 20:35:34.422607 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea849d11-eeaa-47ec-8110-b796daf77157" containerName="nova-cell1-novncproxy-novncproxy" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.422625 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea849d11-eeaa-47ec-8110-b796daf77157" containerName="nova-cell1-novncproxy-novncproxy" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.422867 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea849d11-eeaa-47ec-8110-b796daf77157" containerName="nova-cell1-novncproxy-novncproxy" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.423926 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7774fc8c79-ms4p7" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.431122 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7774fc8c79-ms4p7"] Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.533524 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vln5\" (UniqueName: \"kubernetes.io/projected/a252ec50-71cf-4e22-b84a-8c247b695354-kube-api-access-5vln5\") pod \"dnsmasq-dns-7774fc8c79-ms4p7\" (UID: \"a252ec50-71cf-4e22-b84a-8c247b695354\") " pod="openstack/dnsmasq-dns-7774fc8c79-ms4p7" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.533819 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a252ec50-71cf-4e22-b84a-8c247b695354-config\") pod \"dnsmasq-dns-7774fc8c79-ms4p7\" (UID: \"a252ec50-71cf-4e22-b84a-8c247b695354\") " pod="openstack/dnsmasq-dns-7774fc8c79-ms4p7" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.533855 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a252ec50-71cf-4e22-b84a-8c247b695354-dns-svc\") pod \"dnsmasq-dns-7774fc8c79-ms4p7\" (UID: \"a252ec50-71cf-4e22-b84a-8c247b695354\") " pod="openstack/dnsmasq-dns-7774fc8c79-ms4p7" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.533910 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a252ec50-71cf-4e22-b84a-8c247b695354-ovsdbserver-sb\") pod \"dnsmasq-dns-7774fc8c79-ms4p7\" (UID: \"a252ec50-71cf-4e22-b84a-8c247b695354\") " pod="openstack/dnsmasq-dns-7774fc8c79-ms4p7" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.533932 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a252ec50-71cf-4e22-b84a-8c247b695354-ovsdbserver-nb\") pod \"dnsmasq-dns-7774fc8c79-ms4p7\" (UID: \"a252ec50-71cf-4e22-b84a-8c247b695354\") " pod="openstack/dnsmasq-dns-7774fc8c79-ms4p7" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.534002 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/a252ec50-71cf-4e22-b84a-8c247b695354-dns-swift-storage-0\") pod \"dnsmasq-dns-7774fc8c79-ms4p7\" (UID: \"a252ec50-71cf-4e22-b84a-8c247b695354\") " pod="openstack/dnsmasq-dns-7774fc8c79-ms4p7" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.607258 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.621154 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.633191 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.634757 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.635452 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a252ec50-71cf-4e22-b84a-8c247b695354-dns-swift-storage-0\") pod \"dnsmasq-dns-7774fc8c79-ms4p7\" (UID: \"a252ec50-71cf-4e22-b84a-8c247b695354\") " pod="openstack/dnsmasq-dns-7774fc8c79-ms4p7" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.635524 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vln5\" (UniqueName: \"kubernetes.io/projected/a252ec50-71cf-4e22-b84a-8c247b695354-kube-api-access-5vln5\") pod \"dnsmasq-dns-7774fc8c79-ms4p7\" (UID: \"a252ec50-71cf-4e22-b84a-8c247b695354\") " pod="openstack/dnsmasq-dns-7774fc8c79-ms4p7" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.635552 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a252ec50-71cf-4e22-b84a-8c247b695354-config\") pod \"dnsmasq-dns-7774fc8c79-ms4p7\" (UID: \"a252ec50-71cf-4e22-b84a-8c247b695354\") " pod="openstack/dnsmasq-dns-7774fc8c79-ms4p7" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.635590 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a252ec50-71cf-4e22-b84a-8c247b695354-dns-svc\") pod \"dnsmasq-dns-7774fc8c79-ms4p7\" (UID: \"a252ec50-71cf-4e22-b84a-8c247b695354\") " pod="openstack/dnsmasq-dns-7774fc8c79-ms4p7" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.635656 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a252ec50-71cf-4e22-b84a-8c247b695354-ovsdbserver-sb\") pod \"dnsmasq-dns-7774fc8c79-ms4p7\" (UID: \"a252ec50-71cf-4e22-b84a-8c247b695354\") " pod="openstack/dnsmasq-dns-7774fc8c79-ms4p7" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.635683 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a252ec50-71cf-4e22-b84a-8c247b695354-ovsdbserver-nb\") pod \"dnsmasq-dns-7774fc8c79-ms4p7\" (UID: \"a252ec50-71cf-4e22-b84a-8c247b695354\") " pod="openstack/dnsmasq-dns-7774fc8c79-ms4p7" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.636854 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a252ec50-71cf-4e22-b84a-8c247b695354-dns-swift-storage-0\") pod \"dnsmasq-dns-7774fc8c79-ms4p7\" (UID: \"a252ec50-71cf-4e22-b84a-8c247b695354\") " 
pod="openstack/dnsmasq-dns-7774fc8c79-ms4p7" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.637030 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a252ec50-71cf-4e22-b84a-8c247b695354-config\") pod \"dnsmasq-dns-7774fc8c79-ms4p7\" (UID: \"a252ec50-71cf-4e22-b84a-8c247b695354\") " pod="openstack/dnsmasq-dns-7774fc8c79-ms4p7" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.637648 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a252ec50-71cf-4e22-b84a-8c247b695354-dns-svc\") pod \"dnsmasq-dns-7774fc8c79-ms4p7\" (UID: \"a252ec50-71cf-4e22-b84a-8c247b695354\") " pod="openstack/dnsmasq-dns-7774fc8c79-ms4p7" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.638827 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.639112 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.639524 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.639747 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a252ec50-71cf-4e22-b84a-8c247b695354-ovsdbserver-sb\") pod \"dnsmasq-dns-7774fc8c79-ms4p7\" (UID: \"a252ec50-71cf-4e22-b84a-8c247b695354\") " pod="openstack/dnsmasq-dns-7774fc8c79-ms4p7" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.640797 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a252ec50-71cf-4e22-b84a-8c247b695354-ovsdbserver-nb\") pod \"dnsmasq-dns-7774fc8c79-ms4p7\" (UID: \"a252ec50-71cf-4e22-b84a-8c247b695354\") " pod="openstack/dnsmasq-dns-7774fc8c79-ms4p7" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.645936 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.670037 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vln5\" (UniqueName: \"kubernetes.io/projected/a252ec50-71cf-4e22-b84a-8c247b695354-kube-api-access-5vln5\") pod \"dnsmasq-dns-7774fc8c79-ms4p7\" (UID: \"a252ec50-71cf-4e22-b84a-8c247b695354\") " pod="openstack/dnsmasq-dns-7774fc8c79-ms4p7" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.737835 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bgpt\" (UniqueName: \"kubernetes.io/projected/8da8babf-f15a-4f70-bf70-23218ca0628a-kube-api-access-6bgpt\") pod \"nova-cell1-novncproxy-0\" (UID: \"8da8babf-f15a-4f70-bf70-23218ca0628a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.737950 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8da8babf-f15a-4f70-bf70-23218ca0628a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8da8babf-f15a-4f70-bf70-23218ca0628a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.737975 4904 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/8da8babf-f15a-4f70-bf70-23218ca0628a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8da8babf-f15a-4f70-bf70-23218ca0628a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.738007 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/8da8babf-f15a-4f70-bf70-23218ca0628a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8da8babf-f15a-4f70-bf70-23218ca0628a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.738024 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8da8babf-f15a-4f70-bf70-23218ca0628a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8da8babf-f15a-4f70-bf70-23218ca0628a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.748179 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7774fc8c79-ms4p7" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.839917 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8da8babf-f15a-4f70-bf70-23218ca0628a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8da8babf-f15a-4f70-bf70-23218ca0628a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.839981 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/8da8babf-f15a-4f70-bf70-23218ca0628a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8da8babf-f15a-4f70-bf70-23218ca0628a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.840053 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/8da8babf-f15a-4f70-bf70-23218ca0628a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8da8babf-f15a-4f70-bf70-23218ca0628a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.840221 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8da8babf-f15a-4f70-bf70-23218ca0628a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8da8babf-f15a-4f70-bf70-23218ca0628a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.840422 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bgpt\" (UniqueName: \"kubernetes.io/projected/8da8babf-f15a-4f70-bf70-23218ca0628a-kube-api-access-6bgpt\") pod \"nova-cell1-novncproxy-0\" (UID: \"8da8babf-f15a-4f70-bf70-23218ca0628a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.844367 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8da8babf-f15a-4f70-bf70-23218ca0628a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8da8babf-f15a-4f70-bf70-23218ca0628a\") " 
pod="openstack/nova-cell1-novncproxy-0" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.844402 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8da8babf-f15a-4f70-bf70-23218ca0628a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8da8babf-f15a-4f70-bf70-23218ca0628a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.846259 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/8da8babf-f15a-4f70-bf70-23218ca0628a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8da8babf-f15a-4f70-bf70-23218ca0628a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.846442 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/8da8babf-f15a-4f70-bf70-23218ca0628a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8da8babf-f15a-4f70-bf70-23218ca0628a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.860580 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bgpt\" (UniqueName: \"kubernetes.io/projected/8da8babf-f15a-4f70-bf70-23218ca0628a-kube-api-access-6bgpt\") pod \"nova-cell1-novncproxy-0\" (UID: \"8da8babf-f15a-4f70-bf70-23218ca0628a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 20:35:34 crc kubenswrapper[4904]: I1205 20:35:34.970443 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 20:35:35 crc kubenswrapper[4904]: I1205 20:35:35.274574 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7774fc8c79-ms4p7"] Dec 05 20:35:35 crc kubenswrapper[4904]: W1205 20:35:35.281194 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda252ec50_71cf_4e22_b84a_8c247b695354.slice/crio-dc19ec09fbfd3ce1c468eae52471ba79dcb5f5ab12fdfb8c014905790306487c WatchSource:0}: Error finding container dc19ec09fbfd3ce1c468eae52471ba79dcb5f5ab12fdfb8c014905790306487c: Status 404 returned error can't find the container with id dc19ec09fbfd3ce1c468eae52471ba79dcb5f5ab12fdfb8c014905790306487c Dec 05 20:35:35 crc kubenswrapper[4904]: I1205 20:35:35.499913 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 20:35:35 crc kubenswrapper[4904]: I1205 20:35:35.706538 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea849d11-eeaa-47ec-8110-b796daf77157" path="/var/lib/kubelet/pods/ea849d11-eeaa-47ec-8110-b796daf77157/volumes" Dec 05 20:35:36 crc kubenswrapper[4904]: I1205 20:35:36.243856 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8da8babf-f15a-4f70-bf70-23218ca0628a","Type":"ContainerStarted","Data":"01e3bcb5f59cfee2efcdebc705df9c9f03b97d54341c20d0d245cf58138e99e2"} Dec 05 20:35:36 crc kubenswrapper[4904]: I1205 20:35:36.244147 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8da8babf-f15a-4f70-bf70-23218ca0628a","Type":"ContainerStarted","Data":"191712563b63c0cdd64b77a043751e575f61ab81b239927217f3dcb7c0c1d68a"} Dec 05 20:35:36 crc kubenswrapper[4904]: I1205 20:35:36.245915 4904 
generic.go:334] "Generic (PLEG): container finished" podID="a252ec50-71cf-4e22-b84a-8c247b695354" containerID="621247310fcf03b400878d476b417299fd649660f3a7d2ab3202740dad2e9dc2" exitCode=0 Dec 05 20:35:36 crc kubenswrapper[4904]: I1205 20:35:36.245998 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7774fc8c79-ms4p7" event={"ID":"a252ec50-71cf-4e22-b84a-8c247b695354","Type":"ContainerDied","Data":"621247310fcf03b400878d476b417299fd649660f3a7d2ab3202740dad2e9dc2"} Dec 05 20:35:36 crc kubenswrapper[4904]: I1205 20:35:36.246040 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7774fc8c79-ms4p7" event={"ID":"a252ec50-71cf-4e22-b84a-8c247b695354","Type":"ContainerStarted","Data":"dc19ec09fbfd3ce1c468eae52471ba79dcb5f5ab12fdfb8c014905790306487c"} Dec 05 20:35:36 crc kubenswrapper[4904]: I1205 20:35:36.269515 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.269493538 podStartE2EDuration="2.269493538s" podCreationTimestamp="2025-12-05 20:35:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:35:36.266217422 +0000 UTC m=+1435.077433551" watchObservedRunningTime="2025-12-05 20:35:36.269493538 +0000 UTC m=+1435.080709657" Dec 05 20:35:37 crc kubenswrapper[4904]: I1205 20:35:37.218817 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 20:35:37 crc kubenswrapper[4904]: I1205 20:35:37.257656 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 20:35:37 crc kubenswrapper[4904]: I1205 20:35:37.257737 4904 generic.go:334] "Generic (PLEG): container finished" podID="63304fc4-9767-4efa-9002-47411daa12bd" containerID="631d4e7aeb6d777251fa6228610a1bfc779dce8db0b8a71d7b9a455ef5aad145" exitCode=0 Dec 05 20:35:37 crc kubenswrapper[4904]: I1205 20:35:37.257761 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63304fc4-9767-4efa-9002-47411daa12bd","Type":"ContainerDied","Data":"631d4e7aeb6d777251fa6228610a1bfc779dce8db0b8a71d7b9a455ef5aad145"} Dec 05 20:35:37 crc kubenswrapper[4904]: I1205 20:35:37.258154 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63304fc4-9767-4efa-9002-47411daa12bd","Type":"ContainerDied","Data":"912b2b42c13047f3df2954502536ca7c3adc090216b1a37a1dd35ab189c4e399"} Dec 05 20:35:37 crc kubenswrapper[4904]: I1205 20:35:37.258185 4904 scope.go:117] "RemoveContainer" containerID="c81738e2f532d2f5a2cce99108a71b291982e5029858bf10ee747ca334f4303a" Dec 05 20:35:37 crc kubenswrapper[4904]: I1205 20:35:37.264153 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7774fc8c79-ms4p7" event={"ID":"a252ec50-71cf-4e22-b84a-8c247b695354","Type":"ContainerStarted","Data":"447a5335e06188527c763f74532686b32d439ef0e380513c3b2d6af75e0bf8d1"} Dec 05 20:35:37 crc kubenswrapper[4904]: I1205 20:35:37.264191 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7774fc8c79-ms4p7" Dec 05 20:35:37 crc kubenswrapper[4904]: I1205 20:35:37.264331 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d7f7b746-1f4b-4cd0-83aa-501b0c187c0f" containerName="nova-api-log" containerID="cri-o://26da1ace0721a44d16a24ce375ed0c83d57c566aaae84e2a53b0ec2d3785ea79" 
gracePeriod=30 Dec 05 20:35:37 crc kubenswrapper[4904]: I1205 20:35:37.264447 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d7f7b746-1f4b-4cd0-83aa-501b0c187c0f" containerName="nova-api-api" containerID="cri-o://f8a82556a70c4cad4fdc4319693ce3a2556cbb7c04202d9a8851572ed5fde6d0" gracePeriod=30 Dec 05 20:35:37 crc kubenswrapper[4904]: I1205 20:35:37.288039 4904 scope.go:117] "RemoveContainer" containerID="8406e73ff67ca3784af13fccc2f488ac9cbc5f0fcb50a9dcb2dc0c90545a7a9e" Dec 05 20:35:37 crc kubenswrapper[4904]: I1205 20:35:37.338295 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7774fc8c79-ms4p7" podStartSLOduration=3.338273406 podStartE2EDuration="3.338273406s" podCreationTimestamp="2025-12-05 20:35:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:35:37.302103021 +0000 UTC m=+1436.113319130" watchObservedRunningTime="2025-12-05 20:35:37.338273406 +0000 UTC m=+1436.149489515" Dec 05 20:35:37 crc kubenswrapper[4904]: I1205 20:35:37.357848 4904 scope.go:117] "RemoveContainer" containerID="631d4e7aeb6d777251fa6228610a1bfc779dce8db0b8a71d7b9a455ef5aad145" Dec 05 20:35:37 crc kubenswrapper[4904]: I1205 20:35:37.401012 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/63304fc4-9767-4efa-9002-47411daa12bd-sg-core-conf-yaml\") pod \"63304fc4-9767-4efa-9002-47411daa12bd\" (UID: \"63304fc4-9767-4efa-9002-47411daa12bd\") " Dec 05 20:35:37 crc kubenswrapper[4904]: I1205 20:35:37.401909 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63304fc4-9767-4efa-9002-47411daa12bd-scripts\") pod \"63304fc4-9767-4efa-9002-47411daa12bd\" (UID: \"63304fc4-9767-4efa-9002-47411daa12bd\") " Dec 05 20:35:37 crc kubenswrapper[4904]: I1205 20:35:37.401939 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63304fc4-9767-4efa-9002-47411daa12bd-config-data\") pod \"63304fc4-9767-4efa-9002-47411daa12bd\" (UID: \"63304fc4-9767-4efa-9002-47411daa12bd\") " Dec 05 20:35:37 crc kubenswrapper[4904]: I1205 20:35:37.402044 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63304fc4-9767-4efa-9002-47411daa12bd-run-httpd\") pod \"63304fc4-9767-4efa-9002-47411daa12bd\" (UID: \"63304fc4-9767-4efa-9002-47411daa12bd\") " Dec 05 20:35:37 crc kubenswrapper[4904]: I1205 20:35:37.402118 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8z8f\" (UniqueName: \"kubernetes.io/projected/63304fc4-9767-4efa-9002-47411daa12bd-kube-api-access-b8z8f\") pod \"63304fc4-9767-4efa-9002-47411daa12bd\" (UID: \"63304fc4-9767-4efa-9002-47411daa12bd\") " Dec 05 20:35:37 crc kubenswrapper[4904]: I1205 20:35:37.402179 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63304fc4-9767-4efa-9002-47411daa12bd-combined-ca-bundle\") pod \"63304fc4-9767-4efa-9002-47411daa12bd\" (UID: \"63304fc4-9767-4efa-9002-47411daa12bd\") " Dec 05 20:35:37 crc kubenswrapper[4904]: I1205 20:35:37.402237 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/63304fc4-9767-4efa-9002-47411daa12bd-log-httpd\") pod \"63304fc4-9767-4efa-9002-47411daa12bd\" (UID: \"63304fc4-9767-4efa-9002-47411daa12bd\") " Dec 05 20:35:37 crc kubenswrapper[4904]: I1205 20:35:37.405092 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63304fc4-9767-4efa-9002-47411daa12bd-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "63304fc4-9767-4efa-9002-47411daa12bd" (UID: "63304fc4-9767-4efa-9002-47411daa12bd"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:35:37 crc kubenswrapper[4904]: I1205 20:35:37.405714 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63304fc4-9767-4efa-9002-47411daa12bd-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "63304fc4-9767-4efa-9002-47411daa12bd" (UID: "63304fc4-9767-4efa-9002-47411daa12bd"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:35:37 crc kubenswrapper[4904]: I1205 20:35:37.408286 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63304fc4-9767-4efa-9002-47411daa12bd-scripts" (OuterVolumeSpecName: "scripts") pod "63304fc4-9767-4efa-9002-47411daa12bd" (UID: "63304fc4-9767-4efa-9002-47411daa12bd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:37 crc kubenswrapper[4904]: I1205 20:35:37.408291 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63304fc4-9767-4efa-9002-47411daa12bd-kube-api-access-b8z8f" (OuterVolumeSpecName: "kube-api-access-b8z8f") pod "63304fc4-9767-4efa-9002-47411daa12bd" (UID: "63304fc4-9767-4efa-9002-47411daa12bd"). InnerVolumeSpecName "kube-api-access-b8z8f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:35:37 crc kubenswrapper[4904]: I1205 20:35:37.409015 4904 scope.go:117] "RemoveContainer" containerID="f96a3e12ac921544aa396c27abc113e3e88a8d7eccf35601b6d15a751b7ef25e" Dec 05 20:35:37 crc kubenswrapper[4904]: I1205 20:35:37.456225 4904 scope.go:117] "RemoveContainer" containerID="c81738e2f532d2f5a2cce99108a71b291982e5029858bf10ee747ca334f4303a" Dec 05 20:35:37 crc kubenswrapper[4904]: E1205 20:35:37.456792 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c81738e2f532d2f5a2cce99108a71b291982e5029858bf10ee747ca334f4303a\": container with ID starting with c81738e2f532d2f5a2cce99108a71b291982e5029858bf10ee747ca334f4303a not found: ID does not exist" containerID="c81738e2f532d2f5a2cce99108a71b291982e5029858bf10ee747ca334f4303a" Dec 05 20:35:37 crc kubenswrapper[4904]: I1205 20:35:37.456831 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c81738e2f532d2f5a2cce99108a71b291982e5029858bf10ee747ca334f4303a"} err="failed to get container status \"c81738e2f532d2f5a2cce99108a71b291982e5029858bf10ee747ca334f4303a\": rpc error: code = NotFound desc = could not find container \"c81738e2f532d2f5a2cce99108a71b291982e5029858bf10ee747ca334f4303a\": container with ID starting with c81738e2f532d2f5a2cce99108a71b291982e5029858bf10ee747ca334f4303a not found: ID does not exist" Dec 05 20:35:37 crc kubenswrapper[4904]: I1205 20:35:37.456922 4904 scope.go:117] "RemoveContainer" containerID="8406e73ff67ca3784af13fccc2f488ac9cbc5f0fcb50a9dcb2dc0c90545a7a9e" Dec 05 20:35:37 crc kubenswrapper[4904]: E1205 20:35:37.457216 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8406e73ff67ca3784af13fccc2f488ac9cbc5f0fcb50a9dcb2dc0c90545a7a9e\": container with ID starting with 8406e73ff67ca3784af13fccc2f488ac9cbc5f0fcb50a9dcb2dc0c90545a7a9e not found: ID does not exist" containerID="8406e73ff67ca3784af13fccc2f488ac9cbc5f0fcb50a9dcb2dc0c90545a7a9e" Dec 05 20:35:37 crc kubenswrapper[4904]: I1205 20:35:37.457236 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8406e73ff67ca3784af13fccc2f488ac9cbc5f0fcb50a9dcb2dc0c90545a7a9e"} err="failed to get container status \"8406e73ff67ca3784af13fccc2f488ac9cbc5f0fcb50a9dcb2dc0c90545a7a9e\": rpc error: code = NotFound desc = could not find container \"8406e73ff67ca3784af13fccc2f488ac9cbc5f0fcb50a9dcb2dc0c90545a7a9e\": container with ID starting with 8406e73ff67ca3784af13fccc2f488ac9cbc5f0fcb50a9dcb2dc0c90545a7a9e not found: ID does not exist" Dec 05 20:35:37 crc kubenswrapper[4904]: I1205 20:35:37.457251 4904 scope.go:117] "RemoveContainer" containerID="631d4e7aeb6d777251fa6228610a1bfc779dce8db0b8a71d7b9a455ef5aad145" Dec 05 20:35:37 crc kubenswrapper[4904]: E1205 20:35:37.457453 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"631d4e7aeb6d777251fa6228610a1bfc779dce8db0b8a71d7b9a455ef5aad145\": container with ID starting with 631d4e7aeb6d777251fa6228610a1bfc779dce8db0b8a71d7b9a455ef5aad145 not found: ID does not exist" containerID="631d4e7aeb6d777251fa6228610a1bfc779dce8db0b8a71d7b9a455ef5aad145" Dec 05 20:35:37 crc kubenswrapper[4904]: I1205 20:35:37.457481 4904 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"631d4e7aeb6d777251fa6228610a1bfc779dce8db0b8a71d7b9a455ef5aad145"} err="failed to get container status \"631d4e7aeb6d777251fa6228610a1bfc779dce8db0b8a71d7b9a455ef5aad145\": rpc error: code = NotFound desc = could not find container \"631d4e7aeb6d777251fa6228610a1bfc779dce8db0b8a71d7b9a455ef5aad145\": container with ID starting with 631d4e7aeb6d777251fa6228610a1bfc779dce8db0b8a71d7b9a455ef5aad145 not found: ID does not exist" Dec 05 20:35:37 crc kubenswrapper[4904]: I1205 20:35:37.457493 4904 scope.go:117] "RemoveContainer" containerID="f96a3e12ac921544aa396c27abc113e3e88a8d7eccf35601b6d15a751b7ef25e" Dec 05 20:35:37 crc kubenswrapper[4904]: E1205 20:35:37.457845 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f96a3e12ac921544aa396c27abc113e3e88a8d7eccf35601b6d15a751b7ef25e\": container with ID starting with f96a3e12ac921544aa396c27abc113e3e88a8d7eccf35601b6d15a751b7ef25e not found: ID does not exist" containerID="f96a3e12ac921544aa396c27abc113e3e88a8d7eccf35601b6d15a751b7ef25e" Dec 05 20:35:37 crc kubenswrapper[4904]: I1205 20:35:37.457875 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f96a3e12ac921544aa396c27abc113e3e88a8d7eccf35601b6d15a751b7ef25e"} err="failed to get container status \"f96a3e12ac921544aa396c27abc113e3e88a8d7eccf35601b6d15a751b7ef25e\": rpc error: code = NotFound desc = could not find container \"f96a3e12ac921544aa396c27abc113e3e88a8d7eccf35601b6d15a751b7ef25e\": container with ID starting with f96a3e12ac921544aa396c27abc113e3e88a8d7eccf35601b6d15a751b7ef25e not found: ID does not exist" Dec 05 20:35:37 crc kubenswrapper[4904]: I1205 20:35:37.467301 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63304fc4-9767-4efa-9002-47411daa12bd-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "63304fc4-9767-4efa-9002-47411daa12bd" (UID: "63304fc4-9767-4efa-9002-47411daa12bd"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:37 crc kubenswrapper[4904]: I1205 20:35:37.486596 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63304fc4-9767-4efa-9002-47411daa12bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63304fc4-9767-4efa-9002-47411daa12bd" (UID: "63304fc4-9767-4efa-9002-47411daa12bd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:37 crc kubenswrapper[4904]: I1205 20:35:37.510003 4904 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63304fc4-9767-4efa-9002-47411daa12bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:37 crc kubenswrapper[4904]: I1205 20:35:37.510048 4904 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63304fc4-9767-4efa-9002-47411daa12bd-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:37 crc kubenswrapper[4904]: I1205 20:35:37.510069 4904 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/63304fc4-9767-4efa-9002-47411daa12bd-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:37 crc kubenswrapper[4904]: I1205 20:35:37.510078 4904 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63304fc4-9767-4efa-9002-47411daa12bd-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:37 crc kubenswrapper[4904]: I1205 20:35:37.510086 4904 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63304fc4-9767-4efa-9002-47411daa12bd-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:37 crc kubenswrapper[4904]: I1205 20:35:37.510096 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8z8f\" (UniqueName: \"kubernetes.io/projected/63304fc4-9767-4efa-9002-47411daa12bd-kube-api-access-b8z8f\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:37 crc kubenswrapper[4904]: I1205 20:35:37.563932 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63304fc4-9767-4efa-9002-47411daa12bd-config-data" (OuterVolumeSpecName: "config-data") pod "63304fc4-9767-4efa-9002-47411daa12bd" (UID: "63304fc4-9767-4efa-9002-47411daa12bd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:37 crc kubenswrapper[4904]: I1205 20:35:37.611747 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63304fc4-9767-4efa-9002-47411daa12bd-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:38 crc kubenswrapper[4904]: I1205 20:35:38.275868 4904 generic.go:334] "Generic (PLEG): container finished" podID="d7f7b746-1f4b-4cd0-83aa-501b0c187c0f" containerID="26da1ace0721a44d16a24ce375ed0c83d57c566aaae84e2a53b0ec2d3785ea79" exitCode=143 Dec 05 20:35:38 crc kubenswrapper[4904]: I1205 20:35:38.275969 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d7f7b746-1f4b-4cd0-83aa-501b0c187c0f","Type":"ContainerDied","Data":"26da1ace0721a44d16a24ce375ed0c83d57c566aaae84e2a53b0ec2d3785ea79"} Dec 05 20:35:38 crc kubenswrapper[4904]: I1205 20:35:38.278408 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4904]: I1205 20:35:38.338962 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:35:38 crc kubenswrapper[4904]: I1205 20:35:38.360526 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:35:38 crc kubenswrapper[4904]: I1205 20:35:38.372975 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:35:38 crc kubenswrapper[4904]: E1205 20:35:38.373539 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63304fc4-9767-4efa-9002-47411daa12bd" containerName="ceilometer-notification-agent" Dec 05 20:35:38 crc kubenswrapper[4904]: I1205 20:35:38.373556 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="63304fc4-9767-4efa-9002-47411daa12bd" containerName="ceilometer-notification-agent" Dec 05 20:35:38 crc kubenswrapper[4904]: E1205 20:35:38.373589 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63304fc4-9767-4efa-9002-47411daa12bd" containerName="proxy-httpd" Dec 05 20:35:38 crc kubenswrapper[4904]: I1205 20:35:38.373598 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="63304fc4-9767-4efa-9002-47411daa12bd" containerName="proxy-httpd" Dec 05 20:35:38 crc kubenswrapper[4904]: E1205 20:35:38.373616 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63304fc4-9767-4efa-9002-47411daa12bd" containerName="sg-core" Dec 05 20:35:38 crc kubenswrapper[4904]: I1205 20:35:38.373624 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="63304fc4-9767-4efa-9002-47411daa12bd" containerName="sg-core" Dec 05 20:35:38 crc kubenswrapper[4904]: E1205 20:35:38.373651 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63304fc4-9767-4efa-9002-47411daa12bd" containerName="ceilometer-central-agent" Dec 05 20:35:38 crc kubenswrapper[4904]: I1205 20:35:38.373660 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="63304fc4-9767-4efa-9002-47411daa12bd" containerName="ceilometer-central-agent" Dec 05 20:35:38 crc kubenswrapper[4904]: I1205 20:35:38.373906 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="63304fc4-9767-4efa-9002-47411daa12bd" containerName="sg-core" Dec 05 20:35:38 crc kubenswrapper[4904]: I1205 20:35:38.373936 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="63304fc4-9767-4efa-9002-47411daa12bd" containerName="ceilometer-notification-agent" Dec 05 20:35:38 crc kubenswrapper[4904]: I1205 20:35:38.373957 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="63304fc4-9767-4efa-9002-47411daa12bd" containerName="proxy-httpd" Dec 05 20:35:38 crc kubenswrapper[4904]: I1205 20:35:38.373970 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="63304fc4-9767-4efa-9002-47411daa12bd" containerName="ceilometer-central-agent" Dec 05 20:35:38 crc kubenswrapper[4904]: I1205 20:35:38.376386 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4904]: I1205 20:35:38.393411 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 20:35:38 crc kubenswrapper[4904]: I1205 20:35:38.393662 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 05 20:35:38 crc kubenswrapper[4904]: I1205 20:35:38.394953 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 20:35:38 crc kubenswrapper[4904]: I1205 20:35:38.397938 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:35:38 crc kubenswrapper[4904]: I1205 20:35:38.529077 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9\") " pod="openstack/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4904]: I1205 20:35:38.529128 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9\") " pod="openstack/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4904]: I1205 20:35:38.529156 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-log-httpd\") pod \"ceilometer-0\" (UID: \"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9\") " pod="openstack/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4904]: I1205 20:35:38.529254 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-config-data\") pod \"ceilometer-0\" (UID: \"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9\") " pod="openstack/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4904]: I1205 20:35:38.529366 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9\") " pod="openstack/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4904]: I1205 20:35:38.529483 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-run-httpd\") pod \"ceilometer-0\" (UID: \"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9\") " pod="openstack/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4904]: I1205 20:35:38.529566 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9qgn\" (UniqueName: \"kubernetes.io/projected/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-kube-api-access-h9qgn\") pod \"ceilometer-0\" (UID: \"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9\") " pod="openstack/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4904]: I1205 20:35:38.529642 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-scripts\") pod \"ceilometer-0\" (UID: \"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9\") " pod="openstack/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4904]: I1205 20:35:38.631813 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-run-httpd\") pod \"ceilometer-0\" (UID: \"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9\") " pod="openstack/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4904]: I1205 20:35:38.631878 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9qgn\" (UniqueName: \"kubernetes.io/projected/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-kube-api-access-h9qgn\") pod \"ceilometer-0\" (UID: \"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9\") " pod="openstack/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4904]: I1205 20:35:38.631914 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-scripts\") pod \"ceilometer-0\" (UID: \"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9\") " pod="openstack/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4904]: I1205 20:35:38.631977 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9\") " pod="openstack/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4904]: I1205 20:35:38.632007 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9\") " pod="openstack/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4904]: I1205 20:35:38.632035 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-log-httpd\") pod \"ceilometer-0\" (UID: \"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9\") " pod="openstack/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4904]: I1205 20:35:38.632050 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-config-data\") pod \"ceilometer-0\" (UID: \"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9\") " pod="openstack/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4904]: I1205 20:35:38.632108 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9\") " pod="openstack/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4904]: I1205 20:35:38.632554 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-run-httpd\") pod \"ceilometer-0\" (UID: \"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9\") " pod="openstack/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4904]: I1205 20:35:38.632918 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-log-httpd\") pod \"ceilometer-0\" (UID: \"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9\") " pod="openstack/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4904]: I1205 20:35:38.638331 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9\") " pod="openstack/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4904]: I1205 20:35:38.638928 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9\") " pod="openstack/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4904]: I1205 20:35:38.639412 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-config-data\") pod \"ceilometer-0\" (UID: \"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9\") " pod="openstack/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4904]: I1205 20:35:38.641443 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-scripts\") pod \"ceilometer-0\" (UID: \"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9\") " pod="openstack/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4904]: I1205 20:35:38.643098 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9\") " pod="openstack/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4904]: I1205 20:35:38.652457 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9qgn\" (UniqueName: \"kubernetes.io/projected/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-kube-api-access-h9qgn\") pod \"ceilometer-0\" (UID: \"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9\") " pod="openstack/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4904]: I1205 20:35:38.765167 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4904]: I1205 20:35:38.900666 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.038658 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7f7b746-1f4b-4cd0-83aa-501b0c187c0f-combined-ca-bundle\") pod \"d7f7b746-1f4b-4cd0-83aa-501b0c187c0f\" (UID: \"d7f7b746-1f4b-4cd0-83aa-501b0c187c0f\") " Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.039390 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7f7b746-1f4b-4cd0-83aa-501b0c187c0f-logs\") pod \"d7f7b746-1f4b-4cd0-83aa-501b0c187c0f\" (UID: \"d7f7b746-1f4b-4cd0-83aa-501b0c187c0f\") " Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.039469 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkrbz\" (UniqueName: \"kubernetes.io/projected/d7f7b746-1f4b-4cd0-83aa-501b0c187c0f-kube-api-access-jkrbz\") pod \"d7f7b746-1f4b-4cd0-83aa-501b0c187c0f\" (UID: \"d7f7b746-1f4b-4cd0-83aa-501b0c187c0f\") " Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.039578 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7f7b746-1f4b-4cd0-83aa-501b0c187c0f-config-data\") pod \"d7f7b746-1f4b-4cd0-83aa-501b0c187c0f\" (UID: \"d7f7b746-1f4b-4cd0-83aa-501b0c187c0f\") " Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.040730 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7f7b746-1f4b-4cd0-83aa-501b0c187c0f-logs" (OuterVolumeSpecName: "logs") pod "d7f7b746-1f4b-4cd0-83aa-501b0c187c0f" (UID: "d7f7b746-1f4b-4cd0-83aa-501b0c187c0f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.045496 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7f7b746-1f4b-4cd0-83aa-501b0c187c0f-kube-api-access-jkrbz" (OuterVolumeSpecName: "kube-api-access-jkrbz") pod "d7f7b746-1f4b-4cd0-83aa-501b0c187c0f" (UID: "d7f7b746-1f4b-4cd0-83aa-501b0c187c0f"). InnerVolumeSpecName "kube-api-access-jkrbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.073355 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7f7b746-1f4b-4cd0-83aa-501b0c187c0f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7f7b746-1f4b-4cd0-83aa-501b0c187c0f" (UID: "d7f7b746-1f4b-4cd0-83aa-501b0c187c0f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.077849 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7f7b746-1f4b-4cd0-83aa-501b0c187c0f-config-data" (OuterVolumeSpecName: "config-data") pod "d7f7b746-1f4b-4cd0-83aa-501b0c187c0f" (UID: "d7f7b746-1f4b-4cd0-83aa-501b0c187c0f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.142366 4904 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7f7b746-1f4b-4cd0-83aa-501b0c187c0f-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.142396 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkrbz\" (UniqueName: \"kubernetes.io/projected/d7f7b746-1f4b-4cd0-83aa-501b0c187c0f-kube-api-access-jkrbz\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.142407 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7f7b746-1f4b-4cd0-83aa-501b0c187c0f-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.142416 4904 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7f7b746-1f4b-4cd0-83aa-501b0c187c0f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.241894 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.288366 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9","Type":"ContainerStarted","Data":"8f56a0ff7678cdf2735a89c41fa7159126010756ea988b2eb3c59ccab8820956"} Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.291333 4904 generic.go:334] "Generic (PLEG): container finished" podID="d7f7b746-1f4b-4cd0-83aa-501b0c187c0f" containerID="f8a82556a70c4cad4fdc4319693ce3a2556cbb7c04202d9a8851572ed5fde6d0" exitCode=0 Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.291403 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.291411 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d7f7b746-1f4b-4cd0-83aa-501b0c187c0f","Type":"ContainerDied","Data":"f8a82556a70c4cad4fdc4319693ce3a2556cbb7c04202d9a8851572ed5fde6d0"} Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.291461 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d7f7b746-1f4b-4cd0-83aa-501b0c187c0f","Type":"ContainerDied","Data":"404229fce7fd5d06999ed94bf8b4cbdf9af32534a8ddba544e275b512724a0ad"} Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.291480 4904 scope.go:117] "RemoveContainer" containerID="f8a82556a70c4cad4fdc4319693ce3a2556cbb7c04202d9a8851572ed5fde6d0" Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.331475 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.351016 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.354156 4904 scope.go:117] "RemoveContainer" containerID="26da1ace0721a44d16a24ce375ed0c83d57c566aaae84e2a53b0ec2d3785ea79" Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.378916 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.392040 4904 scope.go:117] "RemoveContainer" containerID="f8a82556a70c4cad4fdc4319693ce3a2556cbb7c04202d9a8851572ed5fde6d0" Dec 05 20:35:39 crc kubenswrapper[4904]: E1205 20:35:39.393294 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8a82556a70c4cad4fdc4319693ce3a2556cbb7c04202d9a8851572ed5fde6d0\": container with ID starting with f8a82556a70c4cad4fdc4319693ce3a2556cbb7c04202d9a8851572ed5fde6d0 not found: ID does not exist" containerID="f8a82556a70c4cad4fdc4319693ce3a2556cbb7c04202d9a8851572ed5fde6d0" Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.393338 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8a82556a70c4cad4fdc4319693ce3a2556cbb7c04202d9a8851572ed5fde6d0"} err="failed to get container status \"f8a82556a70c4cad4fdc4319693ce3a2556cbb7c04202d9a8851572ed5fde6d0\": rpc error: code = NotFound desc = could not find container \"f8a82556a70c4cad4fdc4319693ce3a2556cbb7c04202d9a8851572ed5fde6d0\": container with ID starting with f8a82556a70c4cad4fdc4319693ce3a2556cbb7c04202d9a8851572ed5fde6d0 not found: ID does not exist" Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.393365 4904 scope.go:117] "RemoveContainer" containerID="26da1ace0721a44d16a24ce375ed0c83d57c566aaae84e2a53b0ec2d3785ea79" Dec 05 20:35:39 crc kubenswrapper[4904]: E1205 20:35:39.394252 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26da1ace0721a44d16a24ce375ed0c83d57c566aaae84e2a53b0ec2d3785ea79\": container with ID starting with 26da1ace0721a44d16a24ce375ed0c83d57c566aaae84e2a53b0ec2d3785ea79 not found: ID does not exist" containerID="26da1ace0721a44d16a24ce375ed0c83d57c566aaae84e2a53b0ec2d3785ea79" Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.394352 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26da1ace0721a44d16a24ce375ed0c83d57c566aaae84e2a53b0ec2d3785ea79"} 
err="failed to get container status \"26da1ace0721a44d16a24ce375ed0c83d57c566aaae84e2a53b0ec2d3785ea79\": rpc error: code = NotFound desc = could not find container \"26da1ace0721a44d16a24ce375ed0c83d57c566aaae84e2a53b0ec2d3785ea79\": container with ID starting with 26da1ace0721a44d16a24ce375ed0c83d57c566aaae84e2a53b0ec2d3785ea79 not found: ID does not exist" Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.400665 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 05 20:35:39 crc kubenswrapper[4904]: E1205 20:35:39.401621 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7f7b746-1f4b-4cd0-83aa-501b0c187c0f" containerName="nova-api-log" Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.401705 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7f7b746-1f4b-4cd0-83aa-501b0c187c0f" containerName="nova-api-log" Dec 05 20:35:39 crc kubenswrapper[4904]: E1205 20:35:39.401789 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7f7b746-1f4b-4cd0-83aa-501b0c187c0f" containerName="nova-api-api" Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.401845 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7f7b746-1f4b-4cd0-83aa-501b0c187c0f" containerName="nova-api-api" Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.402093 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7f7b746-1f4b-4cd0-83aa-501b0c187c0f" containerName="nova-api-api" Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.402172 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7f7b746-1f4b-4cd0-83aa-501b0c187c0f" containerName="nova-api-log" Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.403388 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.419473 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.432229 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.432366 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.432520 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.554438 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0047c990-60cf-4de3-9fe2-6b585c11a8fd-config-data\") pod \"nova-api-0\" (UID: \"0047c990-60cf-4de3-9fe2-6b585c11a8fd\") " pod="openstack/nova-api-0" Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.554744 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0047c990-60cf-4de3-9fe2-6b585c11a8fd-public-tls-certs\") pod \"nova-api-0\" (UID: \"0047c990-60cf-4de3-9fe2-6b585c11a8fd\") " pod="openstack/nova-api-0" Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.554834 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znz47\" (UniqueName: \"kubernetes.io/projected/0047c990-60cf-4de3-9fe2-6b585c11a8fd-kube-api-access-znz47\") pod \"nova-api-0\" (UID: \"0047c990-60cf-4de3-9fe2-6b585c11a8fd\") " pod="openstack/nova-api-0" Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.554857 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0047c990-60cf-4de3-9fe2-6b585c11a8fd-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0047c990-60cf-4de3-9fe2-6b585c11a8fd\") " pod="openstack/nova-api-0" Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.555016 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0047c990-60cf-4de3-9fe2-6b585c11a8fd-logs\") pod \"nova-api-0\" (UID: \"0047c990-60cf-4de3-9fe2-6b585c11a8fd\") " pod="openstack/nova-api-0" Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.555236 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0047c990-60cf-4de3-9fe2-6b585c11a8fd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0047c990-60cf-4de3-9fe2-6b585c11a8fd\") " pod="openstack/nova-api-0" Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.656888 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0047c990-60cf-4de3-9fe2-6b585c11a8fd-public-tls-certs\") pod \"nova-api-0\" (UID: \"0047c990-60cf-4de3-9fe2-6b585c11a8fd\") " pod="openstack/nova-api-0" Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.657196 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znz47\" (UniqueName: \"kubernetes.io/projected/0047c990-60cf-4de3-9fe2-6b585c11a8fd-kube-api-access-znz47\") pod 
\"nova-api-0\" (UID: \"0047c990-60cf-4de3-9fe2-6b585c11a8fd\") " pod="openstack/nova-api-0" Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.657243 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0047c990-60cf-4de3-9fe2-6b585c11a8fd-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0047c990-60cf-4de3-9fe2-6b585c11a8fd\") " pod="openstack/nova-api-0" Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.657288 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0047c990-60cf-4de3-9fe2-6b585c11a8fd-logs\") pod \"nova-api-0\" (UID: \"0047c990-60cf-4de3-9fe2-6b585c11a8fd\") " pod="openstack/nova-api-0" Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.657345 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0047c990-60cf-4de3-9fe2-6b585c11a8fd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0047c990-60cf-4de3-9fe2-6b585c11a8fd\") " pod="openstack/nova-api-0" Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.657511 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0047c990-60cf-4de3-9fe2-6b585c11a8fd-config-data\") pod \"nova-api-0\" (UID: \"0047c990-60cf-4de3-9fe2-6b585c11a8fd\") " pod="openstack/nova-api-0" Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.658633 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0047c990-60cf-4de3-9fe2-6b585c11a8fd-logs\") pod \"nova-api-0\" (UID: \"0047c990-60cf-4de3-9fe2-6b585c11a8fd\") " pod="openstack/nova-api-0" Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.662268 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0047c990-60cf-4de3-9fe2-6b585c11a8fd-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0047c990-60cf-4de3-9fe2-6b585c11a8fd\") " pod="openstack/nova-api-0" Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.662372 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0047c990-60cf-4de3-9fe2-6b585c11a8fd-public-tls-certs\") pod \"nova-api-0\" (UID: \"0047c990-60cf-4de3-9fe2-6b585c11a8fd\") " pod="openstack/nova-api-0" Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.662848 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0047c990-60cf-4de3-9fe2-6b585c11a8fd-config-data\") pod \"nova-api-0\" (UID: \"0047c990-60cf-4de3-9fe2-6b585c11a8fd\") " pod="openstack/nova-api-0" Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.663447 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0047c990-60cf-4de3-9fe2-6b585c11a8fd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0047c990-60cf-4de3-9fe2-6b585c11a8fd\") " pod="openstack/nova-api-0" Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.677623 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znz47\" (UniqueName: \"kubernetes.io/projected/0047c990-60cf-4de3-9fe2-6b585c11a8fd-kube-api-access-znz47\") pod \"nova-api-0\" (UID: \"0047c990-60cf-4de3-9fe2-6b585c11a8fd\") " pod="openstack/nova-api-0" Dec 
05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.694336 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63304fc4-9767-4efa-9002-47411daa12bd" path="/var/lib/kubelet/pods/63304fc4-9767-4efa-9002-47411daa12bd/volumes" Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.695133 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7f7b746-1f4b-4cd0-83aa-501b0c187c0f" path="/var/lib/kubelet/pods/d7f7b746-1f4b-4cd0-83aa-501b0c187c0f/volumes" Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.767185 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 20:35:39 crc kubenswrapper[4904]: I1205 20:35:39.970783 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 05 20:35:40 crc kubenswrapper[4904]: W1205 20:35:40.298984 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0047c990_60cf_4de3_9fe2_6b585c11a8fd.slice/crio-a09d794d0ffc49e6f1c85a42e9ada0460978515778170593535995eb62a1be6e WatchSource:0}: Error finding container a09d794d0ffc49e6f1c85a42e9ada0460978515778170593535995eb62a1be6e: Status 404 returned error can't find the container with id a09d794d0ffc49e6f1c85a42e9ada0460978515778170593535995eb62a1be6e Dec 05 20:35:40 crc kubenswrapper[4904]: I1205 20:35:40.302752 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9","Type":"ContainerStarted","Data":"5cf2c4a284ee12daa5d3ac20ddb0e53eda6aec8f29a93fce344abd97b71ba278"} Dec 05 20:35:40 crc kubenswrapper[4904]: I1205 20:35:40.302787 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9","Type":"ContainerStarted","Data":"d4be71109404fa0cd31a29c88e29839e750d18539842b657f83361bb4a752449"} Dec 05 20:35:40 crc kubenswrapper[4904]: I1205 20:35:40.310266 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 20:35:41 crc kubenswrapper[4904]: I1205 20:35:41.317996 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0047c990-60cf-4de3-9fe2-6b585c11a8fd","Type":"ContainerStarted","Data":"816cc2c454acb008ef7821bf8531efc900eace41cb6fb0a4c6c07e82ab83a092"} Dec 05 20:35:41 crc kubenswrapper[4904]: I1205 20:35:41.318389 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0047c990-60cf-4de3-9fe2-6b585c11a8fd","Type":"ContainerStarted","Data":"1fa5a105379034524408ff154eecfa631b71bff9f5f1a304031966639ce24b94"} Dec 05 20:35:41 crc kubenswrapper[4904]: I1205 20:35:41.318427 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0047c990-60cf-4de3-9fe2-6b585c11a8fd","Type":"ContainerStarted","Data":"a09d794d0ffc49e6f1c85a42e9ada0460978515778170593535995eb62a1be6e"} Dec 05 20:35:41 crc kubenswrapper[4904]: I1205 20:35:41.329321 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9","Type":"ContainerStarted","Data":"5afd5223589741e8a36a7b2ea4202dc48b666e31a1039b6a1bf1417d2624261f"} Dec 05 20:35:41 crc kubenswrapper[4904]: I1205 20:35:41.345879 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.345827422 podStartE2EDuration="2.345827422s" 
podCreationTimestamp="2025-12-05 20:35:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:35:41.342504483 +0000 UTC m=+1440.153720632" watchObservedRunningTime="2025-12-05 20:35:41.345827422 +0000 UTC m=+1440.157043531" Dec 05 20:35:42 crc kubenswrapper[4904]: I1205 20:35:42.342515 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="39c8fa52-7f81-4b73-b02a-d59ecda9d8e9" containerName="ceilometer-central-agent" containerID="cri-o://d4be71109404fa0cd31a29c88e29839e750d18539842b657f83361bb4a752449" gracePeriod=30 Dec 05 20:35:42 crc kubenswrapper[4904]: I1205 20:35:42.343529 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="39c8fa52-7f81-4b73-b02a-d59ecda9d8e9" containerName="ceilometer-notification-agent" containerID="cri-o://5cf2c4a284ee12daa5d3ac20ddb0e53eda6aec8f29a93fce344abd97b71ba278" gracePeriod=30 Dec 05 20:35:42 crc kubenswrapper[4904]: I1205 20:35:42.343521 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="39c8fa52-7f81-4b73-b02a-d59ecda9d8e9" containerName="sg-core" containerID="cri-o://5afd5223589741e8a36a7b2ea4202dc48b666e31a1039b6a1bf1417d2624261f" gracePeriod=30 Dec 05 20:35:42 crc kubenswrapper[4904]: I1205 20:35:42.343666 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9","Type":"ContainerStarted","Data":"70714c080c5aaea1d101958ab1c5058c071309aa10aa9ee006af19c8b5885f32"} Dec 05 20:35:42 crc kubenswrapper[4904]: I1205 20:35:42.343736 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 20:35:42 crc kubenswrapper[4904]: I1205 20:35:42.343817 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="39c8fa52-7f81-4b73-b02a-d59ecda9d8e9" containerName="proxy-httpd" containerID="cri-o://70714c080c5aaea1d101958ab1c5058c071309aa10aa9ee006af19c8b5885f32" gracePeriod=30 Dec 05 20:35:42 crc kubenswrapper[4904]: I1205 20:35:42.369938 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.748040056 podStartE2EDuration="4.369912924s" podCreationTimestamp="2025-12-05 20:35:38 +0000 UTC" firstStartedPulling="2025-12-05 20:35:39.245887426 +0000 UTC m=+1438.057103535" lastFinishedPulling="2025-12-05 20:35:41.867760294 +0000 UTC m=+1440.678976403" observedRunningTime="2025-12-05 20:35:42.364379245 +0000 UTC m=+1441.175595384" watchObservedRunningTime="2025-12-05 20:35:42.369912924 +0000 UTC m=+1441.181129033" Dec 05 20:35:42 crc kubenswrapper[4904]: I1205 20:35:42.595198 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 05 20:35:43 crc kubenswrapper[4904]: I1205 20:35:43.360287 4904 generic.go:334] "Generic (PLEG): container finished" podID="39c8fa52-7f81-4b73-b02a-d59ecda9d8e9" containerID="70714c080c5aaea1d101958ab1c5058c071309aa10aa9ee006af19c8b5885f32" exitCode=0 Dec 05 20:35:43 crc kubenswrapper[4904]: I1205 20:35:43.360325 4904 generic.go:334] "Generic (PLEG): container finished" podID="39c8fa52-7f81-4b73-b02a-d59ecda9d8e9" containerID="5afd5223589741e8a36a7b2ea4202dc48b666e31a1039b6a1bf1417d2624261f" exitCode=2 Dec 05 20:35:43 crc kubenswrapper[4904]: I1205 20:35:43.360335 4904 
generic.go:334] "Generic (PLEG): container finished" podID="39c8fa52-7f81-4b73-b02a-d59ecda9d8e9" containerID="5cf2c4a284ee12daa5d3ac20ddb0e53eda6aec8f29a93fce344abd97b71ba278" exitCode=0 Dec 05 20:35:43 crc kubenswrapper[4904]: I1205 20:35:43.360361 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9","Type":"ContainerDied","Data":"70714c080c5aaea1d101958ab1c5058c071309aa10aa9ee006af19c8b5885f32"} Dec 05 20:35:43 crc kubenswrapper[4904]: I1205 20:35:43.360409 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9","Type":"ContainerDied","Data":"5afd5223589741e8a36a7b2ea4202dc48b666e31a1039b6a1bf1417d2624261f"} Dec 05 20:35:43 crc kubenswrapper[4904]: I1205 20:35:43.360422 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9","Type":"ContainerDied","Data":"5cf2c4a284ee12daa5d3ac20ddb0e53eda6aec8f29a93fce344abd97b71ba278"} Dec 05 20:35:44 crc kubenswrapper[4904]: I1205 20:35:44.749836 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7774fc8c79-ms4p7" Dec 05 20:35:44 crc kubenswrapper[4904]: I1205 20:35:44.823888 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58bd8c6ff9-v2kck"] Dec 05 20:35:44 crc kubenswrapper[4904]: I1205 20:35:44.824413 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58bd8c6ff9-v2kck" podUID="81d2dd5d-de80-4136-879d-faf2a6c4af16" containerName="dnsmasq-dns" containerID="cri-o://e20b1c5635af73ad3f022b2250a6d02a0f2beaac421118e9022ea3e53ce02e30" gracePeriod=10 Dec 05 20:35:44 crc kubenswrapper[4904]: I1205 20:35:44.971288 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 05 20:35:45 crc kubenswrapper[4904]: I1205 20:35:45.001431 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 05 20:35:45 crc kubenswrapper[4904]: I1205 20:35:45.381543 4904 generic.go:334] "Generic (PLEG): container finished" podID="81d2dd5d-de80-4136-879d-faf2a6c4af16" containerID="e20b1c5635af73ad3f022b2250a6d02a0f2beaac421118e9022ea3e53ce02e30" exitCode=0 Dec 05 20:35:45 crc kubenswrapper[4904]: I1205 20:35:45.381744 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58bd8c6ff9-v2kck" event={"ID":"81d2dd5d-de80-4136-879d-faf2a6c4af16","Type":"ContainerDied","Data":"e20b1c5635af73ad3f022b2250a6d02a0f2beaac421118e9022ea3e53ce02e30"} Dec 05 20:35:45 crc kubenswrapper[4904]: I1205 20:35:45.381804 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58bd8c6ff9-v2kck" event={"ID":"81d2dd5d-de80-4136-879d-faf2a6c4af16","Type":"ContainerDied","Data":"2b284fc5bb19faa329706fe95acc7d4785a7475229d6aa64c76241267caa1608"} Dec 05 20:35:45 crc kubenswrapper[4904]: I1205 20:35:45.381829 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b284fc5bb19faa329706fe95acc7d4785a7475229d6aa64c76241267caa1608" Dec 05 20:35:45 crc kubenswrapper[4904]: I1205 20:35:45.399907 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 05 20:35:45 crc kubenswrapper[4904]: I1205 20:35:45.519670 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58bd8c6ff9-v2kck" Dec 05 20:35:45 crc kubenswrapper[4904]: I1205 20:35:45.587824 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81d2dd5d-de80-4136-879d-faf2a6c4af16-config\") pod \"81d2dd5d-de80-4136-879d-faf2a6c4af16\" (UID: \"81d2dd5d-de80-4136-879d-faf2a6c4af16\") " Dec 05 20:35:45 crc kubenswrapper[4904]: I1205 20:35:45.587975 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/81d2dd5d-de80-4136-879d-faf2a6c4af16-ovsdbserver-nb\") pod \"81d2dd5d-de80-4136-879d-faf2a6c4af16\" (UID: \"81d2dd5d-de80-4136-879d-faf2a6c4af16\") " Dec 05 20:35:45 crc kubenswrapper[4904]: I1205 20:35:45.588021 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/81d2dd5d-de80-4136-879d-faf2a6c4af16-dns-swift-storage-0\") pod \"81d2dd5d-de80-4136-879d-faf2a6c4af16\" (UID: \"81d2dd5d-de80-4136-879d-faf2a6c4af16\") " Dec 05 20:35:45 crc kubenswrapper[4904]: I1205 20:35:45.588147 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81d2dd5d-de80-4136-879d-faf2a6c4af16-dns-svc\") pod \"81d2dd5d-de80-4136-879d-faf2a6c4af16\" (UID: \"81d2dd5d-de80-4136-879d-faf2a6c4af16\") " Dec 05 20:35:45 crc kubenswrapper[4904]: I1205 20:35:45.588197 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mwgz\" (UniqueName: \"kubernetes.io/projected/81d2dd5d-de80-4136-879d-faf2a6c4af16-kube-api-access-7mwgz\") pod \"81d2dd5d-de80-4136-879d-faf2a6c4af16\" (UID: \"81d2dd5d-de80-4136-879d-faf2a6c4af16\") " Dec 05 20:35:45 crc kubenswrapper[4904]: I1205 20:35:45.588254 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/81d2dd5d-de80-4136-879d-faf2a6c4af16-ovsdbserver-sb\") pod \"81d2dd5d-de80-4136-879d-faf2a6c4af16\" (UID: \"81d2dd5d-de80-4136-879d-faf2a6c4af16\") " Dec 05 20:35:45 crc kubenswrapper[4904]: I1205 20:35:45.607299 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81d2dd5d-de80-4136-879d-faf2a6c4af16-kube-api-access-7mwgz" (OuterVolumeSpecName: "kube-api-access-7mwgz") pod "81d2dd5d-de80-4136-879d-faf2a6c4af16" (UID: "81d2dd5d-de80-4136-879d-faf2a6c4af16"). InnerVolumeSpecName "kube-api-access-7mwgz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:35:45 crc kubenswrapper[4904]: I1205 20:35:45.641935 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81d2dd5d-de80-4136-879d-faf2a6c4af16-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "81d2dd5d-de80-4136-879d-faf2a6c4af16" (UID: "81d2dd5d-de80-4136-879d-faf2a6c4af16"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:35:45 crc kubenswrapper[4904]: I1205 20:35:45.648231 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81d2dd5d-de80-4136-879d-faf2a6c4af16-config" (OuterVolumeSpecName: "config") pod "81d2dd5d-de80-4136-879d-faf2a6c4af16" (UID: "81d2dd5d-de80-4136-879d-faf2a6c4af16"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:35:45 crc kubenswrapper[4904]: I1205 20:35:45.649626 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81d2dd5d-de80-4136-879d-faf2a6c4af16-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "81d2dd5d-de80-4136-879d-faf2a6c4af16" (UID: "81d2dd5d-de80-4136-879d-faf2a6c4af16"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:35:45 crc kubenswrapper[4904]: I1205 20:35:45.651823 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81d2dd5d-de80-4136-879d-faf2a6c4af16-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "81d2dd5d-de80-4136-879d-faf2a6c4af16" (UID: "81d2dd5d-de80-4136-879d-faf2a6c4af16"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:35:45 crc kubenswrapper[4904]: I1205 20:35:45.680435 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81d2dd5d-de80-4136-879d-faf2a6c4af16-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "81d2dd5d-de80-4136-879d-faf2a6c4af16" (UID: "81d2dd5d-de80-4136-879d-faf2a6c4af16"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:35:45 crc kubenswrapper[4904]: I1205 20:35:45.692934 4904 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81d2dd5d-de80-4136-879d-faf2a6c4af16-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:45 crc kubenswrapper[4904]: I1205 20:35:45.692960 4904 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/81d2dd5d-de80-4136-879d-faf2a6c4af16-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:45 crc kubenswrapper[4904]: I1205 20:35:45.692970 4904 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/81d2dd5d-de80-4136-879d-faf2a6c4af16-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:45 crc kubenswrapper[4904]: I1205 20:35:45.692979 4904 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81d2dd5d-de80-4136-879d-faf2a6c4af16-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:45 crc kubenswrapper[4904]: I1205 20:35:45.693002 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mwgz\" (UniqueName: \"kubernetes.io/projected/81d2dd5d-de80-4136-879d-faf2a6c4af16-kube-api-access-7mwgz\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:45 crc kubenswrapper[4904]: I1205 20:35:45.693012 4904 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/81d2dd5d-de80-4136-879d-faf2a6c4af16-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:45 crc kubenswrapper[4904]: I1205 20:35:45.707155 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-hpprd"] Dec 05 20:35:45 crc kubenswrapper[4904]: E1205 20:35:45.707600 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81d2dd5d-de80-4136-879d-faf2a6c4af16" containerName="dnsmasq-dns" Dec 05 20:35:45 crc kubenswrapper[4904]: I1205 20:35:45.707616 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="81d2dd5d-de80-4136-879d-faf2a6c4af16" containerName="dnsmasq-dns" Dec 05 20:35:45 crc 
kubenswrapper[4904]: E1205 20:35:45.707628 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81d2dd5d-de80-4136-879d-faf2a6c4af16" containerName="init" Dec 05 20:35:45 crc kubenswrapper[4904]: I1205 20:35:45.707634 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="81d2dd5d-de80-4136-879d-faf2a6c4af16" containerName="init" Dec 05 20:35:45 crc kubenswrapper[4904]: I1205 20:35:45.707837 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="81d2dd5d-de80-4136-879d-faf2a6c4af16" containerName="dnsmasq-dns" Dec 05 20:35:45 crc kubenswrapper[4904]: I1205 20:35:45.708512 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hpprd" Dec 05 20:35:45 crc kubenswrapper[4904]: I1205 20:35:45.727236 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 05 20:35:45 crc kubenswrapper[4904]: I1205 20:35:45.729308 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 05 20:35:45 crc kubenswrapper[4904]: I1205 20:35:45.735753 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-hpprd"] Dec 05 20:35:45 crc kubenswrapper[4904]: I1205 20:35:45.794387 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28682f8e-fc47-4366-bfdf-14c91d0d0aba-config-data\") pod \"nova-cell1-cell-mapping-hpprd\" (UID: \"28682f8e-fc47-4366-bfdf-14c91d0d0aba\") " pod="openstack/nova-cell1-cell-mapping-hpprd" Dec 05 20:35:45 crc kubenswrapper[4904]: I1205 20:35:45.796815 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rc2h8\" (UniqueName: \"kubernetes.io/projected/28682f8e-fc47-4366-bfdf-14c91d0d0aba-kube-api-access-rc2h8\") pod \"nova-cell1-cell-mapping-hpprd\" (UID: \"28682f8e-fc47-4366-bfdf-14c91d0d0aba\") " pod="openstack/nova-cell1-cell-mapping-hpprd" Dec 05 20:35:45 crc kubenswrapper[4904]: I1205 20:35:45.796956 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28682f8e-fc47-4366-bfdf-14c91d0d0aba-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-hpprd\" (UID: \"28682f8e-fc47-4366-bfdf-14c91d0d0aba\") " pod="openstack/nova-cell1-cell-mapping-hpprd" Dec 05 20:35:45 crc kubenswrapper[4904]: I1205 20:35:45.798006 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28682f8e-fc47-4366-bfdf-14c91d0d0aba-scripts\") pod \"nova-cell1-cell-mapping-hpprd\" (UID: \"28682f8e-fc47-4366-bfdf-14c91d0d0aba\") " pod="openstack/nova-cell1-cell-mapping-hpprd" Dec 05 20:35:45 crc kubenswrapper[4904]: E1205 20:35:45.857160 4904 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81d2dd5d_de80_4136_879d_faf2a6c4af16.slice/crio-2b284fc5bb19faa329706fe95acc7d4785a7475229d6aa64c76241267caa1608\": RecentStats: unable to find data in memory cache]" Dec 05 20:35:45 crc kubenswrapper[4904]: I1205 20:35:45.900946 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rc2h8\" (UniqueName: 
\"kubernetes.io/projected/28682f8e-fc47-4366-bfdf-14c91d0d0aba-kube-api-access-rc2h8\") pod \"nova-cell1-cell-mapping-hpprd\" (UID: \"28682f8e-fc47-4366-bfdf-14c91d0d0aba\") " pod="openstack/nova-cell1-cell-mapping-hpprd" Dec 05 20:35:45 crc kubenswrapper[4904]: I1205 20:35:45.901280 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28682f8e-fc47-4366-bfdf-14c91d0d0aba-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-hpprd\" (UID: \"28682f8e-fc47-4366-bfdf-14c91d0d0aba\") " pod="openstack/nova-cell1-cell-mapping-hpprd" Dec 05 20:35:45 crc kubenswrapper[4904]: I1205 20:35:45.901334 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28682f8e-fc47-4366-bfdf-14c91d0d0aba-scripts\") pod \"nova-cell1-cell-mapping-hpprd\" (UID: \"28682f8e-fc47-4366-bfdf-14c91d0d0aba\") " pod="openstack/nova-cell1-cell-mapping-hpprd" Dec 05 20:35:45 crc kubenswrapper[4904]: I1205 20:35:45.901418 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28682f8e-fc47-4366-bfdf-14c91d0d0aba-config-data\") pod \"nova-cell1-cell-mapping-hpprd\" (UID: \"28682f8e-fc47-4366-bfdf-14c91d0d0aba\") " pod="openstack/nova-cell1-cell-mapping-hpprd" Dec 05 20:35:45 crc kubenswrapper[4904]: I1205 20:35:45.904890 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28682f8e-fc47-4366-bfdf-14c91d0d0aba-config-data\") pod \"nova-cell1-cell-mapping-hpprd\" (UID: \"28682f8e-fc47-4366-bfdf-14c91d0d0aba\") " pod="openstack/nova-cell1-cell-mapping-hpprd" Dec 05 20:35:45 crc kubenswrapper[4904]: I1205 20:35:45.905445 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28682f8e-fc47-4366-bfdf-14c91d0d0aba-scripts\") pod \"nova-cell1-cell-mapping-hpprd\" (UID: \"28682f8e-fc47-4366-bfdf-14c91d0d0aba\") " pod="openstack/nova-cell1-cell-mapping-hpprd" Dec 05 20:35:45 crc kubenswrapper[4904]: I1205 20:35:45.905738 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28682f8e-fc47-4366-bfdf-14c91d0d0aba-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-hpprd\" (UID: \"28682f8e-fc47-4366-bfdf-14c91d0d0aba\") " pod="openstack/nova-cell1-cell-mapping-hpprd" Dec 05 20:35:45 crc kubenswrapper[4904]: I1205 20:35:45.918233 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rc2h8\" (UniqueName: \"kubernetes.io/projected/28682f8e-fc47-4366-bfdf-14c91d0d0aba-kube-api-access-rc2h8\") pod \"nova-cell1-cell-mapping-hpprd\" (UID: \"28682f8e-fc47-4366-bfdf-14c91d0d0aba\") " pod="openstack/nova-cell1-cell-mapping-hpprd" Dec 05 20:35:46 crc kubenswrapper[4904]: I1205 20:35:46.052440 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hpprd" Dec 05 20:35:46 crc kubenswrapper[4904]: I1205 20:35:46.389572 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58bd8c6ff9-v2kck" Dec 05 20:35:46 crc kubenswrapper[4904]: I1205 20:35:46.415994 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58bd8c6ff9-v2kck"] Dec 05 20:35:46 crc kubenswrapper[4904]: I1205 20:35:46.428817 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58bd8c6ff9-v2kck"] Dec 05 20:35:46 crc kubenswrapper[4904]: I1205 20:35:46.539278 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-hpprd"] Dec 05 20:35:46 crc kubenswrapper[4904]: W1205 20:35:46.541503 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28682f8e_fc47_4366_bfdf_14c91d0d0aba.slice/crio-793145650f649c5acc036bfdf28893d481361c0afbfd5c534733945069cf5fa8 WatchSource:0}: Error finding container 793145650f649c5acc036bfdf28893d481361c0afbfd5c534733945069cf5fa8: Status 404 returned error can't find the container with id 793145650f649c5acc036bfdf28893d481361c0afbfd5c534733945069cf5fa8 Dec 05 20:35:47 crc kubenswrapper[4904]: I1205 20:35:47.217066 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 20:35:47 crc kubenswrapper[4904]: I1205 20:35:47.360133 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-config-data\") pod \"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9\" (UID: \"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9\") " Dec 05 20:35:47 crc kubenswrapper[4904]: I1205 20:35:47.360236 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-combined-ca-bundle\") pod \"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9\" (UID: \"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9\") " Dec 05 20:35:47 crc kubenswrapper[4904]: I1205 20:35:47.360300 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-sg-core-conf-yaml\") pod \"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9\" (UID: \"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9\") " Dec 05 20:35:47 crc kubenswrapper[4904]: I1205 20:35:47.361093 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-ceilometer-tls-certs\") pod \"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9\" (UID: \"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9\") " Dec 05 20:35:47 crc kubenswrapper[4904]: I1205 20:35:47.361170 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-scripts\") pod \"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9\" (UID: \"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9\") " Dec 05 20:35:47 crc kubenswrapper[4904]: I1205 20:35:47.361204 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-run-httpd\") pod \"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9\" (UID: \"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9\") " Dec 05 20:35:47 crc kubenswrapper[4904]: I1205 20:35:47.361270 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-log-httpd\") pod \"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9\" (UID: \"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9\") " Dec 05 20:35:47 crc kubenswrapper[4904]: I1205 20:35:47.361298 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9qgn\" (UniqueName: \"kubernetes.io/projected/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-kube-api-access-h9qgn\") pod \"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9\" (UID: \"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9\") " Dec 05 20:35:47 crc kubenswrapper[4904]: I1205 20:35:47.361874 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "39c8fa52-7f81-4b73-b02a-d59ecda9d8e9" (UID: "39c8fa52-7f81-4b73-b02a-d59ecda9d8e9"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:35:47 crc kubenswrapper[4904]: I1205 20:35:47.362088 4904 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:47 crc kubenswrapper[4904]: I1205 20:35:47.362430 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "39c8fa52-7f81-4b73-b02a-d59ecda9d8e9" (UID: "39c8fa52-7f81-4b73-b02a-d59ecda9d8e9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:35:47 crc kubenswrapper[4904]: I1205 20:35:47.403690 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hpprd" event={"ID":"28682f8e-fc47-4366-bfdf-14c91d0d0aba","Type":"ContainerStarted","Data":"d84bae1da041062aef96a208ed4ed936e8c3cfbeeb01e5611c8be7d5c3c7dbe9"} Dec 05 20:35:47 crc kubenswrapper[4904]: I1205 20:35:47.403747 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hpprd" event={"ID":"28682f8e-fc47-4366-bfdf-14c91d0d0aba","Type":"ContainerStarted","Data":"793145650f649c5acc036bfdf28893d481361c0afbfd5c534733945069cf5fa8"} Dec 05 20:35:47 crc kubenswrapper[4904]: I1205 20:35:47.408662 4904 generic.go:334] "Generic (PLEG): container finished" podID="39c8fa52-7f81-4b73-b02a-d59ecda9d8e9" containerID="d4be71109404fa0cd31a29c88e29839e750d18539842b657f83361bb4a752449" exitCode=0 Dec 05 20:35:47 crc kubenswrapper[4904]: I1205 20:35:47.408705 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9","Type":"ContainerDied","Data":"d4be71109404fa0cd31a29c88e29839e750d18539842b657f83361bb4a752449"} Dec 05 20:35:47 crc kubenswrapper[4904]: I1205 20:35:47.408731 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9","Type":"ContainerDied","Data":"8f56a0ff7678cdf2735a89c41fa7159126010756ea988b2eb3c59ccab8820956"} Dec 05 20:35:47 crc kubenswrapper[4904]: I1205 20:35:47.408734 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 20:35:47 crc kubenswrapper[4904]: I1205 20:35:47.408762 4904 scope.go:117] "RemoveContainer" containerID="70714c080c5aaea1d101958ab1c5058c071309aa10aa9ee006af19c8b5885f32" Dec 05 20:35:47 crc kubenswrapper[4904]: I1205 20:35:47.425355 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-hpprd" podStartSLOduration=2.425333047 podStartE2EDuration="2.425333047s" podCreationTimestamp="2025-12-05 20:35:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:35:47.417678697 +0000 UTC m=+1446.228894816" watchObservedRunningTime="2025-12-05 20:35:47.425333047 +0000 UTC m=+1446.236549156" Dec 05 20:35:47 crc kubenswrapper[4904]: I1205 20:35:47.463958 4904 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:47 crc kubenswrapper[4904]: I1205 20:35:47.481400 4904 scope.go:117] "RemoveContainer" containerID="5afd5223589741e8a36a7b2ea4202dc48b666e31a1039b6a1bf1417d2624261f" Dec 05 20:35:47 crc kubenswrapper[4904]: I1205 20:35:47.503160 4904 scope.go:117] "RemoveContainer" containerID="5cf2c4a284ee12daa5d3ac20ddb0e53eda6aec8f29a93fce344abd97b71ba278" Dec 05 20:35:47 crc kubenswrapper[4904]: I1205 20:35:47.522667 4904 scope.go:117] "RemoveContainer" containerID="d4be71109404fa0cd31a29c88e29839e750d18539842b657f83361bb4a752449" Dec 05 20:35:47 crc kubenswrapper[4904]: I1205 20:35:47.539427 4904 scope.go:117] "RemoveContainer" containerID="70714c080c5aaea1d101958ab1c5058c071309aa10aa9ee006af19c8b5885f32" Dec 05 20:35:47 crc kubenswrapper[4904]: E1205 20:35:47.546569 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70714c080c5aaea1d101958ab1c5058c071309aa10aa9ee006af19c8b5885f32\": container with ID starting with 70714c080c5aaea1d101958ab1c5058c071309aa10aa9ee006af19c8b5885f32 not found: ID does not exist" containerID="70714c080c5aaea1d101958ab1c5058c071309aa10aa9ee006af19c8b5885f32" Dec 05 20:35:47 crc kubenswrapper[4904]: I1205 20:35:47.546625 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70714c080c5aaea1d101958ab1c5058c071309aa10aa9ee006af19c8b5885f32"} err="failed to get container status \"70714c080c5aaea1d101958ab1c5058c071309aa10aa9ee006af19c8b5885f32\": rpc error: code = NotFound desc = could not find container \"70714c080c5aaea1d101958ab1c5058c071309aa10aa9ee006af19c8b5885f32\": container with ID starting with 70714c080c5aaea1d101958ab1c5058c071309aa10aa9ee006af19c8b5885f32 not found: ID does not exist" Dec 05 20:35:47 crc kubenswrapper[4904]: I1205 20:35:47.546659 4904 scope.go:117] "RemoveContainer" containerID="5afd5223589741e8a36a7b2ea4202dc48b666e31a1039b6a1bf1417d2624261f" Dec 05 20:35:47 crc kubenswrapper[4904]: E1205 20:35:47.546984 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5afd5223589741e8a36a7b2ea4202dc48b666e31a1039b6a1bf1417d2624261f\": container with ID starting with 5afd5223589741e8a36a7b2ea4202dc48b666e31a1039b6a1bf1417d2624261f not found: ID does not exist" containerID="5afd5223589741e8a36a7b2ea4202dc48b666e31a1039b6a1bf1417d2624261f" Dec 05 20:35:47 crc kubenswrapper[4904]: I1205 20:35:47.547014 4904 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5afd5223589741e8a36a7b2ea4202dc48b666e31a1039b6a1bf1417d2624261f"} err="failed to get container status \"5afd5223589741e8a36a7b2ea4202dc48b666e31a1039b6a1bf1417d2624261f\": rpc error: code = NotFound desc = could not find container \"5afd5223589741e8a36a7b2ea4202dc48b666e31a1039b6a1bf1417d2624261f\": container with ID starting with 5afd5223589741e8a36a7b2ea4202dc48b666e31a1039b6a1bf1417d2624261f not found: ID does not exist" Dec 05 20:35:47 crc kubenswrapper[4904]: I1205 20:35:47.547034 4904 scope.go:117] "RemoveContainer" containerID="5cf2c4a284ee12daa5d3ac20ddb0e53eda6aec8f29a93fce344abd97b71ba278" Dec 05 20:35:47 crc kubenswrapper[4904]: E1205 20:35:47.547287 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cf2c4a284ee12daa5d3ac20ddb0e53eda6aec8f29a93fce344abd97b71ba278\": container with ID starting with 5cf2c4a284ee12daa5d3ac20ddb0e53eda6aec8f29a93fce344abd97b71ba278 not found: ID does not exist" containerID="5cf2c4a284ee12daa5d3ac20ddb0e53eda6aec8f29a93fce344abd97b71ba278" Dec 05 20:35:47 crc kubenswrapper[4904]: I1205 20:35:47.547325 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cf2c4a284ee12daa5d3ac20ddb0e53eda6aec8f29a93fce344abd97b71ba278"} err="failed to get container status \"5cf2c4a284ee12daa5d3ac20ddb0e53eda6aec8f29a93fce344abd97b71ba278\": rpc error: code = NotFound desc = could not find container \"5cf2c4a284ee12daa5d3ac20ddb0e53eda6aec8f29a93fce344abd97b71ba278\": container with ID starting with 5cf2c4a284ee12daa5d3ac20ddb0e53eda6aec8f29a93fce344abd97b71ba278 not found: ID does not exist" Dec 05 20:35:47 crc kubenswrapper[4904]: I1205 20:35:47.547351 4904 scope.go:117] "RemoveContainer" containerID="d4be71109404fa0cd31a29c88e29839e750d18539842b657f83361bb4a752449" Dec 05 20:35:47 crc kubenswrapper[4904]: E1205 20:35:47.547715 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4be71109404fa0cd31a29c88e29839e750d18539842b657f83361bb4a752449\": container with ID starting with d4be71109404fa0cd31a29c88e29839e750d18539842b657f83361bb4a752449 not found: ID does not exist" containerID="d4be71109404fa0cd31a29c88e29839e750d18539842b657f83361bb4a752449" Dec 05 20:35:47 crc kubenswrapper[4904]: I1205 20:35:47.547743 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4be71109404fa0cd31a29c88e29839e750d18539842b657f83361bb4a752449"} err="failed to get container status \"d4be71109404fa0cd31a29c88e29839e750d18539842b657f83361bb4a752449\": rpc error: code = NotFound desc = could not find container \"d4be71109404fa0cd31a29c88e29839e750d18539842b657f83361bb4a752449\": container with ID starting with d4be71109404fa0cd31a29c88e29839e750d18539842b657f83361bb4a752449 not found: ID does not exist" Dec 05 20:35:47 crc kubenswrapper[4904]: I1205 20:35:47.693641 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81d2dd5d-de80-4136-879d-faf2a6c4af16" path="/var/lib/kubelet/pods/81d2dd5d-de80-4136-879d-faf2a6c4af16/volumes" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.096010 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-scripts" (OuterVolumeSpecName: "scripts") pod "39c8fa52-7f81-4b73-b02a-d59ecda9d8e9" (UID: 
"39c8fa52-7f81-4b73-b02a-d59ecda9d8e9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.096034 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-kube-api-access-h9qgn" (OuterVolumeSpecName: "kube-api-access-h9qgn") pod "39c8fa52-7f81-4b73-b02a-d59ecda9d8e9" (UID: "39c8fa52-7f81-4b73-b02a-d59ecda9d8e9"). InnerVolumeSpecName "kube-api-access-h9qgn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.146445 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "39c8fa52-7f81-4b73-b02a-d59ecda9d8e9" (UID: "39c8fa52-7f81-4b73-b02a-d59ecda9d8e9"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.176795 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "39c8fa52-7f81-4b73-b02a-d59ecda9d8e9" (UID: "39c8fa52-7f81-4b73-b02a-d59ecda9d8e9"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.184175 4904 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.184205 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9qgn\" (UniqueName: \"kubernetes.io/projected/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-kube-api-access-h9qgn\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.184216 4904 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.184224 4904 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.185054 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "39c8fa52-7f81-4b73-b02a-d59ecda9d8e9" (UID: "39c8fa52-7f81-4b73-b02a-d59ecda9d8e9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.229745 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-config-data" (OuterVolumeSpecName: "config-data") pod "39c8fa52-7f81-4b73-b02a-d59ecda9d8e9" (UID: "39c8fa52-7f81-4b73-b02a-d59ecda9d8e9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.287810 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.287849 4904 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.380427 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.404236 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.412800 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:35:48 crc kubenswrapper[4904]: E1205 20:35:48.413327 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39c8fa52-7f81-4b73-b02a-d59ecda9d8e9" containerName="proxy-httpd" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.413349 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="39c8fa52-7f81-4b73-b02a-d59ecda9d8e9" containerName="proxy-httpd" Dec 05 20:35:48 crc kubenswrapper[4904]: E1205 20:35:48.413364 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39c8fa52-7f81-4b73-b02a-d59ecda9d8e9" containerName="ceilometer-central-agent" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.413370 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="39c8fa52-7f81-4b73-b02a-d59ecda9d8e9" containerName="ceilometer-central-agent" Dec 05 20:35:48 crc kubenswrapper[4904]: E1205 20:35:48.413400 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39c8fa52-7f81-4b73-b02a-d59ecda9d8e9" containerName="ceilometer-notification-agent" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.413406 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="39c8fa52-7f81-4b73-b02a-d59ecda9d8e9" containerName="ceilometer-notification-agent" Dec 05 20:35:48 crc kubenswrapper[4904]: E1205 20:35:48.413423 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39c8fa52-7f81-4b73-b02a-d59ecda9d8e9" containerName="sg-core" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.413428 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="39c8fa52-7f81-4b73-b02a-d59ecda9d8e9" containerName="sg-core" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.413615 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="39c8fa52-7f81-4b73-b02a-d59ecda9d8e9" containerName="ceilometer-notification-agent" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.413636 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="39c8fa52-7f81-4b73-b02a-d59ecda9d8e9" containerName="ceilometer-central-agent" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.413647 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="39c8fa52-7f81-4b73-b02a-d59ecda9d8e9" containerName="proxy-httpd" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.413657 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="39c8fa52-7f81-4b73-b02a-d59ecda9d8e9" containerName="sg-core" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.415873 4904 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.418085 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.418915 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.419741 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.422695 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.490977 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6fc4fd93-36e9-448b-88ec-b4c4227c941c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6fc4fd93-36e9-448b-88ec-b4c4227c941c\") " pod="openstack/ceilometer-0" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.491050 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fc4fd93-36e9-448b-88ec-b4c4227c941c-config-data\") pod \"ceilometer-0\" (UID: \"6fc4fd93-36e9-448b-88ec-b4c4227c941c\") " pod="openstack/ceilometer-0" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.491577 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fc4fd93-36e9-448b-88ec-b4c4227c941c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6fc4fd93-36e9-448b-88ec-b4c4227c941c\") " pod="openstack/ceilometer-0" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.491720 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fc4fd93-36e9-448b-88ec-b4c4227c941c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6fc4fd93-36e9-448b-88ec-b4c4227c941c\") " pod="openstack/ceilometer-0" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.491804 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6fc4fd93-36e9-448b-88ec-b4c4227c941c-log-httpd\") pod \"ceilometer-0\" (UID: \"6fc4fd93-36e9-448b-88ec-b4c4227c941c\") " pod="openstack/ceilometer-0" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.491939 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fc4fd93-36e9-448b-88ec-b4c4227c941c-scripts\") pod \"ceilometer-0\" (UID: \"6fc4fd93-36e9-448b-88ec-b4c4227c941c\") " pod="openstack/ceilometer-0" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.492037 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6fc4fd93-36e9-448b-88ec-b4c4227c941c-run-httpd\") pod \"ceilometer-0\" (UID: \"6fc4fd93-36e9-448b-88ec-b4c4227c941c\") " pod="openstack/ceilometer-0" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.492177 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktzz4\" (UniqueName: 
\"kubernetes.io/projected/6fc4fd93-36e9-448b-88ec-b4c4227c941c-kube-api-access-ktzz4\") pod \"ceilometer-0\" (UID: \"6fc4fd93-36e9-448b-88ec-b4c4227c941c\") " pod="openstack/ceilometer-0" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.594502 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fc4fd93-36e9-448b-88ec-b4c4227c941c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6fc4fd93-36e9-448b-88ec-b4c4227c941c\") " pod="openstack/ceilometer-0" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.594558 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6fc4fd93-36e9-448b-88ec-b4c4227c941c-log-httpd\") pod \"ceilometer-0\" (UID: \"6fc4fd93-36e9-448b-88ec-b4c4227c941c\") " pod="openstack/ceilometer-0" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.594588 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fc4fd93-36e9-448b-88ec-b4c4227c941c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6fc4fd93-36e9-448b-88ec-b4c4227c941c\") " pod="openstack/ceilometer-0" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.594621 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fc4fd93-36e9-448b-88ec-b4c4227c941c-scripts\") pod \"ceilometer-0\" (UID: \"6fc4fd93-36e9-448b-88ec-b4c4227c941c\") " pod="openstack/ceilometer-0" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.594637 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6fc4fd93-36e9-448b-88ec-b4c4227c941c-run-httpd\") pod \"ceilometer-0\" (UID: \"6fc4fd93-36e9-448b-88ec-b4c4227c941c\") " pod="openstack/ceilometer-0" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.594671 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktzz4\" (UniqueName: \"kubernetes.io/projected/6fc4fd93-36e9-448b-88ec-b4c4227c941c-kube-api-access-ktzz4\") pod \"ceilometer-0\" (UID: \"6fc4fd93-36e9-448b-88ec-b4c4227c941c\") " pod="openstack/ceilometer-0" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.594700 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6fc4fd93-36e9-448b-88ec-b4c4227c941c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6fc4fd93-36e9-448b-88ec-b4c4227c941c\") " pod="openstack/ceilometer-0" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.594748 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fc4fd93-36e9-448b-88ec-b4c4227c941c-config-data\") pod \"ceilometer-0\" (UID: \"6fc4fd93-36e9-448b-88ec-b4c4227c941c\") " pod="openstack/ceilometer-0" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.596402 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6fc4fd93-36e9-448b-88ec-b4c4227c941c-log-httpd\") pod \"ceilometer-0\" (UID: \"6fc4fd93-36e9-448b-88ec-b4c4227c941c\") " pod="openstack/ceilometer-0" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.596736 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/6fc4fd93-36e9-448b-88ec-b4c4227c941c-run-httpd\") pod \"ceilometer-0\" (UID: \"6fc4fd93-36e9-448b-88ec-b4c4227c941c\") " pod="openstack/ceilometer-0" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.599212 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fc4fd93-36e9-448b-88ec-b4c4227c941c-config-data\") pod \"ceilometer-0\" (UID: \"6fc4fd93-36e9-448b-88ec-b4c4227c941c\") " pod="openstack/ceilometer-0" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.599465 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fc4fd93-36e9-448b-88ec-b4c4227c941c-scripts\") pod \"ceilometer-0\" (UID: \"6fc4fd93-36e9-448b-88ec-b4c4227c941c\") " pod="openstack/ceilometer-0" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.600033 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6fc4fd93-36e9-448b-88ec-b4c4227c941c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6fc4fd93-36e9-448b-88ec-b4c4227c941c\") " pod="openstack/ceilometer-0" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.600387 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fc4fd93-36e9-448b-88ec-b4c4227c941c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6fc4fd93-36e9-448b-88ec-b4c4227c941c\") " pod="openstack/ceilometer-0" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.602033 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fc4fd93-36e9-448b-88ec-b4c4227c941c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6fc4fd93-36e9-448b-88ec-b4c4227c941c\") " pod="openstack/ceilometer-0" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.616574 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktzz4\" (UniqueName: \"kubernetes.io/projected/6fc4fd93-36e9-448b-88ec-b4c4227c941c-kube-api-access-ktzz4\") pod \"ceilometer-0\" (UID: \"6fc4fd93-36e9-448b-88ec-b4c4227c941c\") " pod="openstack/ceilometer-0" Dec 05 20:35:48 crc kubenswrapper[4904]: I1205 20:35:48.748686 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 20:35:49 crc kubenswrapper[4904]: I1205 20:35:49.215408 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:35:49 crc kubenswrapper[4904]: W1205 20:35:49.217460 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6fc4fd93_36e9_448b_88ec_b4c4227c941c.slice/crio-5fbaf68bdda6a16efdcfe6cfc4574c48271f304b8ee3a385bcaff1609f0b7676 WatchSource:0}: Error finding container 5fbaf68bdda6a16efdcfe6cfc4574c48271f304b8ee3a385bcaff1609f0b7676: Status 404 returned error can't find the container with id 5fbaf68bdda6a16efdcfe6cfc4574c48271f304b8ee3a385bcaff1609f0b7676 Dec 05 20:35:49 crc kubenswrapper[4904]: I1205 20:35:49.442919 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6fc4fd93-36e9-448b-88ec-b4c4227c941c","Type":"ContainerStarted","Data":"5fbaf68bdda6a16efdcfe6cfc4574c48271f304b8ee3a385bcaff1609f0b7676"} Dec 05 20:35:49 crc kubenswrapper[4904]: I1205 20:35:49.696484 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39c8fa52-7f81-4b73-b02a-d59ecda9d8e9" path="/var/lib/kubelet/pods/39c8fa52-7f81-4b73-b02a-d59ecda9d8e9/volumes" Dec 05 20:35:49 crc kubenswrapper[4904]: I1205 20:35:49.768338 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 20:35:49 crc kubenswrapper[4904]: I1205 20:35:49.768414 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 20:35:50 crc kubenswrapper[4904]: I1205 20:35:50.463195 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6fc4fd93-36e9-448b-88ec-b4c4227c941c","Type":"ContainerStarted","Data":"815a9cf336eeaefcfec1aaac823bf60dfc8daf30d07a7ad394b6a64fef366998"} Dec 05 20:35:50 crc kubenswrapper[4904]: I1205 20:35:50.463490 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6fc4fd93-36e9-448b-88ec-b4c4227c941c","Type":"ContainerStarted","Data":"c5306ed47987b0198e93ff0c5143f16e38d92ef975eef9a93445ec568fb12782"} Dec 05 20:35:50 crc kubenswrapper[4904]: I1205 20:35:50.775635 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0047c990-60cf-4de3-9fe2-6b585c11a8fd" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.220:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 20:35:50 crc kubenswrapper[4904]: I1205 20:35:50.775678 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0047c990-60cf-4de3-9fe2-6b585c11a8fd" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.220:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 20:35:51 crc kubenswrapper[4904]: I1205 20:35:51.476170 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6fc4fd93-36e9-448b-88ec-b4c4227c941c","Type":"ContainerStarted","Data":"f44db35d46628ad57da2609fc82ae2be35c4b8b5954a6d273166629c2de81150"} Dec 05 20:35:52 crc kubenswrapper[4904]: I1205 20:35:52.495908 4904 generic.go:334] "Generic (PLEG): container finished" podID="28682f8e-fc47-4366-bfdf-14c91d0d0aba" containerID="d84bae1da041062aef96a208ed4ed936e8c3cfbeeb01e5611c8be7d5c3c7dbe9" exitCode=0 Dec 05 20:35:52 crc kubenswrapper[4904]: I1205 
20:35:52.496047 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hpprd" event={"ID":"28682f8e-fc47-4366-bfdf-14c91d0d0aba","Type":"ContainerDied","Data":"d84bae1da041062aef96a208ed4ed936e8c3cfbeeb01e5611c8be7d5c3c7dbe9"} Dec 05 20:35:52 crc kubenswrapper[4904]: I1205 20:35:52.518407 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6fc4fd93-36e9-448b-88ec-b4c4227c941c","Type":"ContainerStarted","Data":"f19e89815d97c379250b52934b881c3c62e9e2c71b393681f7e2b4be1a6a7025"} Dec 05 20:35:52 crc kubenswrapper[4904]: I1205 20:35:52.518790 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 20:35:52 crc kubenswrapper[4904]: I1205 20:35:52.573215 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.244260082 podStartE2EDuration="4.57319565s" podCreationTimestamp="2025-12-05 20:35:48 +0000 UTC" firstStartedPulling="2025-12-05 20:35:49.220164946 +0000 UTC m=+1448.031381065" lastFinishedPulling="2025-12-05 20:35:51.549100524 +0000 UTC m=+1450.360316633" observedRunningTime="2025-12-05 20:35:52.565694325 +0000 UTC m=+1451.376910524" watchObservedRunningTime="2025-12-05 20:35:52.57319565 +0000 UTC m=+1451.384411759" Dec 05 20:35:54 crc kubenswrapper[4904]: I1205 20:35:54.416133 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hpprd" Dec 05 20:35:54 crc kubenswrapper[4904]: I1205 20:35:54.542213 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hpprd" event={"ID":"28682f8e-fc47-4366-bfdf-14c91d0d0aba","Type":"ContainerDied","Data":"793145650f649c5acc036bfdf28893d481361c0afbfd5c534733945069cf5fa8"} Dec 05 20:35:54 crc kubenswrapper[4904]: I1205 20:35:54.542248 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="793145650f649c5acc036bfdf28893d481361c0afbfd5c534733945069cf5fa8" Dec 05 20:35:54 crc kubenswrapper[4904]: I1205 20:35:54.542291 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hpprd" Dec 05 20:35:54 crc kubenswrapper[4904]: I1205 20:35:54.588297 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28682f8e-fc47-4366-bfdf-14c91d0d0aba-config-data\") pod \"28682f8e-fc47-4366-bfdf-14c91d0d0aba\" (UID: \"28682f8e-fc47-4366-bfdf-14c91d0d0aba\") " Dec 05 20:35:54 crc kubenswrapper[4904]: I1205 20:35:54.588555 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rc2h8\" (UniqueName: \"kubernetes.io/projected/28682f8e-fc47-4366-bfdf-14c91d0d0aba-kube-api-access-rc2h8\") pod \"28682f8e-fc47-4366-bfdf-14c91d0d0aba\" (UID: \"28682f8e-fc47-4366-bfdf-14c91d0d0aba\") " Dec 05 20:35:54 crc kubenswrapper[4904]: I1205 20:35:54.588595 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28682f8e-fc47-4366-bfdf-14c91d0d0aba-combined-ca-bundle\") pod \"28682f8e-fc47-4366-bfdf-14c91d0d0aba\" (UID: \"28682f8e-fc47-4366-bfdf-14c91d0d0aba\") " Dec 05 20:35:54 crc kubenswrapper[4904]: I1205 20:35:54.588617 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28682f8e-fc47-4366-bfdf-14c91d0d0aba-scripts\") pod \"28682f8e-fc47-4366-bfdf-14c91d0d0aba\" (UID: \"28682f8e-fc47-4366-bfdf-14c91d0d0aba\") " Dec 05 20:35:54 crc kubenswrapper[4904]: I1205 20:35:54.593995 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28682f8e-fc47-4366-bfdf-14c91d0d0aba-scripts" (OuterVolumeSpecName: "scripts") pod "28682f8e-fc47-4366-bfdf-14c91d0d0aba" (UID: "28682f8e-fc47-4366-bfdf-14c91d0d0aba"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:54 crc kubenswrapper[4904]: I1205 20:35:54.607615 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28682f8e-fc47-4366-bfdf-14c91d0d0aba-kube-api-access-rc2h8" (OuterVolumeSpecName: "kube-api-access-rc2h8") pod "28682f8e-fc47-4366-bfdf-14c91d0d0aba" (UID: "28682f8e-fc47-4366-bfdf-14c91d0d0aba"). InnerVolumeSpecName "kube-api-access-rc2h8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:35:54 crc kubenswrapper[4904]: I1205 20:35:54.619280 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28682f8e-fc47-4366-bfdf-14c91d0d0aba-config-data" (OuterVolumeSpecName: "config-data") pod "28682f8e-fc47-4366-bfdf-14c91d0d0aba" (UID: "28682f8e-fc47-4366-bfdf-14c91d0d0aba"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:54 crc kubenswrapper[4904]: I1205 20:35:54.620443 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28682f8e-fc47-4366-bfdf-14c91d0d0aba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "28682f8e-fc47-4366-bfdf-14c91d0d0aba" (UID: "28682f8e-fc47-4366-bfdf-14c91d0d0aba"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:54 crc kubenswrapper[4904]: I1205 20:35:54.706117 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rc2h8\" (UniqueName: \"kubernetes.io/projected/28682f8e-fc47-4366-bfdf-14c91d0d0aba-kube-api-access-rc2h8\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:54 crc kubenswrapper[4904]: I1205 20:35:54.706180 4904 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28682f8e-fc47-4366-bfdf-14c91d0d0aba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:54 crc kubenswrapper[4904]: I1205 20:35:54.706192 4904 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28682f8e-fc47-4366-bfdf-14c91d0d0aba-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:54 crc kubenswrapper[4904]: I1205 20:35:54.706203 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28682f8e-fc47-4366-bfdf-14c91d0d0aba-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:54 crc kubenswrapper[4904]: I1205 20:35:54.720098 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 20:35:54 crc kubenswrapper[4904]: I1205 20:35:54.720345 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0047c990-60cf-4de3-9fe2-6b585c11a8fd" containerName="nova-api-log" containerID="cri-o://1fa5a105379034524408ff154eecfa631b71bff9f5f1a304031966639ce24b94" gracePeriod=30 Dec 05 20:35:54 crc kubenswrapper[4904]: I1205 20:35:54.720829 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0047c990-60cf-4de3-9fe2-6b585c11a8fd" containerName="nova-api-api" containerID="cri-o://816cc2c454acb008ef7821bf8531efc900eace41cb6fb0a4c6c07e82ab83a092" gracePeriod=30 Dec 05 20:35:54 crc kubenswrapper[4904]: I1205 20:35:54.745447 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 20:35:54 crc kubenswrapper[4904]: I1205 20:35:54.745643 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="1af764d9-33ef-4e99-8757-db2cd6705952" containerName="nova-scheduler-scheduler" containerID="cri-o://852547a7522a9fa284e99346464609049e45d8e535361e86d81c7129a9031d93" gracePeriod=30 Dec 05 20:35:54 crc kubenswrapper[4904]: I1205 20:35:54.769115 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 20:35:54 crc kubenswrapper[4904]: I1205 20:35:54.770582 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6c381934-7d5f-429f-bf7c-294241e82ae3" containerName="nova-metadata-log" containerID="cri-o://dc7bbc8220797ffa8faf28635768dc41fddf0aac4d2a1edd6a35fa1247ad6296" gracePeriod=30 Dec 05 20:35:54 crc kubenswrapper[4904]: I1205 20:35:54.770656 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6c381934-7d5f-429f-bf7c-294241e82ae3" containerName="nova-metadata-metadata" containerID="cri-o://a7ee02175d8dd9b1ac2fd3b614fb0f3602487d1388e3a6b141f8a670b12429cd" gracePeriod=30 Dec 05 20:35:55 crc kubenswrapper[4904]: I1205 20:35:55.578711 4904 generic.go:334] "Generic (PLEG): container finished" podID="6c381934-7d5f-429f-bf7c-294241e82ae3" 
containerID="dc7bbc8220797ffa8faf28635768dc41fddf0aac4d2a1edd6a35fa1247ad6296" exitCode=143 Dec 05 20:35:55 crc kubenswrapper[4904]: I1205 20:35:55.579098 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6c381934-7d5f-429f-bf7c-294241e82ae3","Type":"ContainerDied","Data":"dc7bbc8220797ffa8faf28635768dc41fddf0aac4d2a1edd6a35fa1247ad6296"} Dec 05 20:35:55 crc kubenswrapper[4904]: I1205 20:35:55.581047 4904 generic.go:334] "Generic (PLEG): container finished" podID="0047c990-60cf-4de3-9fe2-6b585c11a8fd" containerID="1fa5a105379034524408ff154eecfa631b71bff9f5f1a304031966639ce24b94" exitCode=143 Dec 05 20:35:55 crc kubenswrapper[4904]: I1205 20:35:55.581100 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0047c990-60cf-4de3-9fe2-6b585c11a8fd","Type":"ContainerDied","Data":"1fa5a105379034524408ff154eecfa631b71bff9f5f1a304031966639ce24b94"} Dec 05 20:35:56 crc kubenswrapper[4904]: I1205 20:35:56.488889 4904 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="6c381934-7d5f-429f-bf7c-294241e82ae3" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.212:8775/\": dial tcp 10.217.0.212:8775: connect: connection refused" Dec 05 20:35:56 crc kubenswrapper[4904]: I1205 20:35:56.488910 4904 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="6c381934-7d5f-429f-bf7c-294241e82ae3" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.212:8775/\": dial tcp 10.217.0.212:8775: connect: connection refused" Dec 05 20:35:56 crc kubenswrapper[4904]: I1205 20:35:56.595302 4904 generic.go:334] "Generic (PLEG): container finished" podID="0047c990-60cf-4de3-9fe2-6b585c11a8fd" containerID="816cc2c454acb008ef7821bf8531efc900eace41cb6fb0a4c6c07e82ab83a092" exitCode=0 Dec 05 20:35:56 crc kubenswrapper[4904]: I1205 20:35:56.595437 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0047c990-60cf-4de3-9fe2-6b585c11a8fd","Type":"ContainerDied","Data":"816cc2c454acb008ef7821bf8531efc900eace41cb6fb0a4c6c07e82ab83a092"} Dec 05 20:35:56 crc kubenswrapper[4904]: I1205 20:35:56.603361 4904 generic.go:334] "Generic (PLEG): container finished" podID="6c381934-7d5f-429f-bf7c-294241e82ae3" containerID="a7ee02175d8dd9b1ac2fd3b614fb0f3602487d1388e3a6b141f8a670b12429cd" exitCode=0 Dec 05 20:35:56 crc kubenswrapper[4904]: I1205 20:35:56.603414 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6c381934-7d5f-429f-bf7c-294241e82ae3","Type":"ContainerDied","Data":"a7ee02175d8dd9b1ac2fd3b614fb0f3602487d1388e3a6b141f8a670b12429cd"} Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.051136 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.058372 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.189247 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c381934-7d5f-429f-bf7c-294241e82ae3-config-data\") pod \"6c381934-7d5f-429f-bf7c-294241e82ae3\" (UID: \"6c381934-7d5f-429f-bf7c-294241e82ae3\") " Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.189286 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c381934-7d5f-429f-bf7c-294241e82ae3-logs\") pod \"6c381934-7d5f-429f-bf7c-294241e82ae3\" (UID: \"6c381934-7d5f-429f-bf7c-294241e82ae3\") " Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.189304 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0047c990-60cf-4de3-9fe2-6b585c11a8fd-combined-ca-bundle\") pod \"0047c990-60cf-4de3-9fe2-6b585c11a8fd\" (UID: \"0047c990-60cf-4de3-9fe2-6b585c11a8fd\") " Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.189321 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nq96w\" (UniqueName: \"kubernetes.io/projected/6c381934-7d5f-429f-bf7c-294241e82ae3-kube-api-access-nq96w\") pod \"6c381934-7d5f-429f-bf7c-294241e82ae3\" (UID: \"6c381934-7d5f-429f-bf7c-294241e82ae3\") " Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.189403 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0047c990-60cf-4de3-9fe2-6b585c11a8fd-config-data\") pod \"0047c990-60cf-4de3-9fe2-6b585c11a8fd\" (UID: \"0047c990-60cf-4de3-9fe2-6b585c11a8fd\") " Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.189426 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0047c990-60cf-4de3-9fe2-6b585c11a8fd-internal-tls-certs\") pod \"0047c990-60cf-4de3-9fe2-6b585c11a8fd\" (UID: \"0047c990-60cf-4de3-9fe2-6b585c11a8fd\") " Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.189442 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c381934-7d5f-429f-bf7c-294241e82ae3-combined-ca-bundle\") pod \"6c381934-7d5f-429f-bf7c-294241e82ae3\" (UID: \"6c381934-7d5f-429f-bf7c-294241e82ae3\") " Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.189499 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c381934-7d5f-429f-bf7c-294241e82ae3-nova-metadata-tls-certs\") pod \"6c381934-7d5f-429f-bf7c-294241e82ae3\" (UID: \"6c381934-7d5f-429f-bf7c-294241e82ae3\") " Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.189613 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0047c990-60cf-4de3-9fe2-6b585c11a8fd-public-tls-certs\") pod \"0047c990-60cf-4de3-9fe2-6b585c11a8fd\" (UID: \"0047c990-60cf-4de3-9fe2-6b585c11a8fd\") " Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.189648 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0047c990-60cf-4de3-9fe2-6b585c11a8fd-logs\") pod \"0047c990-60cf-4de3-9fe2-6b585c11a8fd\" (UID: 
\"0047c990-60cf-4de3-9fe2-6b585c11a8fd\") " Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.189672 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znz47\" (UniqueName: \"kubernetes.io/projected/0047c990-60cf-4de3-9fe2-6b585c11a8fd-kube-api-access-znz47\") pod \"0047c990-60cf-4de3-9fe2-6b585c11a8fd\" (UID: \"0047c990-60cf-4de3-9fe2-6b585c11a8fd\") " Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.190279 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c381934-7d5f-429f-bf7c-294241e82ae3-logs" (OuterVolumeSpecName: "logs") pod "6c381934-7d5f-429f-bf7c-294241e82ae3" (UID: "6c381934-7d5f-429f-bf7c-294241e82ae3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.190732 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0047c990-60cf-4de3-9fe2-6b585c11a8fd-logs" (OuterVolumeSpecName: "logs") pod "0047c990-60cf-4de3-9fe2-6b585c11a8fd" (UID: "0047c990-60cf-4de3-9fe2-6b585c11a8fd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.191044 4904 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0047c990-60cf-4de3-9fe2-6b585c11a8fd-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.191195 4904 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c381934-7d5f-429f-bf7c-294241e82ae3-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.195020 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c381934-7d5f-429f-bf7c-294241e82ae3-kube-api-access-nq96w" (OuterVolumeSpecName: "kube-api-access-nq96w") pod "6c381934-7d5f-429f-bf7c-294241e82ae3" (UID: "6c381934-7d5f-429f-bf7c-294241e82ae3"). InnerVolumeSpecName "kube-api-access-nq96w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.201745 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0047c990-60cf-4de3-9fe2-6b585c11a8fd-kube-api-access-znz47" (OuterVolumeSpecName: "kube-api-access-znz47") pod "0047c990-60cf-4de3-9fe2-6b585c11a8fd" (UID: "0047c990-60cf-4de3-9fe2-6b585c11a8fd"). InnerVolumeSpecName "kube-api-access-znz47". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.234483 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0047c990-60cf-4de3-9fe2-6b585c11a8fd-config-data" (OuterVolumeSpecName: "config-data") pod "0047c990-60cf-4de3-9fe2-6b585c11a8fd" (UID: "0047c990-60cf-4de3-9fe2-6b585c11a8fd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.241628 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c381934-7d5f-429f-bf7c-294241e82ae3-config-data" (OuterVolumeSpecName: "config-data") pod "6c381934-7d5f-429f-bf7c-294241e82ae3" (UID: "6c381934-7d5f-429f-bf7c-294241e82ae3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.261330 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0047c990-60cf-4de3-9fe2-6b585c11a8fd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0047c990-60cf-4de3-9fe2-6b585c11a8fd" (UID: "0047c990-60cf-4de3-9fe2-6b585c11a8fd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.261354 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c381934-7d5f-429f-bf7c-294241e82ae3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c381934-7d5f-429f-bf7c-294241e82ae3" (UID: "6c381934-7d5f-429f-bf7c-294241e82ae3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.277331 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c381934-7d5f-429f-bf7c-294241e82ae3-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "6c381934-7d5f-429f-bf7c-294241e82ae3" (UID: "6c381934-7d5f-429f-bf7c-294241e82ae3"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.293625 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znz47\" (UniqueName: \"kubernetes.io/projected/0047c990-60cf-4de3-9fe2-6b585c11a8fd-kube-api-access-znz47\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.293658 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c381934-7d5f-429f-bf7c-294241e82ae3-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.293669 4904 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0047c990-60cf-4de3-9fe2-6b585c11a8fd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.293677 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nq96w\" (UniqueName: \"kubernetes.io/projected/6c381934-7d5f-429f-bf7c-294241e82ae3-kube-api-access-nq96w\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.293686 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0047c990-60cf-4de3-9fe2-6b585c11a8fd-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.293694 4904 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c381934-7d5f-429f-bf7c-294241e82ae3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.293702 4904 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c381934-7d5f-429f-bf7c-294241e82ae3-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.294008 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0047c990-60cf-4de3-9fe2-6b585c11a8fd-internal-tls-certs" 
(OuterVolumeSpecName: "internal-tls-certs") pod "0047c990-60cf-4de3-9fe2-6b585c11a8fd" (UID: "0047c990-60cf-4de3-9fe2-6b585c11a8fd"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.303490 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0047c990-60cf-4de3-9fe2-6b585c11a8fd-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0047c990-60cf-4de3-9fe2-6b585c11a8fd" (UID: "0047c990-60cf-4de3-9fe2-6b585c11a8fd"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.396105 4904 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0047c990-60cf-4de3-9fe2-6b585c11a8fd-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.396165 4904 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0047c990-60cf-4de3-9fe2-6b585c11a8fd-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.590327 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.622847 4904 generic.go:334] "Generic (PLEG): container finished" podID="1af764d9-33ef-4e99-8757-db2cd6705952" containerID="852547a7522a9fa284e99346464609049e45d8e535361e86d81c7129a9031d93" exitCode=0 Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.622943 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1af764d9-33ef-4e99-8757-db2cd6705952","Type":"ContainerDied","Data":"852547a7522a9fa284e99346464609049e45d8e535361e86d81c7129a9031d93"} Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.622971 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1af764d9-33ef-4e99-8757-db2cd6705952","Type":"ContainerDied","Data":"8e8b8e4c7794ef5b786faf3306292b7ab6a98a63c4e79ba57713b98be20a1ce9"} Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.622988 4904 scope.go:117] "RemoveContainer" containerID="852547a7522a9fa284e99346464609049e45d8e535361e86d81c7129a9031d93" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.623194 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.631595 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6c381934-7d5f-429f-bf7c-294241e82ae3","Type":"ContainerDied","Data":"0f88a2107e3ea8b328114c45cea6b873b83da9985fb2de07f1d507ea4cf38f89"} Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.631632 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.643472 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0047c990-60cf-4de3-9fe2-6b585c11a8fd","Type":"ContainerDied","Data":"a09d794d0ffc49e6f1c85a42e9ada0460978515778170593535995eb62a1be6e"} Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.643591 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.687453 4904 scope.go:117] "RemoveContainer" containerID="852547a7522a9fa284e99346464609049e45d8e535361e86d81c7129a9031d93" Dec 05 20:35:57 crc kubenswrapper[4904]: E1205 20:35:57.687822 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"852547a7522a9fa284e99346464609049e45d8e535361e86d81c7129a9031d93\": container with ID starting with 852547a7522a9fa284e99346464609049e45d8e535361e86d81c7129a9031d93 not found: ID does not exist" containerID="852547a7522a9fa284e99346464609049e45d8e535361e86d81c7129a9031d93" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.687868 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"852547a7522a9fa284e99346464609049e45d8e535361e86d81c7129a9031d93"} err="failed to get container status \"852547a7522a9fa284e99346464609049e45d8e535361e86d81c7129a9031d93\": rpc error: code = NotFound desc = could not find container \"852547a7522a9fa284e99346464609049e45d8e535361e86d81c7129a9031d93\": container with ID starting with 852547a7522a9fa284e99346464609049e45d8e535361e86d81c7129a9031d93 not found: ID does not exist" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.687900 4904 scope.go:117] "RemoveContainer" containerID="a7ee02175d8dd9b1ac2fd3b614fb0f3602487d1388e3a6b141f8a670b12429cd" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.707868 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpkhb\" (UniqueName: \"kubernetes.io/projected/1af764d9-33ef-4e99-8757-db2cd6705952-kube-api-access-zpkhb\") pod \"1af764d9-33ef-4e99-8757-db2cd6705952\" (UID: \"1af764d9-33ef-4e99-8757-db2cd6705952\") " Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.714587 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1af764d9-33ef-4e99-8757-db2cd6705952-kube-api-access-zpkhb" (OuterVolumeSpecName: "kube-api-access-zpkhb") pod "1af764d9-33ef-4e99-8757-db2cd6705952" (UID: "1af764d9-33ef-4e99-8757-db2cd6705952"). InnerVolumeSpecName "kube-api-access-zpkhb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.719262 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1af764d9-33ef-4e99-8757-db2cd6705952-combined-ca-bundle\") pod \"1af764d9-33ef-4e99-8757-db2cd6705952\" (UID: \"1af764d9-33ef-4e99-8757-db2cd6705952\") " Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.719379 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1af764d9-33ef-4e99-8757-db2cd6705952-config-data\") pod \"1af764d9-33ef-4e99-8757-db2cd6705952\" (UID: \"1af764d9-33ef-4e99-8757-db2cd6705952\") " Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.720011 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.720067 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.722170 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpkhb\" (UniqueName: \"kubernetes.io/projected/1af764d9-33ef-4e99-8757-db2cd6705952-kube-api-access-zpkhb\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.763401 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1af764d9-33ef-4e99-8757-db2cd6705952-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1af764d9-33ef-4e99-8757-db2cd6705952" (UID: "1af764d9-33ef-4e99-8757-db2cd6705952"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.777584 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.788587 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1af764d9-33ef-4e99-8757-db2cd6705952-config-data" (OuterVolumeSpecName: "config-data") pod "1af764d9-33ef-4e99-8757-db2cd6705952" (UID: "1af764d9-33ef-4e99-8757-db2cd6705952"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.788952 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.799110 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 05 20:35:57 crc kubenswrapper[4904]: E1205 20:35:57.799646 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0047c990-60cf-4de3-9fe2-6b585c11a8fd" containerName="nova-api-log" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.799666 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="0047c990-60cf-4de3-9fe2-6b585c11a8fd" containerName="nova-api-log" Dec 05 20:35:57 crc kubenswrapper[4904]: E1205 20:35:57.799693 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c381934-7d5f-429f-bf7c-294241e82ae3" containerName="nova-metadata-log" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.799700 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c381934-7d5f-429f-bf7c-294241e82ae3" containerName="nova-metadata-log" Dec 05 20:35:57 crc kubenswrapper[4904]: E1205 20:35:57.799709 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1af764d9-33ef-4e99-8757-db2cd6705952" containerName="nova-scheduler-scheduler" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.799715 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="1af764d9-33ef-4e99-8757-db2cd6705952" containerName="nova-scheduler-scheduler" Dec 05 20:35:57 crc kubenswrapper[4904]: E1205 20:35:57.799726 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c381934-7d5f-429f-bf7c-294241e82ae3" containerName="nova-metadata-metadata" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.799732 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c381934-7d5f-429f-bf7c-294241e82ae3" containerName="nova-metadata-metadata" Dec 05 20:35:57 crc kubenswrapper[4904]: E1205 20:35:57.799756 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0047c990-60cf-4de3-9fe2-6b585c11a8fd" containerName="nova-api-api" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.799762 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="0047c990-60cf-4de3-9fe2-6b585c11a8fd" containerName="nova-api-api" Dec 05 20:35:57 crc kubenswrapper[4904]: E1205 20:35:57.799772 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28682f8e-fc47-4366-bfdf-14c91d0d0aba" containerName="nova-manage" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.799779 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="28682f8e-fc47-4366-bfdf-14c91d0d0aba" containerName="nova-manage" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.799959 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="0047c990-60cf-4de3-9fe2-6b585c11a8fd" containerName="nova-api-log" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.799969 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="0047c990-60cf-4de3-9fe2-6b585c11a8fd" containerName="nova-api-api" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.799983 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="28682f8e-fc47-4366-bfdf-14c91d0d0aba" containerName="nova-manage" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.799996 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c381934-7d5f-429f-bf7c-294241e82ae3" containerName="nova-metadata-log" Dec 
05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.800007 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c381934-7d5f-429f-bf7c-294241e82ae3" containerName="nova-metadata-metadata" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.800026 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="1af764d9-33ef-4e99-8757-db2cd6705952" containerName="nova-scheduler-scheduler" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.801245 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.803557 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.805274 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.810410 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.813563 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.817708 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.822313 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.820273 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.822655 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.824670 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1af764d9-33ef-4e99-8757-db2cd6705952-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.824711 4904 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1af764d9-33ef-4e99-8757-db2cd6705952-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.835278 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.889283 4904 scope.go:117] "RemoveContainer" containerID="dc7bbc8220797ffa8faf28635768dc41fddf0aac4d2a1edd6a35fa1247ad6296" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.910757 4904 scope.go:117] "RemoveContainer" containerID="816cc2c454acb008ef7821bf8531efc900eace41cb6fb0a4c6c07e82ab83a092" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.926137 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eeb7ce1d-7b77-4a2f-9e87-99569f95995d-logs\") pod \"nova-metadata-0\" (UID: \"eeb7ce1d-7b77-4a2f-9e87-99569f95995d\") " pod="openstack/nova-metadata-0" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.926213 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be6a8cc9-1fdc-4dea-a3b3-e0a13cd882b0-logs\") pod 
\"nova-api-0\" (UID: \"be6a8cc9-1fdc-4dea-a3b3-e0a13cd882b0\") " pod="openstack/nova-api-0" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.926232 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be6a8cc9-1fdc-4dea-a3b3-e0a13cd882b0-public-tls-certs\") pod \"nova-api-0\" (UID: \"be6a8cc9-1fdc-4dea-a3b3-e0a13cd882b0\") " pod="openstack/nova-api-0" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.926249 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be6a8cc9-1fdc-4dea-a3b3-e0a13cd882b0-config-data\") pod \"nova-api-0\" (UID: \"be6a8cc9-1fdc-4dea-a3b3-e0a13cd882b0\") " pod="openstack/nova-api-0" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.926266 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eeb7ce1d-7b77-4a2f-9e87-99569f95995d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"eeb7ce1d-7b77-4a2f-9e87-99569f95995d\") " pod="openstack/nova-metadata-0" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.926331 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeb7ce1d-7b77-4a2f-9e87-99569f95995d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"eeb7ce1d-7b77-4a2f-9e87-99569f95995d\") " pod="openstack/nova-metadata-0" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.926538 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8v5q\" (UniqueName: \"kubernetes.io/projected/eeb7ce1d-7b77-4a2f-9e87-99569f95995d-kube-api-access-z8v5q\") pod \"nova-metadata-0\" (UID: \"eeb7ce1d-7b77-4a2f-9e87-99569f95995d\") " pod="openstack/nova-metadata-0" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.926642 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eeb7ce1d-7b77-4a2f-9e87-99569f95995d-config-data\") pod \"nova-metadata-0\" (UID: \"eeb7ce1d-7b77-4a2f-9e87-99569f95995d\") " pod="openstack/nova-metadata-0" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.926687 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7zwp\" (UniqueName: \"kubernetes.io/projected/be6a8cc9-1fdc-4dea-a3b3-e0a13cd882b0-kube-api-access-w7zwp\") pod \"nova-api-0\" (UID: \"be6a8cc9-1fdc-4dea-a3b3-e0a13cd882b0\") " pod="openstack/nova-api-0" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.926836 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be6a8cc9-1fdc-4dea-a3b3-e0a13cd882b0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"be6a8cc9-1fdc-4dea-a3b3-e0a13cd882b0\") " pod="openstack/nova-api-0" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.926968 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be6a8cc9-1fdc-4dea-a3b3-e0a13cd882b0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"be6a8cc9-1fdc-4dea-a3b3-e0a13cd882b0\") " pod="openstack/nova-api-0" Dec 05 20:35:57 crc 
kubenswrapper[4904]: I1205 20:35:57.937565 4904 scope.go:117] "RemoveContainer" containerID="1fa5a105379034524408ff154eecfa631b71bff9f5f1a304031966639ce24b94" Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.966337 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.975466 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 20:35:57 crc kubenswrapper[4904]: I1205 20:35:57.989622 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 20:35:58 crc kubenswrapper[4904]: I1205 20:35:58.011047 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 20:35:58 crc kubenswrapper[4904]: I1205 20:35:58.014713 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 05 20:35:58 crc kubenswrapper[4904]: I1205 20:35:58.030384 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eeb7ce1d-7b77-4a2f-9e87-99569f95995d-logs\") pod \"nova-metadata-0\" (UID: \"eeb7ce1d-7b77-4a2f-9e87-99569f95995d\") " pod="openstack/nova-metadata-0" Dec 05 20:35:58 crc kubenswrapper[4904]: I1205 20:35:58.033366 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be6a8cc9-1fdc-4dea-a3b3-e0a13cd882b0-logs\") pod \"nova-api-0\" (UID: \"be6a8cc9-1fdc-4dea-a3b3-e0a13cd882b0\") " pod="openstack/nova-api-0" Dec 05 20:35:58 crc kubenswrapper[4904]: I1205 20:35:58.033417 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be6a8cc9-1fdc-4dea-a3b3-e0a13cd882b0-public-tls-certs\") pod \"nova-api-0\" (UID: \"be6a8cc9-1fdc-4dea-a3b3-e0a13cd882b0\") " pod="openstack/nova-api-0" Dec 05 20:35:58 crc kubenswrapper[4904]: I1205 20:35:58.033446 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eeb7ce1d-7b77-4a2f-9e87-99569f95995d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"eeb7ce1d-7b77-4a2f-9e87-99569f95995d\") " pod="openstack/nova-metadata-0" Dec 05 20:35:58 crc kubenswrapper[4904]: I1205 20:35:58.033479 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be6a8cc9-1fdc-4dea-a3b3-e0a13cd882b0-config-data\") pod \"nova-api-0\" (UID: \"be6a8cc9-1fdc-4dea-a3b3-e0a13cd882b0\") " pod="openstack/nova-api-0" Dec 05 20:35:58 crc kubenswrapper[4904]: I1205 20:35:58.033509 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeb7ce1d-7b77-4a2f-9e87-99569f95995d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"eeb7ce1d-7b77-4a2f-9e87-99569f95995d\") " pod="openstack/nova-metadata-0" Dec 05 20:35:58 crc kubenswrapper[4904]: I1205 20:35:58.033568 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8v5q\" (UniqueName: \"kubernetes.io/projected/eeb7ce1d-7b77-4a2f-9e87-99569f95995d-kube-api-access-z8v5q\") pod \"nova-metadata-0\" (UID: \"eeb7ce1d-7b77-4a2f-9e87-99569f95995d\") " pod="openstack/nova-metadata-0" Dec 05 20:35:58 crc kubenswrapper[4904]: I1205 20:35:58.033669 4904 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eeb7ce1d-7b77-4a2f-9e87-99569f95995d-config-data\") pod \"nova-metadata-0\" (UID: \"eeb7ce1d-7b77-4a2f-9e87-99569f95995d\") " pod="openstack/nova-metadata-0" Dec 05 20:35:58 crc kubenswrapper[4904]: I1205 20:35:58.033712 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7zwp\" (UniqueName: \"kubernetes.io/projected/be6a8cc9-1fdc-4dea-a3b3-e0a13cd882b0-kube-api-access-w7zwp\") pod \"nova-api-0\" (UID: \"be6a8cc9-1fdc-4dea-a3b3-e0a13cd882b0\") " pod="openstack/nova-api-0" Dec 05 20:35:58 crc kubenswrapper[4904]: I1205 20:35:58.033829 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be6a8cc9-1fdc-4dea-a3b3-e0a13cd882b0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"be6a8cc9-1fdc-4dea-a3b3-e0a13cd882b0\") " pod="openstack/nova-api-0" Dec 05 20:35:58 crc kubenswrapper[4904]: I1205 20:35:58.033883 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be6a8cc9-1fdc-4dea-a3b3-e0a13cd882b0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"be6a8cc9-1fdc-4dea-a3b3-e0a13cd882b0\") " pod="openstack/nova-api-0" Dec 05 20:35:58 crc kubenswrapper[4904]: I1205 20:35:58.034739 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be6a8cc9-1fdc-4dea-a3b3-e0a13cd882b0-logs\") pod \"nova-api-0\" (UID: \"be6a8cc9-1fdc-4dea-a3b3-e0a13cd882b0\") " pod="openstack/nova-api-0" Dec 05 20:35:58 crc kubenswrapper[4904]: I1205 20:35:58.036356 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eeb7ce1d-7b77-4a2f-9e87-99569f95995d-logs\") pod \"nova-metadata-0\" (UID: \"eeb7ce1d-7b77-4a2f-9e87-99569f95995d\") " pod="openstack/nova-metadata-0" Dec 05 20:35:58 crc kubenswrapper[4904]: I1205 20:35:58.042614 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be6a8cc9-1fdc-4dea-a3b3-e0a13cd882b0-config-data\") pod \"nova-api-0\" (UID: \"be6a8cc9-1fdc-4dea-a3b3-e0a13cd882b0\") " pod="openstack/nova-api-0" Dec 05 20:35:58 crc kubenswrapper[4904]: I1205 20:35:58.043001 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be6a8cc9-1fdc-4dea-a3b3-e0a13cd882b0-public-tls-certs\") pod \"nova-api-0\" (UID: \"be6a8cc9-1fdc-4dea-a3b3-e0a13cd882b0\") " pod="openstack/nova-api-0" Dec 05 20:35:58 crc kubenswrapper[4904]: I1205 20:35:58.043025 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eeb7ce1d-7b77-4a2f-9e87-99569f95995d-config-data\") pod \"nova-metadata-0\" (UID: \"eeb7ce1d-7b77-4a2f-9e87-99569f95995d\") " pod="openstack/nova-metadata-0" Dec 05 20:35:58 crc kubenswrapper[4904]: I1205 20:35:58.043081 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be6a8cc9-1fdc-4dea-a3b3-e0a13cd882b0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"be6a8cc9-1fdc-4dea-a3b3-e0a13cd882b0\") " pod="openstack/nova-api-0" Dec 05 20:35:58 crc kubenswrapper[4904]: I1205 20:35:58.050658 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/eeb7ce1d-7b77-4a2f-9e87-99569f95995d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"eeb7ce1d-7b77-4a2f-9e87-99569f95995d\") " pod="openstack/nova-metadata-0" Dec 05 20:35:58 crc kubenswrapper[4904]: I1205 20:35:58.050740 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be6a8cc9-1fdc-4dea-a3b3-e0a13cd882b0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"be6a8cc9-1fdc-4dea-a3b3-e0a13cd882b0\") " pod="openstack/nova-api-0" Dec 05 20:35:58 crc kubenswrapper[4904]: I1205 20:35:58.051709 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeb7ce1d-7b77-4a2f-9e87-99569f95995d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"eeb7ce1d-7b77-4a2f-9e87-99569f95995d\") " pod="openstack/nova-metadata-0" Dec 05 20:35:58 crc kubenswrapper[4904]: I1205 20:35:58.053044 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7zwp\" (UniqueName: \"kubernetes.io/projected/be6a8cc9-1fdc-4dea-a3b3-e0a13cd882b0-kube-api-access-w7zwp\") pod \"nova-api-0\" (UID: \"be6a8cc9-1fdc-4dea-a3b3-e0a13cd882b0\") " pod="openstack/nova-api-0" Dec 05 20:35:58 crc kubenswrapper[4904]: I1205 20:35:58.053642 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8v5q\" (UniqueName: \"kubernetes.io/projected/eeb7ce1d-7b77-4a2f-9e87-99569f95995d-kube-api-access-z8v5q\") pod \"nova-metadata-0\" (UID: \"eeb7ce1d-7b77-4a2f-9e87-99569f95995d\") " pod="openstack/nova-metadata-0" Dec 05 20:35:58 crc kubenswrapper[4904]: I1205 20:35:58.062710 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 20:35:58 crc kubenswrapper[4904]: I1205 20:35:58.139153 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ce87dae-66a4-4d60-bd13-3ac6a44abeed-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5ce87dae-66a4-4d60-bd13-3ac6a44abeed\") " pod="openstack/nova-scheduler-0" Dec 05 20:35:58 crc kubenswrapper[4904]: I1205 20:35:58.139663 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkxs7\" (UniqueName: \"kubernetes.io/projected/5ce87dae-66a4-4d60-bd13-3ac6a44abeed-kube-api-access-vkxs7\") pod \"nova-scheduler-0\" (UID: \"5ce87dae-66a4-4d60-bd13-3ac6a44abeed\") " pod="openstack/nova-scheduler-0" Dec 05 20:35:58 crc kubenswrapper[4904]: I1205 20:35:58.139760 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ce87dae-66a4-4d60-bd13-3ac6a44abeed-config-data\") pod \"nova-scheduler-0\" (UID: \"5ce87dae-66a4-4d60-bd13-3ac6a44abeed\") " pod="openstack/nova-scheduler-0" Dec 05 20:35:58 crc kubenswrapper[4904]: I1205 20:35:58.195091 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 20:35:58 crc kubenswrapper[4904]: I1205 20:35:58.206286 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 20:35:58 crc kubenswrapper[4904]: I1205 20:35:58.241991 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ce87dae-66a4-4d60-bd13-3ac6a44abeed-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5ce87dae-66a4-4d60-bd13-3ac6a44abeed\") " pod="openstack/nova-scheduler-0" Dec 05 20:35:58 crc kubenswrapper[4904]: I1205 20:35:58.242100 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkxs7\" (UniqueName: \"kubernetes.io/projected/5ce87dae-66a4-4d60-bd13-3ac6a44abeed-kube-api-access-vkxs7\") pod \"nova-scheduler-0\" (UID: \"5ce87dae-66a4-4d60-bd13-3ac6a44abeed\") " pod="openstack/nova-scheduler-0" Dec 05 20:35:58 crc kubenswrapper[4904]: I1205 20:35:58.242184 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ce87dae-66a4-4d60-bd13-3ac6a44abeed-config-data\") pod \"nova-scheduler-0\" (UID: \"5ce87dae-66a4-4d60-bd13-3ac6a44abeed\") " pod="openstack/nova-scheduler-0" Dec 05 20:35:58 crc kubenswrapper[4904]: I1205 20:35:58.246831 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ce87dae-66a4-4d60-bd13-3ac6a44abeed-config-data\") pod \"nova-scheduler-0\" (UID: \"5ce87dae-66a4-4d60-bd13-3ac6a44abeed\") " pod="openstack/nova-scheduler-0" Dec 05 20:35:58 crc kubenswrapper[4904]: I1205 20:35:58.247164 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ce87dae-66a4-4d60-bd13-3ac6a44abeed-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5ce87dae-66a4-4d60-bd13-3ac6a44abeed\") " pod="openstack/nova-scheduler-0" Dec 05 20:35:58 crc kubenswrapper[4904]: I1205 20:35:58.265893 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkxs7\" (UniqueName: \"kubernetes.io/projected/5ce87dae-66a4-4d60-bd13-3ac6a44abeed-kube-api-access-vkxs7\") pod \"nova-scheduler-0\" (UID: \"5ce87dae-66a4-4d60-bd13-3ac6a44abeed\") " pod="openstack/nova-scheduler-0" Dec 05 20:35:58 crc kubenswrapper[4904]: I1205 20:35:58.328981 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 20:35:58 crc kubenswrapper[4904]: I1205 20:35:58.677879 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 20:35:58 crc kubenswrapper[4904]: I1205 20:35:58.813677 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 20:35:58 crc kubenswrapper[4904]: I1205 20:35:58.961923 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 20:35:58 crc kubenswrapper[4904]: W1205 20:35:58.980617 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ce87dae_66a4_4d60_bd13_3ac6a44abeed.slice/crio-1b63b34cd2a0d8d00a2dfb6d4d579893a89d2e05945e4b5e4ab207646be95950 WatchSource:0}: Error finding container 1b63b34cd2a0d8d00a2dfb6d4d579893a89d2e05945e4b5e4ab207646be95950: Status 404 returned error can't find the container with id 1b63b34cd2a0d8d00a2dfb6d4d579893a89d2e05945e4b5e4ab207646be95950 Dec 05 20:35:59 crc kubenswrapper[4904]: I1205 20:35:59.695383 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0047c990-60cf-4de3-9fe2-6b585c11a8fd" path="/var/lib/kubelet/pods/0047c990-60cf-4de3-9fe2-6b585c11a8fd/volumes" Dec 05 20:35:59 crc kubenswrapper[4904]: I1205 20:35:59.696507 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1af764d9-33ef-4e99-8757-db2cd6705952" path="/var/lib/kubelet/pods/1af764d9-33ef-4e99-8757-db2cd6705952/volumes" Dec 05 20:35:59 crc kubenswrapper[4904]: I1205 20:35:59.697232 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c381934-7d5f-429f-bf7c-294241e82ae3" path="/var/lib/kubelet/pods/6c381934-7d5f-429f-bf7c-294241e82ae3/volumes" Dec 05 20:35:59 crc kubenswrapper[4904]: I1205 20:35:59.699003 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5ce87dae-66a4-4d60-bd13-3ac6a44abeed","Type":"ContainerStarted","Data":"6c4f24fddd2d51be8c2059aaf096f6c8b125101dfa2cb80f3b6c842cb83ef1f7"} Dec 05 20:35:59 crc kubenswrapper[4904]: I1205 20:35:59.699045 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5ce87dae-66a4-4d60-bd13-3ac6a44abeed","Type":"ContainerStarted","Data":"1b63b34cd2a0d8d00a2dfb6d4d579893a89d2e05945e4b5e4ab207646be95950"} Dec 05 20:35:59 crc kubenswrapper[4904]: I1205 20:35:59.700398 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eeb7ce1d-7b77-4a2f-9e87-99569f95995d","Type":"ContainerStarted","Data":"e3dd4f0386b07b4d75c94069283391defb85e14c133f98971ba766a5dc008ca3"} Dec 05 20:35:59 crc kubenswrapper[4904]: I1205 20:35:59.700421 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eeb7ce1d-7b77-4a2f-9e87-99569f95995d","Type":"ContainerStarted","Data":"835d43afe2c7839981a837d8e082886af7ac4b7b8104116914b958e15e4850ab"} Dec 05 20:35:59 crc kubenswrapper[4904]: I1205 20:35:59.700430 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eeb7ce1d-7b77-4a2f-9e87-99569f95995d","Type":"ContainerStarted","Data":"3a902bb1bb82795447c0fbf2f7244941621116ffd631545fbfc1662465b48f77"} Dec 05 20:35:59 crc kubenswrapper[4904]: I1205 20:35:59.706418 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"be6a8cc9-1fdc-4dea-a3b3-e0a13cd882b0","Type":"ContainerStarted","Data":"f21767e4fc5628309800a500188d7f8efe09f436726cde9e90e32b61bb7d31e2"} Dec 05 20:35:59 crc kubenswrapper[4904]: I1205 20:35:59.706445 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"be6a8cc9-1fdc-4dea-a3b3-e0a13cd882b0","Type":"ContainerStarted","Data":"814be2b5398ca6cec948add9f7719b5b732be208bcee07094193ca996fdf019e"} Dec 05 20:35:59 crc kubenswrapper[4904]: I1205 20:35:59.706456 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"be6a8cc9-1fdc-4dea-a3b3-e0a13cd882b0","Type":"ContainerStarted","Data":"c0c9645e56e8e82866d9d6ae2826ac45a0eb8644bd158af86108478a23710f6c"} Dec 05 20:35:59 crc kubenswrapper[4904]: I1205 20:35:59.722370 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.722354872 podStartE2EDuration="2.722354872s" podCreationTimestamp="2025-12-05 20:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:35:59.714031444 +0000 UTC m=+1458.525247553" watchObservedRunningTime="2025-12-05 20:35:59.722354872 +0000 UTC m=+1458.533570971" Dec 05 20:35:59 crc kubenswrapper[4904]: I1205 20:35:59.739985 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.739964615 podStartE2EDuration="2.739964615s" podCreationTimestamp="2025-12-05 20:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:35:59.731589256 +0000 UTC m=+1458.542805375" watchObservedRunningTime="2025-12-05 20:35:59.739964615 +0000 UTC m=+1458.551180744" Dec 05 20:35:59 crc kubenswrapper[4904]: I1205 20:35:59.769486 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.769463643 podStartE2EDuration="2.769463643s" podCreationTimestamp="2025-12-05 20:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:35:59.755430319 +0000 UTC m=+1458.566646488" watchObservedRunningTime="2025-12-05 20:35:59.769463643 +0000 UTC m=+1458.580679752" Dec 05 20:35:59 crc kubenswrapper[4904]: I1205 20:35:59.958485 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:35:59 crc kubenswrapper[4904]: I1205 20:35:59.958567 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:36:00 crc kubenswrapper[4904]: I1205 20:36:00.679739 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qm5tb"] Dec 05 20:36:00 crc kubenswrapper[4904]: I1205 20:36:00.682113 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qm5tb" Dec 05 20:36:00 crc kubenswrapper[4904]: I1205 20:36:00.694389 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qm5tb"] Dec 05 20:36:00 crc kubenswrapper[4904]: I1205 20:36:00.698627 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41242cb0-80f9-452d-8655-e838eed6f2cf-utilities\") pod \"redhat-operators-qm5tb\" (UID: \"41242cb0-80f9-452d-8655-e838eed6f2cf\") " pod="openshift-marketplace/redhat-operators-qm5tb" Dec 05 20:36:00 crc kubenswrapper[4904]: I1205 20:36:00.699561 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41242cb0-80f9-452d-8655-e838eed6f2cf-catalog-content\") pod \"redhat-operators-qm5tb\" (UID: \"41242cb0-80f9-452d-8655-e838eed6f2cf\") " pod="openshift-marketplace/redhat-operators-qm5tb" Dec 05 20:36:00 crc kubenswrapper[4904]: I1205 20:36:00.700003 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mxqb\" (UniqueName: \"kubernetes.io/projected/41242cb0-80f9-452d-8655-e838eed6f2cf-kube-api-access-5mxqb\") pod \"redhat-operators-qm5tb\" (UID: \"41242cb0-80f9-452d-8655-e838eed6f2cf\") " pod="openshift-marketplace/redhat-operators-qm5tb" Dec 05 20:36:00 crc kubenswrapper[4904]: I1205 20:36:00.802038 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mxqb\" (UniqueName: \"kubernetes.io/projected/41242cb0-80f9-452d-8655-e838eed6f2cf-kube-api-access-5mxqb\") pod \"redhat-operators-qm5tb\" (UID: \"41242cb0-80f9-452d-8655-e838eed6f2cf\") " pod="openshift-marketplace/redhat-operators-qm5tb" Dec 05 20:36:00 crc kubenswrapper[4904]: I1205 20:36:00.802187 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41242cb0-80f9-452d-8655-e838eed6f2cf-utilities\") pod \"redhat-operators-qm5tb\" (UID: \"41242cb0-80f9-452d-8655-e838eed6f2cf\") " pod="openshift-marketplace/redhat-operators-qm5tb" Dec 05 20:36:00 crc kubenswrapper[4904]: I1205 20:36:00.802221 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41242cb0-80f9-452d-8655-e838eed6f2cf-catalog-content\") pod \"redhat-operators-qm5tb\" (UID: \"41242cb0-80f9-452d-8655-e838eed6f2cf\") " pod="openshift-marketplace/redhat-operators-qm5tb" Dec 05 20:36:00 crc kubenswrapper[4904]: I1205 20:36:00.802723 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41242cb0-80f9-452d-8655-e838eed6f2cf-catalog-content\") pod \"redhat-operators-qm5tb\" (UID: \"41242cb0-80f9-452d-8655-e838eed6f2cf\") " pod="openshift-marketplace/redhat-operators-qm5tb" Dec 05 20:36:00 crc kubenswrapper[4904]: I1205 20:36:00.802753 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41242cb0-80f9-452d-8655-e838eed6f2cf-utilities\") pod \"redhat-operators-qm5tb\" (UID: \"41242cb0-80f9-452d-8655-e838eed6f2cf\") " pod="openshift-marketplace/redhat-operators-qm5tb" Dec 05 20:36:00 crc kubenswrapper[4904]: I1205 20:36:00.823248 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-5mxqb\" (UniqueName: \"kubernetes.io/projected/41242cb0-80f9-452d-8655-e838eed6f2cf-kube-api-access-5mxqb\") pod \"redhat-operators-qm5tb\" (UID: \"41242cb0-80f9-452d-8655-e838eed6f2cf\") " pod="openshift-marketplace/redhat-operators-qm5tb" Dec 05 20:36:01 crc kubenswrapper[4904]: I1205 20:36:01.011712 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qm5tb" Dec 05 20:36:01 crc kubenswrapper[4904]: I1205 20:36:01.536521 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qm5tb"] Dec 05 20:36:01 crc kubenswrapper[4904]: W1205 20:36:01.538209 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41242cb0_80f9_452d_8655_e838eed6f2cf.slice/crio-6a5f6217b4e8f4781adee1f85a3e5b426357ba9bcf7723533ca71697c9618d16 WatchSource:0}: Error finding container 6a5f6217b4e8f4781adee1f85a3e5b426357ba9bcf7723533ca71697c9618d16: Status 404 returned error can't find the container with id 6a5f6217b4e8f4781adee1f85a3e5b426357ba9bcf7723533ca71697c9618d16 Dec 05 20:36:01 crc kubenswrapper[4904]: I1205 20:36:01.743449 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qm5tb" event={"ID":"41242cb0-80f9-452d-8655-e838eed6f2cf","Type":"ContainerStarted","Data":"6a5f6217b4e8f4781adee1f85a3e5b426357ba9bcf7723533ca71697c9618d16"} Dec 05 20:36:02 crc kubenswrapper[4904]: I1205 20:36:02.755117 4904 generic.go:334] "Generic (PLEG): container finished" podID="41242cb0-80f9-452d-8655-e838eed6f2cf" containerID="9aae8652aa09fe2b2c3ed9c8e53a6375674fbfe91df263762a7031a0b99ae7ca" exitCode=0 Dec 05 20:36:02 crc kubenswrapper[4904]: I1205 20:36:02.755178 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qm5tb" event={"ID":"41242cb0-80f9-452d-8655-e838eed6f2cf","Type":"ContainerDied","Data":"9aae8652aa09fe2b2c3ed9c8e53a6375674fbfe91df263762a7031a0b99ae7ca"} Dec 05 20:36:03 crc kubenswrapper[4904]: I1205 20:36:03.196275 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 20:36:03 crc kubenswrapper[4904]: I1205 20:36:03.196319 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 20:36:03 crc kubenswrapper[4904]: I1205 20:36:03.330113 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 05 20:36:04 crc kubenswrapper[4904]: I1205 20:36:04.781212 4904 generic.go:334] "Generic (PLEG): container finished" podID="41242cb0-80f9-452d-8655-e838eed6f2cf" containerID="7a85231ec8d2f71645e84ee363fd1b381a5f40b3991ef429250f0833a42a2904" exitCode=0 Dec 05 20:36:04 crc kubenswrapper[4904]: I1205 20:36:04.782226 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qm5tb" event={"ID":"41242cb0-80f9-452d-8655-e838eed6f2cf","Type":"ContainerDied","Data":"7a85231ec8d2f71645e84ee363fd1b381a5f40b3991ef429250f0833a42a2904"} Dec 05 20:36:05 crc kubenswrapper[4904]: I1205 20:36:05.796905 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qm5tb" event={"ID":"41242cb0-80f9-452d-8655-e838eed6f2cf","Type":"ContainerStarted","Data":"532fbbaed5103ef897b3903ee8f386be933af735f4764a7dec4632f8e09de182"} Dec 05 20:36:05 crc kubenswrapper[4904]: I1205 20:36:05.821571 4904 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-marketplace/redhat-operators-qm5tb" podStartSLOduration=3.394330398 podStartE2EDuration="5.821551259s" podCreationTimestamp="2025-12-05 20:36:00 +0000 UTC" firstStartedPulling="2025-12-05 20:36:02.75696232 +0000 UTC m=+1461.568178439" lastFinishedPulling="2025-12-05 20:36:05.184183191 +0000 UTC m=+1463.995399300" observedRunningTime="2025-12-05 20:36:05.814207197 +0000 UTC m=+1464.625423326" watchObservedRunningTime="2025-12-05 20:36:05.821551259 +0000 UTC m=+1464.632767358" Dec 05 20:36:08 crc kubenswrapper[4904]: I1205 20:36:08.195876 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 20:36:08 crc kubenswrapper[4904]: I1205 20:36:08.196160 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 20:36:08 crc kubenswrapper[4904]: I1205 20:36:08.207479 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 20:36:08 crc kubenswrapper[4904]: I1205 20:36:08.207529 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 20:36:08 crc kubenswrapper[4904]: I1205 20:36:08.331283 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 05 20:36:08 crc kubenswrapper[4904]: I1205 20:36:08.367676 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 05 20:36:08 crc kubenswrapper[4904]: I1205 20:36:08.854477 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 05 20:36:09 crc kubenswrapper[4904]: I1205 20:36:09.207270 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="eeb7ce1d-7b77-4a2f-9e87-99569f95995d" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.223:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 20:36:09 crc kubenswrapper[4904]: I1205 20:36:09.207309 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="eeb7ce1d-7b77-4a2f-9e87-99569f95995d" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.223:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 20:36:09 crc kubenswrapper[4904]: I1205 20:36:09.219246 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="be6a8cc9-1fdc-4dea-a3b3-e0a13cd882b0" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.224:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 20:36:09 crc kubenswrapper[4904]: I1205 20:36:09.219275 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="be6a8cc9-1fdc-4dea-a3b3-e0a13cd882b0" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.224:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 20:36:11 crc kubenswrapper[4904]: I1205 20:36:11.012624 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qm5tb" Dec 05 20:36:11 crc kubenswrapper[4904]: I1205 20:36:11.012698 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qm5tb" 
Dec 05 20:36:12 crc kubenswrapper[4904]: I1205 20:36:12.078244 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qm5tb" podUID="41242cb0-80f9-452d-8655-e838eed6f2cf" containerName="registry-server" probeResult="failure" output=< Dec 05 20:36:12 crc kubenswrapper[4904]: timeout: failed to connect service ":50051" within 1s Dec 05 20:36:12 crc kubenswrapper[4904]: > Dec 05 20:36:18 crc kubenswrapper[4904]: I1205 20:36:18.204712 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 05 20:36:18 crc kubenswrapper[4904]: I1205 20:36:18.205479 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 05 20:36:18 crc kubenswrapper[4904]: I1205 20:36:18.210396 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 05 20:36:18 crc kubenswrapper[4904]: I1205 20:36:18.218531 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 05 20:36:18 crc kubenswrapper[4904]: I1205 20:36:18.219798 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 05 20:36:18 crc kubenswrapper[4904]: I1205 20:36:18.220035 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 05 20:36:18 crc kubenswrapper[4904]: I1205 20:36:18.229516 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 05 20:36:18 crc kubenswrapper[4904]: I1205 20:36:18.759536 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 05 20:36:18 crc kubenswrapper[4904]: I1205 20:36:18.933391 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 05 20:36:18 crc kubenswrapper[4904]: I1205 20:36:18.938238 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 05 20:36:18 crc kubenswrapper[4904]: I1205 20:36:18.943563 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 05 20:36:21 crc kubenswrapper[4904]: I1205 20:36:21.062720 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qm5tb" Dec 05 20:36:21 crc kubenswrapper[4904]: I1205 20:36:21.132570 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qm5tb" Dec 05 20:36:21 crc kubenswrapper[4904]: I1205 20:36:21.304556 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qm5tb"] Dec 05 20:36:23 crc kubenswrapper[4904]: I1205 20:36:23.023555 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qm5tb" podUID="41242cb0-80f9-452d-8655-e838eed6f2cf" containerName="registry-server" containerID="cri-o://532fbbaed5103ef897b3903ee8f386be933af735f4764a7dec4632f8e09de182" gracePeriod=2 Dec 05 20:36:23 crc kubenswrapper[4904]: I1205 20:36:23.548545 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qm5tb" Dec 05 20:36:23 crc kubenswrapper[4904]: I1205 20:36:23.710249 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41242cb0-80f9-452d-8655-e838eed6f2cf-catalog-content\") pod \"41242cb0-80f9-452d-8655-e838eed6f2cf\" (UID: \"41242cb0-80f9-452d-8655-e838eed6f2cf\") " Dec 05 20:36:23 crc kubenswrapper[4904]: I1205 20:36:23.710792 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mxqb\" (UniqueName: \"kubernetes.io/projected/41242cb0-80f9-452d-8655-e838eed6f2cf-kube-api-access-5mxqb\") pod \"41242cb0-80f9-452d-8655-e838eed6f2cf\" (UID: \"41242cb0-80f9-452d-8655-e838eed6f2cf\") " Dec 05 20:36:23 crc kubenswrapper[4904]: I1205 20:36:23.710834 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41242cb0-80f9-452d-8655-e838eed6f2cf-utilities\") pod \"41242cb0-80f9-452d-8655-e838eed6f2cf\" (UID: \"41242cb0-80f9-452d-8655-e838eed6f2cf\") " Dec 05 20:36:23 crc kubenswrapper[4904]: I1205 20:36:23.711648 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41242cb0-80f9-452d-8655-e838eed6f2cf-utilities" (OuterVolumeSpecName: "utilities") pod "41242cb0-80f9-452d-8655-e838eed6f2cf" (UID: "41242cb0-80f9-452d-8655-e838eed6f2cf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:36:23 crc kubenswrapper[4904]: I1205 20:36:23.716663 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41242cb0-80f9-452d-8655-e838eed6f2cf-kube-api-access-5mxqb" (OuterVolumeSpecName: "kube-api-access-5mxqb") pod "41242cb0-80f9-452d-8655-e838eed6f2cf" (UID: "41242cb0-80f9-452d-8655-e838eed6f2cf"). InnerVolumeSpecName "kube-api-access-5mxqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:36:23 crc kubenswrapper[4904]: I1205 20:36:23.813747 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mxqb\" (UniqueName: \"kubernetes.io/projected/41242cb0-80f9-452d-8655-e838eed6f2cf-kube-api-access-5mxqb\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:23 crc kubenswrapper[4904]: I1205 20:36:23.813780 4904 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41242cb0-80f9-452d-8655-e838eed6f2cf-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:23 crc kubenswrapper[4904]: I1205 20:36:23.821177 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41242cb0-80f9-452d-8655-e838eed6f2cf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "41242cb0-80f9-452d-8655-e838eed6f2cf" (UID: "41242cb0-80f9-452d-8655-e838eed6f2cf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:36:23 crc kubenswrapper[4904]: I1205 20:36:23.915804 4904 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41242cb0-80f9-452d-8655-e838eed6f2cf-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:24 crc kubenswrapper[4904]: I1205 20:36:24.035973 4904 generic.go:334] "Generic (PLEG): container finished" podID="41242cb0-80f9-452d-8655-e838eed6f2cf" containerID="532fbbaed5103ef897b3903ee8f386be933af735f4764a7dec4632f8e09de182" exitCode=0 Dec 05 20:36:24 crc kubenswrapper[4904]: I1205 20:36:24.036020 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qm5tb" Dec 05 20:36:24 crc kubenswrapper[4904]: I1205 20:36:24.036037 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qm5tb" event={"ID":"41242cb0-80f9-452d-8655-e838eed6f2cf","Type":"ContainerDied","Data":"532fbbaed5103ef897b3903ee8f386be933af735f4764a7dec4632f8e09de182"} Dec 05 20:36:24 crc kubenswrapper[4904]: I1205 20:36:24.036165 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qm5tb" event={"ID":"41242cb0-80f9-452d-8655-e838eed6f2cf","Type":"ContainerDied","Data":"6a5f6217b4e8f4781adee1f85a3e5b426357ba9bcf7723533ca71697c9618d16"} Dec 05 20:36:24 crc kubenswrapper[4904]: I1205 20:36:24.036194 4904 scope.go:117] "RemoveContainer" containerID="532fbbaed5103ef897b3903ee8f386be933af735f4764a7dec4632f8e09de182" Dec 05 20:36:24 crc kubenswrapper[4904]: I1205 20:36:24.075228 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qm5tb"] Dec 05 20:36:24 crc kubenswrapper[4904]: I1205 20:36:24.083447 4904 scope.go:117] "RemoveContainer" containerID="7a85231ec8d2f71645e84ee363fd1b381a5f40b3991ef429250f0833a42a2904" Dec 05 20:36:24 crc kubenswrapper[4904]: I1205 20:36:24.088352 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qm5tb"] Dec 05 20:36:24 crc kubenswrapper[4904]: I1205 20:36:24.103990 4904 scope.go:117] "RemoveContainer" containerID="9aae8652aa09fe2b2c3ed9c8e53a6375674fbfe91df263762a7031a0b99ae7ca" Dec 05 20:36:24 crc kubenswrapper[4904]: I1205 20:36:24.162656 4904 scope.go:117] "RemoveContainer" containerID="532fbbaed5103ef897b3903ee8f386be933af735f4764a7dec4632f8e09de182" Dec 05 20:36:24 crc kubenswrapper[4904]: E1205 20:36:24.163273 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"532fbbaed5103ef897b3903ee8f386be933af735f4764a7dec4632f8e09de182\": container with ID starting with 532fbbaed5103ef897b3903ee8f386be933af735f4764a7dec4632f8e09de182 not found: ID does not exist" containerID="532fbbaed5103ef897b3903ee8f386be933af735f4764a7dec4632f8e09de182" Dec 05 20:36:24 crc kubenswrapper[4904]: I1205 20:36:24.163307 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"532fbbaed5103ef897b3903ee8f386be933af735f4764a7dec4632f8e09de182"} err="failed to get container status \"532fbbaed5103ef897b3903ee8f386be933af735f4764a7dec4632f8e09de182\": rpc error: code = NotFound desc = could not find container \"532fbbaed5103ef897b3903ee8f386be933af735f4764a7dec4632f8e09de182\": container with ID starting with 532fbbaed5103ef897b3903ee8f386be933af735f4764a7dec4632f8e09de182 not found: ID does not exist" Dec 05 20:36:24 crc 
kubenswrapper[4904]: I1205 20:36:24.163329 4904 scope.go:117] "RemoveContainer" containerID="7a85231ec8d2f71645e84ee363fd1b381a5f40b3991ef429250f0833a42a2904" Dec 05 20:36:24 crc kubenswrapper[4904]: E1205 20:36:24.163761 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a85231ec8d2f71645e84ee363fd1b381a5f40b3991ef429250f0833a42a2904\": container with ID starting with 7a85231ec8d2f71645e84ee363fd1b381a5f40b3991ef429250f0833a42a2904 not found: ID does not exist" containerID="7a85231ec8d2f71645e84ee363fd1b381a5f40b3991ef429250f0833a42a2904" Dec 05 20:36:24 crc kubenswrapper[4904]: I1205 20:36:24.163810 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a85231ec8d2f71645e84ee363fd1b381a5f40b3991ef429250f0833a42a2904"} err="failed to get container status \"7a85231ec8d2f71645e84ee363fd1b381a5f40b3991ef429250f0833a42a2904\": rpc error: code = NotFound desc = could not find container \"7a85231ec8d2f71645e84ee363fd1b381a5f40b3991ef429250f0833a42a2904\": container with ID starting with 7a85231ec8d2f71645e84ee363fd1b381a5f40b3991ef429250f0833a42a2904 not found: ID does not exist" Dec 05 20:36:24 crc kubenswrapper[4904]: I1205 20:36:24.163846 4904 scope.go:117] "RemoveContainer" containerID="9aae8652aa09fe2b2c3ed9c8e53a6375674fbfe91df263762a7031a0b99ae7ca" Dec 05 20:36:24 crc kubenswrapper[4904]: E1205 20:36:24.164576 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9aae8652aa09fe2b2c3ed9c8e53a6375674fbfe91df263762a7031a0b99ae7ca\": container with ID starting with 9aae8652aa09fe2b2c3ed9c8e53a6375674fbfe91df263762a7031a0b99ae7ca not found: ID does not exist" containerID="9aae8652aa09fe2b2c3ed9c8e53a6375674fbfe91df263762a7031a0b99ae7ca" Dec 05 20:36:24 crc kubenswrapper[4904]: I1205 20:36:24.164606 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9aae8652aa09fe2b2c3ed9c8e53a6375674fbfe91df263762a7031a0b99ae7ca"} err="failed to get container status \"9aae8652aa09fe2b2c3ed9c8e53a6375674fbfe91df263762a7031a0b99ae7ca\": rpc error: code = NotFound desc = could not find container \"9aae8652aa09fe2b2c3ed9c8e53a6375674fbfe91df263762a7031a0b99ae7ca\": container with ID starting with 9aae8652aa09fe2b2c3ed9c8e53a6375674fbfe91df263762a7031a0b99ae7ca not found: ID does not exist" Dec 05 20:36:25 crc kubenswrapper[4904]: I1205 20:36:25.693724 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41242cb0-80f9-452d-8655-e838eed6f2cf" path="/var/lib/kubelet/pods/41242cb0-80f9-452d-8655-e838eed6f2cf/volumes" Dec 05 20:36:28 crc kubenswrapper[4904]: I1205 20:36:28.667355 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 20:36:29 crc kubenswrapper[4904]: I1205 20:36:29.948036 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 20:36:29 crc kubenswrapper[4904]: I1205 20:36:29.955712 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:36:29 crc kubenswrapper[4904]: I1205 20:36:29.955766 4904 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:36:29 crc kubenswrapper[4904]: I1205 20:36:29.955808 4904 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" Dec 05 20:36:29 crc kubenswrapper[4904]: I1205 20:36:29.956538 4904 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3a95f83a14a35574f3a4cccf8ae06571e374650e21ab79c52d42113e0e6de1c7"} pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 20:36:29 crc kubenswrapper[4904]: I1205 20:36:29.956596 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" containerID="cri-o://3a95f83a14a35574f3a4cccf8ae06571e374650e21ab79c52d42113e0e6de1c7" gracePeriod=600 Dec 05 20:36:30 crc kubenswrapper[4904]: I1205 20:36:30.113774 4904 generic.go:334] "Generic (PLEG): container finished" podID="1cc24b64-e25f-4b55-9123-295388685e7a" containerID="3a95f83a14a35574f3a4cccf8ae06571e374650e21ab79c52d42113e0e6de1c7" exitCode=0 Dec 05 20:36:30 crc kubenswrapper[4904]: I1205 20:36:30.113871 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" event={"ID":"1cc24b64-e25f-4b55-9123-295388685e7a","Type":"ContainerDied","Data":"3a95f83a14a35574f3a4cccf8ae06571e374650e21ab79c52d42113e0e6de1c7"} Dec 05 20:36:30 crc kubenswrapper[4904]: I1205 20:36:30.114137 4904 scope.go:117] "RemoveContainer" containerID="528bac05d1388e0a638d4e21540a1e20bd15f6ec7b5238ce57fc2bc9737af1ba" Dec 05 20:36:31 crc kubenswrapper[4904]: I1205 20:36:31.128793 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" event={"ID":"1cc24b64-e25f-4b55-9123-295388685e7a","Type":"ContainerStarted","Data":"dfcab6d899b1212bd971ed9361e63cd19363ddee7d6637335cdbac95f7926ba1"} Dec 05 20:36:32 crc kubenswrapper[4904]: I1205 20:36:32.056261 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="921f0ddc-4d15-4bfa-9560-7a01eaa3461f" containerName="rabbitmq" containerID="cri-o://3425d73c996f99cdeb6580db311aae79e48c21975ce2a8b0d132688204f144bb" gracePeriod=604797 Dec 05 20:36:33 crc kubenswrapper[4904]: I1205 20:36:33.147493 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="0ad24986-23a3-4010-8dcf-6778339691c8" containerName="rabbitmq" containerID="cri-o://aaad2dce99bbd67b7a9457b86cd6991da7a24cd85acd1e2bf6f24866edcfded8" gracePeriod=604797 Dec 05 20:36:33 crc kubenswrapper[4904]: I1205 20:36:33.686236 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 20:36:33 crc kubenswrapper[4904]: I1205 20:36:33.811180 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-rabbitmq-tls\") pod \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\" (UID: \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\") " Dec 05 20:36:33 crc kubenswrapper[4904]: I1205 20:36:33.811252 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-config-data\") pod \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\" (UID: \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\") " Dec 05 20:36:33 crc kubenswrapper[4904]: I1205 20:36:33.811325 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-erlang-cookie-secret\") pod \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\" (UID: \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\") " Dec 05 20:36:33 crc kubenswrapper[4904]: I1205 20:36:33.811428 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\" (UID: \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\") " Dec 05 20:36:33 crc kubenswrapper[4904]: I1205 20:36:33.811488 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbsl2\" (UniqueName: \"kubernetes.io/projected/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-kube-api-access-cbsl2\") pod \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\" (UID: \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\") " Dec 05 20:36:33 crc kubenswrapper[4904]: I1205 20:36:33.811517 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-plugins-conf\") pod \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\" (UID: \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\") " Dec 05 20:36:33 crc kubenswrapper[4904]: I1205 20:36:33.811619 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-rabbitmq-confd\") pod \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\" (UID: \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\") " Dec 05 20:36:33 crc kubenswrapper[4904]: I1205 20:36:33.811643 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-pod-info\") pod \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\" (UID: \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\") " Dec 05 20:36:33 crc kubenswrapper[4904]: I1205 20:36:33.811720 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-rabbitmq-erlang-cookie\") pod \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\" (UID: \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\") " Dec 05 20:36:33 crc kubenswrapper[4904]: I1205 20:36:33.811736 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-rabbitmq-plugins\") pod \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\" (UID: 
\"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\") " Dec 05 20:36:33 crc kubenswrapper[4904]: I1205 20:36:33.811780 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-server-conf\") pod \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\" (UID: \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\") " Dec 05 20:36:33 crc kubenswrapper[4904]: I1205 20:36:33.813909 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "921f0ddc-4d15-4bfa-9560-7a01eaa3461f" (UID: "921f0ddc-4d15-4bfa-9560-7a01eaa3461f"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:36:33 crc kubenswrapper[4904]: I1205 20:36:33.815284 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "921f0ddc-4d15-4bfa-9560-7a01eaa3461f" (UID: "921f0ddc-4d15-4bfa-9560-7a01eaa3461f"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:36:33 crc kubenswrapper[4904]: I1205 20:36:33.817749 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "921f0ddc-4d15-4bfa-9560-7a01eaa3461f" (UID: "921f0ddc-4d15-4bfa-9560-7a01eaa3461f"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:36:33 crc kubenswrapper[4904]: I1205 20:36:33.824973 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-kube-api-access-cbsl2" (OuterVolumeSpecName: "kube-api-access-cbsl2") pod "921f0ddc-4d15-4bfa-9560-7a01eaa3461f" (UID: "921f0ddc-4d15-4bfa-9560-7a01eaa3461f"). InnerVolumeSpecName "kube-api-access-cbsl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:36:33 crc kubenswrapper[4904]: I1205 20:36:33.825142 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "921f0ddc-4d15-4bfa-9560-7a01eaa3461f" (UID: "921f0ddc-4d15-4bfa-9560-7a01eaa3461f"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:36:33 crc kubenswrapper[4904]: I1205 20:36:33.829177 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "921f0ddc-4d15-4bfa-9560-7a01eaa3461f" (UID: "921f0ddc-4d15-4bfa-9560-7a01eaa3461f"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:36:33 crc kubenswrapper[4904]: I1205 20:36:33.831926 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-pod-info" (OuterVolumeSpecName: "pod-info") pod "921f0ddc-4d15-4bfa-9560-7a01eaa3461f" (UID: "921f0ddc-4d15-4bfa-9560-7a01eaa3461f"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 05 20:36:33 crc kubenswrapper[4904]: I1205 20:36:33.842235 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "921f0ddc-4d15-4bfa-9560-7a01eaa3461f" (UID: "921f0ddc-4d15-4bfa-9560-7a01eaa3461f"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 20:36:33 crc kubenswrapper[4904]: I1205 20:36:33.854073 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-config-data" (OuterVolumeSpecName: "config-data") pod "921f0ddc-4d15-4bfa-9560-7a01eaa3461f" (UID: "921f0ddc-4d15-4bfa-9560-7a01eaa3461f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:36:33 crc kubenswrapper[4904]: I1205 20:36:33.892309 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-server-conf" (OuterVolumeSpecName: "server-conf") pod "921f0ddc-4d15-4bfa-9560-7a01eaa3461f" (UID: "921f0ddc-4d15-4bfa-9560-7a01eaa3461f"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:36:33 crc kubenswrapper[4904]: I1205 20:36:33.916779 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:33 crc kubenswrapper[4904]: I1205 20:36:33.916802 4904 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:33 crc kubenswrapper[4904]: I1205 20:36:33.916823 4904 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Dec 05 20:36:33 crc kubenswrapper[4904]: I1205 20:36:33.916832 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbsl2\" (UniqueName: \"kubernetes.io/projected/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-kube-api-access-cbsl2\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:33 crc kubenswrapper[4904]: I1205 20:36:33.916840 4904 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:33 crc kubenswrapper[4904]: I1205 20:36:33.916847 4904 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-pod-info\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:33 crc kubenswrapper[4904]: I1205 20:36:33.916854 4904 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:33 crc kubenswrapper[4904]: I1205 20:36:33.916862 4904 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:33 crc kubenswrapper[4904]: I1205 
20:36:33.916870 4904 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-server-conf\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:33 crc kubenswrapper[4904]: I1205 20:36:33.916878 4904 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:33 crc kubenswrapper[4904]: I1205 20:36:33.941404 4904 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.018366 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "921f0ddc-4d15-4bfa-9560-7a01eaa3461f" (UID: "921f0ddc-4d15-4bfa-9560-7a01eaa3461f"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.018687 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-rabbitmq-confd\") pod \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\" (UID: \"921f0ddc-4d15-4bfa-9560-7a01eaa3461f\") " Dec 05 20:36:34 crc kubenswrapper[4904]: W1205 20:36:34.018815 4904 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/921f0ddc-4d15-4bfa-9560-7a01eaa3461f/volumes/kubernetes.io~projected/rabbitmq-confd Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.018844 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "921f0ddc-4d15-4bfa-9560-7a01eaa3461f" (UID: "921f0ddc-4d15-4bfa-9560-7a01eaa3461f"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.019144 4904 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.019164 4904 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/921f0ddc-4d15-4bfa-9560-7a01eaa3461f-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.167293 4904 generic.go:334] "Generic (PLEG): container finished" podID="921f0ddc-4d15-4bfa-9560-7a01eaa3461f" containerID="3425d73c996f99cdeb6580db311aae79e48c21975ce2a8b0d132688204f144bb" exitCode=0 Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.167481 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"921f0ddc-4d15-4bfa-9560-7a01eaa3461f","Type":"ContainerDied","Data":"3425d73c996f99cdeb6580db311aae79e48c21975ce2a8b0d132688204f144bb"} Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.167539 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.167592 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"921f0ddc-4d15-4bfa-9560-7a01eaa3461f","Type":"ContainerDied","Data":"38861b6e79d90e746aa66c2f536a6eb4525ce0b4c93b922e4ad66d0d0ae9261e"} Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.167606 4904 scope.go:117] "RemoveContainer" containerID="3425d73c996f99cdeb6580db311aae79e48c21975ce2a8b0d132688204f144bb" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.212686 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.222171 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.225775 4904 scope.go:117] "RemoveContainer" containerID="2b9723d67e3da9257f671f85fca08c79229416b8645d9074df450b11cc9cb52a" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.265826 4904 scope.go:117] "RemoveContainer" containerID="3425d73c996f99cdeb6580db311aae79e48c21975ce2a8b0d132688204f144bb" Dec 05 20:36:34 crc kubenswrapper[4904]: E1205 20:36:34.266365 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3425d73c996f99cdeb6580db311aae79e48c21975ce2a8b0d132688204f144bb\": container with ID starting with 3425d73c996f99cdeb6580db311aae79e48c21975ce2a8b0d132688204f144bb not found: ID does not exist" containerID="3425d73c996f99cdeb6580db311aae79e48c21975ce2a8b0d132688204f144bb" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.266404 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3425d73c996f99cdeb6580db311aae79e48c21975ce2a8b0d132688204f144bb"} err="failed to get container status \"3425d73c996f99cdeb6580db311aae79e48c21975ce2a8b0d132688204f144bb\": rpc error: code = NotFound desc = could not find container \"3425d73c996f99cdeb6580db311aae79e48c21975ce2a8b0d132688204f144bb\": container with ID starting with 3425d73c996f99cdeb6580db311aae79e48c21975ce2a8b0d132688204f144bb not found: ID does not exist" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.266430 4904 scope.go:117] "RemoveContainer" containerID="2b9723d67e3da9257f671f85fca08c79229416b8645d9074df450b11cc9cb52a" Dec 05 20:36:34 crc kubenswrapper[4904]: E1205 20:36:34.266870 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b9723d67e3da9257f671f85fca08c79229416b8645d9074df450b11cc9cb52a\": container with ID starting with 2b9723d67e3da9257f671f85fca08c79229416b8645d9074df450b11cc9cb52a not found: ID does not exist" containerID="2b9723d67e3da9257f671f85fca08c79229416b8645d9074df450b11cc9cb52a" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.266904 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b9723d67e3da9257f671f85fca08c79229416b8645d9074df450b11cc9cb52a"} err="failed to get container status \"2b9723d67e3da9257f671f85fca08c79229416b8645d9074df450b11cc9cb52a\": rpc error: code = NotFound desc = could not find container \"2b9723d67e3da9257f671f85fca08c79229416b8645d9074df450b11cc9cb52a\": container with ID starting with 2b9723d67e3da9257f671f85fca08c79229416b8645d9074df450b11cc9cb52a not found: ID does not exist" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 
20:36:34.266953 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 20:36:34 crc kubenswrapper[4904]: E1205 20:36:34.267435 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41242cb0-80f9-452d-8655-e838eed6f2cf" containerName="extract-utilities" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.267459 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="41242cb0-80f9-452d-8655-e838eed6f2cf" containerName="extract-utilities" Dec 05 20:36:34 crc kubenswrapper[4904]: E1205 20:36:34.267485 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="921f0ddc-4d15-4bfa-9560-7a01eaa3461f" containerName="rabbitmq" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.267495 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="921f0ddc-4d15-4bfa-9560-7a01eaa3461f" containerName="rabbitmq" Dec 05 20:36:34 crc kubenswrapper[4904]: E1205 20:36:34.267519 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41242cb0-80f9-452d-8655-e838eed6f2cf" containerName="extract-content" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.267528 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="41242cb0-80f9-452d-8655-e838eed6f2cf" containerName="extract-content" Dec 05 20:36:34 crc kubenswrapper[4904]: E1205 20:36:34.267556 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="921f0ddc-4d15-4bfa-9560-7a01eaa3461f" containerName="setup-container" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.267563 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="921f0ddc-4d15-4bfa-9560-7a01eaa3461f" containerName="setup-container" Dec 05 20:36:34 crc kubenswrapper[4904]: E1205 20:36:34.267577 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41242cb0-80f9-452d-8655-e838eed6f2cf" containerName="registry-server" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.267585 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="41242cb0-80f9-452d-8655-e838eed6f2cf" containerName="registry-server" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.267805 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="921f0ddc-4d15-4bfa-9560-7a01eaa3461f" containerName="rabbitmq" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.267826 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="41242cb0-80f9-452d-8655-e838eed6f2cf" containerName="registry-server" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.269127 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.273621 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.273918 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.274281 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.274455 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-9glvh" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.274732 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.274915 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.275080 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.280657 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.425983 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1e45678d-877d-4c34-a8f5-913d31a8b79d-config-data\") pod \"rabbitmq-server-0\" (UID: \"1e45678d-877d-4c34-a8f5-913d31a8b79d\") " pod="openstack/rabbitmq-server-0" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.426022 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1e45678d-877d-4c34-a8f5-913d31a8b79d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1e45678d-877d-4c34-a8f5-913d31a8b79d\") " pod="openstack/rabbitmq-server-0" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.426048 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1e45678d-877d-4c34-a8f5-913d31a8b79d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1e45678d-877d-4c34-a8f5-913d31a8b79d\") " pod="openstack/rabbitmq-server-0" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.426093 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"1e45678d-877d-4c34-a8f5-913d31a8b79d\") " pod="openstack/rabbitmq-server-0" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.426110 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1e45678d-877d-4c34-a8f5-913d31a8b79d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1e45678d-877d-4c34-a8f5-913d31a8b79d\") " pod="openstack/rabbitmq-server-0" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.426131 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/1e45678d-877d-4c34-a8f5-913d31a8b79d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1e45678d-877d-4c34-a8f5-913d31a8b79d\") " pod="openstack/rabbitmq-server-0" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.426149 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1e45678d-877d-4c34-a8f5-913d31a8b79d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1e45678d-877d-4c34-a8f5-913d31a8b79d\") " pod="openstack/rabbitmq-server-0" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.426170 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1e45678d-877d-4c34-a8f5-913d31a8b79d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1e45678d-877d-4c34-a8f5-913d31a8b79d\") " pod="openstack/rabbitmq-server-0" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.426211 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lj4z\" (UniqueName: \"kubernetes.io/projected/1e45678d-877d-4c34-a8f5-913d31a8b79d-kube-api-access-7lj4z\") pod \"rabbitmq-server-0\" (UID: \"1e45678d-877d-4c34-a8f5-913d31a8b79d\") " pod="openstack/rabbitmq-server-0" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.426255 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1e45678d-877d-4c34-a8f5-913d31a8b79d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1e45678d-877d-4c34-a8f5-913d31a8b79d\") " pod="openstack/rabbitmq-server-0" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.426275 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1e45678d-877d-4c34-a8f5-913d31a8b79d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1e45678d-877d-4c34-a8f5-913d31a8b79d\") " pod="openstack/rabbitmq-server-0" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.527743 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1e45678d-877d-4c34-a8f5-913d31a8b79d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1e45678d-877d-4c34-a8f5-913d31a8b79d\") " pod="openstack/rabbitmq-server-0" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.527790 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1e45678d-877d-4c34-a8f5-913d31a8b79d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1e45678d-877d-4c34-a8f5-913d31a8b79d\") " pod="openstack/rabbitmq-server-0" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.527856 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1e45678d-877d-4c34-a8f5-913d31a8b79d-config-data\") pod \"rabbitmq-server-0\" (UID: \"1e45678d-877d-4c34-a8f5-913d31a8b79d\") " pod="openstack/rabbitmq-server-0" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.527882 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1e45678d-877d-4c34-a8f5-913d31a8b79d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1e45678d-877d-4c34-a8f5-913d31a8b79d\") " 
pod="openstack/rabbitmq-server-0" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.527907 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1e45678d-877d-4c34-a8f5-913d31a8b79d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1e45678d-877d-4c34-a8f5-913d31a8b79d\") " pod="openstack/rabbitmq-server-0" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.527932 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"1e45678d-877d-4c34-a8f5-913d31a8b79d\") " pod="openstack/rabbitmq-server-0" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.527948 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1e45678d-877d-4c34-a8f5-913d31a8b79d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1e45678d-877d-4c34-a8f5-913d31a8b79d\") " pod="openstack/rabbitmq-server-0" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.527971 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1e45678d-877d-4c34-a8f5-913d31a8b79d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1e45678d-877d-4c34-a8f5-913d31a8b79d\") " pod="openstack/rabbitmq-server-0" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.527990 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1e45678d-877d-4c34-a8f5-913d31a8b79d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1e45678d-877d-4c34-a8f5-913d31a8b79d\") " pod="openstack/rabbitmq-server-0" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.528013 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1e45678d-877d-4c34-a8f5-913d31a8b79d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1e45678d-877d-4c34-a8f5-913d31a8b79d\") " pod="openstack/rabbitmq-server-0" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.528069 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lj4z\" (UniqueName: \"kubernetes.io/projected/1e45678d-877d-4c34-a8f5-913d31a8b79d-kube-api-access-7lj4z\") pod \"rabbitmq-server-0\" (UID: \"1e45678d-877d-4c34-a8f5-913d31a8b79d\") " pod="openstack/rabbitmq-server-0" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.529161 4904 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"1e45678d-877d-4c34-a8f5-913d31a8b79d\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.530166 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1e45678d-877d-4c34-a8f5-913d31a8b79d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1e45678d-877d-4c34-a8f5-913d31a8b79d\") " pod="openstack/rabbitmq-server-0" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.530736 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/1e45678d-877d-4c34-a8f5-913d31a8b79d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1e45678d-877d-4c34-a8f5-913d31a8b79d\") " pod="openstack/rabbitmq-server-0" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.530948 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1e45678d-877d-4c34-a8f5-913d31a8b79d-config-data\") pod \"rabbitmq-server-0\" (UID: \"1e45678d-877d-4c34-a8f5-913d31a8b79d\") " pod="openstack/rabbitmq-server-0" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.531267 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1e45678d-877d-4c34-a8f5-913d31a8b79d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1e45678d-877d-4c34-a8f5-913d31a8b79d\") " pod="openstack/rabbitmq-server-0" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.531490 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1e45678d-877d-4c34-a8f5-913d31a8b79d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1e45678d-877d-4c34-a8f5-913d31a8b79d\") " pod="openstack/rabbitmq-server-0" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.535808 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1e45678d-877d-4c34-a8f5-913d31a8b79d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1e45678d-877d-4c34-a8f5-913d31a8b79d\") " pod="openstack/rabbitmq-server-0" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.535876 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1e45678d-877d-4c34-a8f5-913d31a8b79d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1e45678d-877d-4c34-a8f5-913d31a8b79d\") " pod="openstack/rabbitmq-server-0" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.536405 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1e45678d-877d-4c34-a8f5-913d31a8b79d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1e45678d-877d-4c34-a8f5-913d31a8b79d\") " pod="openstack/rabbitmq-server-0" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.549134 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lj4z\" (UniqueName: \"kubernetes.io/projected/1e45678d-877d-4c34-a8f5-913d31a8b79d-kube-api-access-7lj4z\") pod \"rabbitmq-server-0\" (UID: \"1e45678d-877d-4c34-a8f5-913d31a8b79d\") " pod="openstack/rabbitmq-server-0" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.551390 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1e45678d-877d-4c34-a8f5-913d31a8b79d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1e45678d-877d-4c34-a8f5-913d31a8b79d\") " pod="openstack/rabbitmq-server-0" Dec 05 20:36:34 crc kubenswrapper[4904]: I1205 20:36:34.596315 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"1e45678d-877d-4c34-a8f5-913d31a8b79d\") " pod="openstack/rabbitmq-server-0" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:34.735787 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:34.790414 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:34.945846 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0ad24986-23a3-4010-8dcf-6778339691c8-rabbitmq-confd\") pod \"0ad24986-23a3-4010-8dcf-6778339691c8\" (UID: \"0ad24986-23a3-4010-8dcf-6778339691c8\") " Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:34.946222 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0ad24986-23a3-4010-8dcf-6778339691c8-rabbitmq-plugins\") pod \"0ad24986-23a3-4010-8dcf-6778339691c8\" (UID: \"0ad24986-23a3-4010-8dcf-6778339691c8\") " Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:34.946265 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0ad24986-23a3-4010-8dcf-6778339691c8-pod-info\") pod \"0ad24986-23a3-4010-8dcf-6778339691c8\" (UID: \"0ad24986-23a3-4010-8dcf-6778339691c8\") " Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:34.946299 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0ad24986-23a3-4010-8dcf-6778339691c8-server-conf\") pod \"0ad24986-23a3-4010-8dcf-6778339691c8\" (UID: \"0ad24986-23a3-4010-8dcf-6778339691c8\") " Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:34.946352 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-269gx\" (UniqueName: \"kubernetes.io/projected/0ad24986-23a3-4010-8dcf-6778339691c8-kube-api-access-269gx\") pod \"0ad24986-23a3-4010-8dcf-6778339691c8\" (UID: \"0ad24986-23a3-4010-8dcf-6778339691c8\") " Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:34.946377 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0ad24986-23a3-4010-8dcf-6778339691c8-rabbitmq-tls\") pod \"0ad24986-23a3-4010-8dcf-6778339691c8\" (UID: \"0ad24986-23a3-4010-8dcf-6778339691c8\") " Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:34.946416 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0ad24986-23a3-4010-8dcf-6778339691c8-plugins-conf\") pod \"0ad24986-23a3-4010-8dcf-6778339691c8\" (UID: \"0ad24986-23a3-4010-8dcf-6778339691c8\") " Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:34.946478 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0ad24986-23a3-4010-8dcf-6778339691c8-erlang-cookie-secret\") pod \"0ad24986-23a3-4010-8dcf-6778339691c8\" (UID: \"0ad24986-23a3-4010-8dcf-6778339691c8\") " Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:34.946525 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"0ad24986-23a3-4010-8dcf-6778339691c8\" (UID: \"0ad24986-23a3-4010-8dcf-6778339691c8\") " Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:34.946549 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/0ad24986-23a3-4010-8dcf-6778339691c8-config-data\") pod \"0ad24986-23a3-4010-8dcf-6778339691c8\" (UID: \"0ad24986-23a3-4010-8dcf-6778339691c8\") " Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:34.946906 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0ad24986-23a3-4010-8dcf-6778339691c8-rabbitmq-erlang-cookie\") pod \"0ad24986-23a3-4010-8dcf-6778339691c8\" (UID: \"0ad24986-23a3-4010-8dcf-6778339691c8\") " Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:34.947192 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ad24986-23a3-4010-8dcf-6778339691c8-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "0ad24986-23a3-4010-8dcf-6778339691c8" (UID: "0ad24986-23a3-4010-8dcf-6778339691c8"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:34.947507 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ad24986-23a3-4010-8dcf-6778339691c8-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "0ad24986-23a3-4010-8dcf-6778339691c8" (UID: "0ad24986-23a3-4010-8dcf-6778339691c8"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:34.948008 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ad24986-23a3-4010-8dcf-6778339691c8-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "0ad24986-23a3-4010-8dcf-6778339691c8" (UID: "0ad24986-23a3-4010-8dcf-6778339691c8"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:34.950877 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ad24986-23a3-4010-8dcf-6778339691c8-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "0ad24986-23a3-4010-8dcf-6778339691c8" (UID: "0ad24986-23a3-4010-8dcf-6778339691c8"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:34.951698 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ad24986-23a3-4010-8dcf-6778339691c8-kube-api-access-269gx" (OuterVolumeSpecName: "kube-api-access-269gx") pod "0ad24986-23a3-4010-8dcf-6778339691c8" (UID: "0ad24986-23a3-4010-8dcf-6778339691c8"). InnerVolumeSpecName "kube-api-access-269gx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:34.953108 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "persistence") pod "0ad24986-23a3-4010-8dcf-6778339691c8" (UID: "0ad24986-23a3-4010-8dcf-6778339691c8"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:34.953383 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ad24986-23a3-4010-8dcf-6778339691c8-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "0ad24986-23a3-4010-8dcf-6778339691c8" (UID: "0ad24986-23a3-4010-8dcf-6778339691c8"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:34.953403 4904 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0ad24986-23a3-4010-8dcf-6778339691c8-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:34.953486 4904 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:34.953504 4904 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0ad24986-23a3-4010-8dcf-6778339691c8-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:34.953518 4904 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0ad24986-23a3-4010-8dcf-6778339691c8-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:34.953529 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-269gx\" (UniqueName: \"kubernetes.io/projected/0ad24986-23a3-4010-8dcf-6778339691c8-kube-api-access-269gx\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:34.953538 4904 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0ad24986-23a3-4010-8dcf-6778339691c8-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:34.953547 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/0ad24986-23a3-4010-8dcf-6778339691c8-pod-info" (OuterVolumeSpecName: "pod-info") pod "0ad24986-23a3-4010-8dcf-6778339691c8" (UID: "0ad24986-23a3-4010-8dcf-6778339691c8"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:34.978941 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ad24986-23a3-4010-8dcf-6778339691c8-config-data" (OuterVolumeSpecName: "config-data") pod "0ad24986-23a3-4010-8dcf-6778339691c8" (UID: "0ad24986-23a3-4010-8dcf-6778339691c8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:34.989521 4904 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.031026 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ad24986-23a3-4010-8dcf-6778339691c8-server-conf" (OuterVolumeSpecName: "server-conf") pod "0ad24986-23a3-4010-8dcf-6778339691c8" (UID: "0ad24986-23a3-4010-8dcf-6778339691c8"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.057627 4904 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0ad24986-23a3-4010-8dcf-6778339691c8-pod-info\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.057658 4904 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0ad24986-23a3-4010-8dcf-6778339691c8-server-conf\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.057666 4904 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0ad24986-23a3-4010-8dcf-6778339691c8-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.057674 4904 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.057684 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ad24986-23a3-4010-8dcf-6778339691c8-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.094776 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ad24986-23a3-4010-8dcf-6778339691c8-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "0ad24986-23a3-4010-8dcf-6778339691c8" (UID: "0ad24986-23a3-4010-8dcf-6778339691c8"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.159715 4904 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0ad24986-23a3-4010-8dcf-6778339691c8-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.178859 4904 generic.go:334] "Generic (PLEG): container finished" podID="0ad24986-23a3-4010-8dcf-6778339691c8" containerID="aaad2dce99bbd67b7a9457b86cd6991da7a24cd85acd1e2bf6f24866edcfded8" exitCode=0 Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.178901 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0ad24986-23a3-4010-8dcf-6778339691c8","Type":"ContainerDied","Data":"aaad2dce99bbd67b7a9457b86cd6991da7a24cd85acd1e2bf6f24866edcfded8"} Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.178942 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0ad24986-23a3-4010-8dcf-6778339691c8","Type":"ContainerDied","Data":"a658b6fddf2b66d38dd44fc30b45f3a866ca1f7b23a859f0ae3683115ca85024"} Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.178961 4904 scope.go:117] "RemoveContainer" containerID="aaad2dce99bbd67b7a9457b86cd6991da7a24cd85acd1e2bf6f24866edcfded8" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.179024 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.220973 4904 scope.go:117] "RemoveContainer" containerID="430dbd00dcef1faa04366e06dfa54515d84f826b7421c7f90e8fbe45ef9d7e65" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.231982 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.269496 4904 scope.go:117] "RemoveContainer" containerID="aaad2dce99bbd67b7a9457b86cd6991da7a24cd85acd1e2bf6f24866edcfded8" Dec 05 20:36:35 crc kubenswrapper[4904]: E1205 20:36:35.269951 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aaad2dce99bbd67b7a9457b86cd6991da7a24cd85acd1e2bf6f24866edcfded8\": container with ID starting with aaad2dce99bbd67b7a9457b86cd6991da7a24cd85acd1e2bf6f24866edcfded8 not found: ID does not exist" containerID="aaad2dce99bbd67b7a9457b86cd6991da7a24cd85acd1e2bf6f24866edcfded8" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.269976 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aaad2dce99bbd67b7a9457b86cd6991da7a24cd85acd1e2bf6f24866edcfded8"} err="failed to get container status \"aaad2dce99bbd67b7a9457b86cd6991da7a24cd85acd1e2bf6f24866edcfded8\": rpc error: code = NotFound desc = could not find container \"aaad2dce99bbd67b7a9457b86cd6991da7a24cd85acd1e2bf6f24866edcfded8\": container with ID starting with aaad2dce99bbd67b7a9457b86cd6991da7a24cd85acd1e2bf6f24866edcfded8 not found: ID does not exist" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.270005 4904 scope.go:117] "RemoveContainer" containerID="430dbd00dcef1faa04366e06dfa54515d84f826b7421c7f90e8fbe45ef9d7e65" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.270085 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 20:36:35 crc kubenswrapper[4904]: E1205 20:36:35.270352 4904 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"430dbd00dcef1faa04366e06dfa54515d84f826b7421c7f90e8fbe45ef9d7e65\": container with ID starting with 430dbd00dcef1faa04366e06dfa54515d84f826b7421c7f90e8fbe45ef9d7e65 not found: ID does not exist" containerID="430dbd00dcef1faa04366e06dfa54515d84f826b7421c7f90e8fbe45ef9d7e65" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.270379 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"430dbd00dcef1faa04366e06dfa54515d84f826b7421c7f90e8fbe45ef9d7e65"} err="failed to get container status \"430dbd00dcef1faa04366e06dfa54515d84f826b7421c7f90e8fbe45ef9d7e65\": rpc error: code = NotFound desc = could not find container \"430dbd00dcef1faa04366e06dfa54515d84f826b7421c7f90e8fbe45ef9d7e65\": container with ID starting with 430dbd00dcef1faa04366e06dfa54515d84f826b7421c7f90e8fbe45ef9d7e65 not found: ID does not exist" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.290138 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 20:36:35 crc kubenswrapper[4904]: E1205 20:36:35.291029 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ad24986-23a3-4010-8dcf-6778339691c8" containerName="rabbitmq" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.291049 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ad24986-23a3-4010-8dcf-6778339691c8" containerName="rabbitmq" Dec 05 20:36:35 crc kubenswrapper[4904]: E1205 20:36:35.291095 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ad24986-23a3-4010-8dcf-6778339691c8" containerName="setup-container" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.291105 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ad24986-23a3-4010-8dcf-6778339691c8" containerName="setup-container" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.291401 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ad24986-23a3-4010-8dcf-6778339691c8" containerName="rabbitmq" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.292829 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.295712 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.295910 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.295995 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.296044 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.296190 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.296315 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.296390 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-c25p2" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.321575 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.367878 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"69046049-50b5-4ced-8afa-5ef3405aad24\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.368187 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/69046049-50b5-4ced-8afa-5ef3405aad24-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"69046049-50b5-4ced-8afa-5ef3405aad24\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.368233 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/69046049-50b5-4ced-8afa-5ef3405aad24-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"69046049-50b5-4ced-8afa-5ef3405aad24\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.368251 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59b2l\" (UniqueName: \"kubernetes.io/projected/69046049-50b5-4ced-8afa-5ef3405aad24-kube-api-access-59b2l\") pod \"rabbitmq-cell1-server-0\" (UID: \"69046049-50b5-4ced-8afa-5ef3405aad24\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.368270 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/69046049-50b5-4ced-8afa-5ef3405aad24-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"69046049-50b5-4ced-8afa-5ef3405aad24\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.368317 4904 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/69046049-50b5-4ced-8afa-5ef3405aad24-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"69046049-50b5-4ced-8afa-5ef3405aad24\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.368345 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/69046049-50b5-4ced-8afa-5ef3405aad24-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"69046049-50b5-4ced-8afa-5ef3405aad24\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.368400 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/69046049-50b5-4ced-8afa-5ef3405aad24-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"69046049-50b5-4ced-8afa-5ef3405aad24\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.368439 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/69046049-50b5-4ced-8afa-5ef3405aad24-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"69046049-50b5-4ced-8afa-5ef3405aad24\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.368494 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/69046049-50b5-4ced-8afa-5ef3405aad24-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"69046049-50b5-4ced-8afa-5ef3405aad24\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.368529 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/69046049-50b5-4ced-8afa-5ef3405aad24-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"69046049-50b5-4ced-8afa-5ef3405aad24\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.470420 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/69046049-50b5-4ced-8afa-5ef3405aad24-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"69046049-50b5-4ced-8afa-5ef3405aad24\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.470510 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/69046049-50b5-4ced-8afa-5ef3405aad24-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"69046049-50b5-4ced-8afa-5ef3405aad24\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.470582 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/69046049-50b5-4ced-8afa-5ef3405aad24-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"69046049-50b5-4ced-8afa-5ef3405aad24\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.470612 4904 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"69046049-50b5-4ced-8afa-5ef3405aad24\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.470667 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/69046049-50b5-4ced-8afa-5ef3405aad24-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"69046049-50b5-4ced-8afa-5ef3405aad24\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.470746 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/69046049-50b5-4ced-8afa-5ef3405aad24-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"69046049-50b5-4ced-8afa-5ef3405aad24\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.470767 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59b2l\" (UniqueName: \"kubernetes.io/projected/69046049-50b5-4ced-8afa-5ef3405aad24-kube-api-access-59b2l\") pod \"rabbitmq-cell1-server-0\" (UID: \"69046049-50b5-4ced-8afa-5ef3405aad24\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.470807 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/69046049-50b5-4ced-8afa-5ef3405aad24-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"69046049-50b5-4ced-8afa-5ef3405aad24\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.470846 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/69046049-50b5-4ced-8afa-5ef3405aad24-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"69046049-50b5-4ced-8afa-5ef3405aad24\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.470871 4904 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"69046049-50b5-4ced-8afa-5ef3405aad24\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.470895 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/69046049-50b5-4ced-8afa-5ef3405aad24-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"69046049-50b5-4ced-8afa-5ef3405aad24\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.470960 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/69046049-50b5-4ced-8afa-5ef3405aad24-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"69046049-50b5-4ced-8afa-5ef3405aad24\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.470967 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/69046049-50b5-4ced-8afa-5ef3405aad24-rabbitmq-erlang-cookie\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"69046049-50b5-4ced-8afa-5ef3405aad24\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.471211 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/69046049-50b5-4ced-8afa-5ef3405aad24-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"69046049-50b5-4ced-8afa-5ef3405aad24\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.472100 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/69046049-50b5-4ced-8afa-5ef3405aad24-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"69046049-50b5-4ced-8afa-5ef3405aad24\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.472490 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/69046049-50b5-4ced-8afa-5ef3405aad24-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"69046049-50b5-4ced-8afa-5ef3405aad24\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.475248 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/69046049-50b5-4ced-8afa-5ef3405aad24-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"69046049-50b5-4ced-8afa-5ef3405aad24\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.475479 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/69046049-50b5-4ced-8afa-5ef3405aad24-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"69046049-50b5-4ced-8afa-5ef3405aad24\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.476471 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/69046049-50b5-4ced-8afa-5ef3405aad24-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"69046049-50b5-4ced-8afa-5ef3405aad24\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.477227 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/69046049-50b5-4ced-8afa-5ef3405aad24-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"69046049-50b5-4ced-8afa-5ef3405aad24\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.487637 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/69046049-50b5-4ced-8afa-5ef3405aad24-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"69046049-50b5-4ced-8afa-5ef3405aad24\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.489890 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59b2l\" (UniqueName: \"kubernetes.io/projected/69046049-50b5-4ced-8afa-5ef3405aad24-kube-api-access-59b2l\") pod \"rabbitmq-cell1-server-0\" (UID: \"69046049-50b5-4ced-8afa-5ef3405aad24\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.512115 4904 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"69046049-50b5-4ced-8afa-5ef3405aad24\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.663316 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.674262 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.696306 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ad24986-23a3-4010-8dcf-6778339691c8" path="/var/lib/kubelet/pods/0ad24986-23a3-4010-8dcf-6778339691c8/volumes" Dec 05 20:36:35 crc kubenswrapper[4904]: I1205 20:36:35.698430 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="921f0ddc-4d15-4bfa-9560-7a01eaa3461f" path="/var/lib/kubelet/pods/921f0ddc-4d15-4bfa-9560-7a01eaa3461f/volumes" Dec 05 20:36:36 crc kubenswrapper[4904]: I1205 20:36:36.115168 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 20:36:36 crc kubenswrapper[4904]: W1205 20:36:36.118165 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69046049_50b5_4ced_8afa_5ef3405aad24.slice/crio-e68b6c4df2684229975ce36e06e5a1f523bb8a858ebbd0b1183bb3b174926715 WatchSource:0}: Error finding container e68b6c4df2684229975ce36e06e5a1f523bb8a858ebbd0b1183bb3b174926715: Status 404 returned error can't find the container with id e68b6c4df2684229975ce36e06e5a1f523bb8a858ebbd0b1183bb3b174926715 Dec 05 20:36:36 crc kubenswrapper[4904]: I1205 20:36:36.197409 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"69046049-50b5-4ced-8afa-5ef3405aad24","Type":"ContainerStarted","Data":"e68b6c4df2684229975ce36e06e5a1f523bb8a858ebbd0b1183bb3b174926715"} Dec 05 20:36:36 crc kubenswrapper[4904]: I1205 20:36:36.198674 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1e45678d-877d-4c34-a8f5-913d31a8b79d","Type":"ContainerStarted","Data":"ff1991891e6facd42cd760820a1ff7eadbec210b3612945b3783f4c33d66cfaf"} Dec 05 20:36:38 crc kubenswrapper[4904]: I1205 20:36:38.226500 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1e45678d-877d-4c34-a8f5-913d31a8b79d","Type":"ContainerStarted","Data":"e9ad7314523353714ce37e64ce7de2c3306ca6348c73f9e898bf143cb3c59b89"} Dec 05 20:36:38 crc kubenswrapper[4904]: I1205 20:36:38.228250 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"69046049-50b5-4ced-8afa-5ef3405aad24","Type":"ContainerStarted","Data":"41c65f961697709c50b3d568be247edc22869de64234fc01ac4e219de2a02a2b"} Dec 05 20:36:42 crc kubenswrapper[4904]: I1205 20:36:42.105502 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8fd85ff87-vg5t6"] Dec 05 20:36:42 crc kubenswrapper[4904]: I1205 20:36:42.111911 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8fd85ff87-vg5t6" Dec 05 20:36:42 crc kubenswrapper[4904]: I1205 20:36:42.114266 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 05 20:36:42 crc kubenswrapper[4904]: I1205 20:36:42.119609 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8fd85ff87-vg5t6"] Dec 05 20:36:42 crc kubenswrapper[4904]: I1205 20:36:42.212215 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k695d\" (UniqueName: \"kubernetes.io/projected/2fe7e020-6a1e-4e4d-bef2-160791a1cdf2-kube-api-access-k695d\") pod \"dnsmasq-dns-8fd85ff87-vg5t6\" (UID: \"2fe7e020-6a1e-4e4d-bef2-160791a1cdf2\") " pod="openstack/dnsmasq-dns-8fd85ff87-vg5t6" Dec 05 20:36:42 crc kubenswrapper[4904]: I1205 20:36:42.212654 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fe7e020-6a1e-4e4d-bef2-160791a1cdf2-config\") pod \"dnsmasq-dns-8fd85ff87-vg5t6\" (UID: \"2fe7e020-6a1e-4e4d-bef2-160791a1cdf2\") " pod="openstack/dnsmasq-dns-8fd85ff87-vg5t6" Dec 05 20:36:42 crc kubenswrapper[4904]: I1205 20:36:42.212690 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2fe7e020-6a1e-4e4d-bef2-160791a1cdf2-dns-svc\") pod \"dnsmasq-dns-8fd85ff87-vg5t6\" (UID: \"2fe7e020-6a1e-4e4d-bef2-160791a1cdf2\") " pod="openstack/dnsmasq-dns-8fd85ff87-vg5t6" Dec 05 20:36:42 crc kubenswrapper[4904]: I1205 20:36:42.212726 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2fe7e020-6a1e-4e4d-bef2-160791a1cdf2-ovsdbserver-sb\") pod \"dnsmasq-dns-8fd85ff87-vg5t6\" (UID: \"2fe7e020-6a1e-4e4d-bef2-160791a1cdf2\") " pod="openstack/dnsmasq-dns-8fd85ff87-vg5t6" Dec 05 20:36:42 crc kubenswrapper[4904]: I1205 20:36:42.212749 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2fe7e020-6a1e-4e4d-bef2-160791a1cdf2-ovsdbserver-nb\") pod \"dnsmasq-dns-8fd85ff87-vg5t6\" (UID: \"2fe7e020-6a1e-4e4d-bef2-160791a1cdf2\") " pod="openstack/dnsmasq-dns-8fd85ff87-vg5t6" Dec 05 20:36:42 crc kubenswrapper[4904]: I1205 20:36:42.212788 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2fe7e020-6a1e-4e4d-bef2-160791a1cdf2-dns-swift-storage-0\") pod \"dnsmasq-dns-8fd85ff87-vg5t6\" (UID: \"2fe7e020-6a1e-4e4d-bef2-160791a1cdf2\") " pod="openstack/dnsmasq-dns-8fd85ff87-vg5t6" Dec 05 20:36:42 crc kubenswrapper[4904]: I1205 20:36:42.212807 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2fe7e020-6a1e-4e4d-bef2-160791a1cdf2-openstack-edpm-ipam\") pod \"dnsmasq-dns-8fd85ff87-vg5t6\" (UID: \"2fe7e020-6a1e-4e4d-bef2-160791a1cdf2\") " pod="openstack/dnsmasq-dns-8fd85ff87-vg5t6" Dec 05 20:36:42 crc kubenswrapper[4904]: I1205 20:36:42.314424 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2fe7e020-6a1e-4e4d-bef2-160791a1cdf2-ovsdbserver-nb\") pod \"dnsmasq-dns-8fd85ff87-vg5t6\" (UID: 
\"2fe7e020-6a1e-4e4d-bef2-160791a1cdf2\") " pod="openstack/dnsmasq-dns-8fd85ff87-vg5t6" Dec 05 20:36:42 crc kubenswrapper[4904]: I1205 20:36:42.314515 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2fe7e020-6a1e-4e4d-bef2-160791a1cdf2-dns-swift-storage-0\") pod \"dnsmasq-dns-8fd85ff87-vg5t6\" (UID: \"2fe7e020-6a1e-4e4d-bef2-160791a1cdf2\") " pod="openstack/dnsmasq-dns-8fd85ff87-vg5t6" Dec 05 20:36:42 crc kubenswrapper[4904]: I1205 20:36:42.314550 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2fe7e020-6a1e-4e4d-bef2-160791a1cdf2-openstack-edpm-ipam\") pod \"dnsmasq-dns-8fd85ff87-vg5t6\" (UID: \"2fe7e020-6a1e-4e4d-bef2-160791a1cdf2\") " pod="openstack/dnsmasq-dns-8fd85ff87-vg5t6" Dec 05 20:36:42 crc kubenswrapper[4904]: I1205 20:36:42.314598 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k695d\" (UniqueName: \"kubernetes.io/projected/2fe7e020-6a1e-4e4d-bef2-160791a1cdf2-kube-api-access-k695d\") pod \"dnsmasq-dns-8fd85ff87-vg5t6\" (UID: \"2fe7e020-6a1e-4e4d-bef2-160791a1cdf2\") " pod="openstack/dnsmasq-dns-8fd85ff87-vg5t6" Dec 05 20:36:42 crc kubenswrapper[4904]: I1205 20:36:42.314726 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fe7e020-6a1e-4e4d-bef2-160791a1cdf2-config\") pod \"dnsmasq-dns-8fd85ff87-vg5t6\" (UID: \"2fe7e020-6a1e-4e4d-bef2-160791a1cdf2\") " pod="openstack/dnsmasq-dns-8fd85ff87-vg5t6" Dec 05 20:36:42 crc kubenswrapper[4904]: I1205 20:36:42.314773 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2fe7e020-6a1e-4e4d-bef2-160791a1cdf2-dns-svc\") pod \"dnsmasq-dns-8fd85ff87-vg5t6\" (UID: \"2fe7e020-6a1e-4e4d-bef2-160791a1cdf2\") " pod="openstack/dnsmasq-dns-8fd85ff87-vg5t6" Dec 05 20:36:42 crc kubenswrapper[4904]: I1205 20:36:42.314838 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2fe7e020-6a1e-4e4d-bef2-160791a1cdf2-ovsdbserver-sb\") pod \"dnsmasq-dns-8fd85ff87-vg5t6\" (UID: \"2fe7e020-6a1e-4e4d-bef2-160791a1cdf2\") " pod="openstack/dnsmasq-dns-8fd85ff87-vg5t6" Dec 05 20:36:42 crc kubenswrapper[4904]: I1205 20:36:42.315823 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2fe7e020-6a1e-4e4d-bef2-160791a1cdf2-ovsdbserver-sb\") pod \"dnsmasq-dns-8fd85ff87-vg5t6\" (UID: \"2fe7e020-6a1e-4e4d-bef2-160791a1cdf2\") " pod="openstack/dnsmasq-dns-8fd85ff87-vg5t6" Dec 05 20:36:42 crc kubenswrapper[4904]: I1205 20:36:42.316500 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2fe7e020-6a1e-4e4d-bef2-160791a1cdf2-ovsdbserver-nb\") pod \"dnsmasq-dns-8fd85ff87-vg5t6\" (UID: \"2fe7e020-6a1e-4e4d-bef2-160791a1cdf2\") " pod="openstack/dnsmasq-dns-8fd85ff87-vg5t6" Dec 05 20:36:42 crc kubenswrapper[4904]: I1205 20:36:42.317131 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2fe7e020-6a1e-4e4d-bef2-160791a1cdf2-dns-swift-storage-0\") pod \"dnsmasq-dns-8fd85ff87-vg5t6\" (UID: \"2fe7e020-6a1e-4e4d-bef2-160791a1cdf2\") " 
pod="openstack/dnsmasq-dns-8fd85ff87-vg5t6" Dec 05 20:36:42 crc kubenswrapper[4904]: I1205 20:36:42.317706 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2fe7e020-6a1e-4e4d-bef2-160791a1cdf2-openstack-edpm-ipam\") pod \"dnsmasq-dns-8fd85ff87-vg5t6\" (UID: \"2fe7e020-6a1e-4e4d-bef2-160791a1cdf2\") " pod="openstack/dnsmasq-dns-8fd85ff87-vg5t6" Dec 05 20:36:42 crc kubenswrapper[4904]: I1205 20:36:42.318631 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fe7e020-6a1e-4e4d-bef2-160791a1cdf2-config\") pod \"dnsmasq-dns-8fd85ff87-vg5t6\" (UID: \"2fe7e020-6a1e-4e4d-bef2-160791a1cdf2\") " pod="openstack/dnsmasq-dns-8fd85ff87-vg5t6" Dec 05 20:36:42 crc kubenswrapper[4904]: I1205 20:36:42.319259 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2fe7e020-6a1e-4e4d-bef2-160791a1cdf2-dns-svc\") pod \"dnsmasq-dns-8fd85ff87-vg5t6\" (UID: \"2fe7e020-6a1e-4e4d-bef2-160791a1cdf2\") " pod="openstack/dnsmasq-dns-8fd85ff87-vg5t6" Dec 05 20:36:42 crc kubenswrapper[4904]: I1205 20:36:42.350261 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k695d\" (UniqueName: \"kubernetes.io/projected/2fe7e020-6a1e-4e4d-bef2-160791a1cdf2-kube-api-access-k695d\") pod \"dnsmasq-dns-8fd85ff87-vg5t6\" (UID: \"2fe7e020-6a1e-4e4d-bef2-160791a1cdf2\") " pod="openstack/dnsmasq-dns-8fd85ff87-vg5t6" Dec 05 20:36:42 crc kubenswrapper[4904]: I1205 20:36:42.496903 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8fd85ff87-vg5t6" Dec 05 20:36:42 crc kubenswrapper[4904]: W1205 20:36:42.988849 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fe7e020_6a1e_4e4d_bef2_160791a1cdf2.slice/crio-08fad0b488fcd66eecf4d7945ba0f434bc04276ca9272a87105b27f101678827 WatchSource:0}: Error finding container 08fad0b488fcd66eecf4d7945ba0f434bc04276ca9272a87105b27f101678827: Status 404 returned error can't find the container with id 08fad0b488fcd66eecf4d7945ba0f434bc04276ca9272a87105b27f101678827 Dec 05 20:36:42 crc kubenswrapper[4904]: I1205 20:36:42.989038 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8fd85ff87-vg5t6"] Dec 05 20:36:43 crc kubenswrapper[4904]: I1205 20:36:43.276445 4904 generic.go:334] "Generic (PLEG): container finished" podID="2fe7e020-6a1e-4e4d-bef2-160791a1cdf2" containerID="8c922c3ff26cd903060f421124cf6ff82f582e88deeb9965bb1e06cdfa91f2c7" exitCode=0 Dec 05 20:36:43 crc kubenswrapper[4904]: I1205 20:36:43.276519 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8fd85ff87-vg5t6" event={"ID":"2fe7e020-6a1e-4e4d-bef2-160791a1cdf2","Type":"ContainerDied","Data":"8c922c3ff26cd903060f421124cf6ff82f582e88deeb9965bb1e06cdfa91f2c7"} Dec 05 20:36:43 crc kubenswrapper[4904]: I1205 20:36:43.276771 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8fd85ff87-vg5t6" event={"ID":"2fe7e020-6a1e-4e4d-bef2-160791a1cdf2","Type":"ContainerStarted","Data":"08fad0b488fcd66eecf4d7945ba0f434bc04276ca9272a87105b27f101678827"} Dec 05 20:36:44 crc kubenswrapper[4904]: I1205 20:36:44.290811 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8fd85ff87-vg5t6" 
event={"ID":"2fe7e020-6a1e-4e4d-bef2-160791a1cdf2","Type":"ContainerStarted","Data":"f3987befa90bcc576750edef5398b9307827d4866f5f2189d9f70223c9d4abbf"} Dec 05 20:36:44 crc kubenswrapper[4904]: I1205 20:36:44.292449 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8fd85ff87-vg5t6" Dec 05 20:36:44 crc kubenswrapper[4904]: I1205 20:36:44.325597 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8fd85ff87-vg5t6" podStartSLOduration=2.32557428 podStartE2EDuration="2.32557428s" podCreationTimestamp="2025-12-05 20:36:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:36:44.325397465 +0000 UTC m=+1503.136613584" watchObservedRunningTime="2025-12-05 20:36:44.32557428 +0000 UTC m=+1503.136790409" Dec 05 20:36:52 crc kubenswrapper[4904]: I1205 20:36:52.499817 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8fd85ff87-vg5t6" Dec 05 20:36:52 crc kubenswrapper[4904]: I1205 20:36:52.595126 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7774fc8c79-ms4p7"] Dec 05 20:36:52 crc kubenswrapper[4904]: I1205 20:36:52.595402 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7774fc8c79-ms4p7" podUID="a252ec50-71cf-4e22-b84a-8c247b695354" containerName="dnsmasq-dns" containerID="cri-o://447a5335e06188527c763f74532686b32d439ef0e380513c3b2d6af75e0bf8d1" gracePeriod=10 Dec 05 20:36:52 crc kubenswrapper[4904]: I1205 20:36:52.715250 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b7c6b4c7-wk76m"] Dec 05 20:36:52 crc kubenswrapper[4904]: I1205 20:36:52.716846 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b7c6b4c7-wk76m" Dec 05 20:36:52 crc kubenswrapper[4904]: I1205 20:36:52.734587 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b7c6b4c7-wk76m"] Dec 05 20:36:52 crc kubenswrapper[4904]: I1205 20:36:52.870881 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zktfb\" (UniqueName: \"kubernetes.io/projected/0aa075ff-9799-456e-b08e-5146d8e11c06-kube-api-access-zktfb\") pod \"dnsmasq-dns-b7c6b4c7-wk76m\" (UID: \"0aa075ff-9799-456e-b08e-5146d8e11c06\") " pod="openstack/dnsmasq-dns-b7c6b4c7-wk76m" Dec 05 20:36:52 crc kubenswrapper[4904]: I1205 20:36:52.871309 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aa075ff-9799-456e-b08e-5146d8e11c06-config\") pod \"dnsmasq-dns-b7c6b4c7-wk76m\" (UID: \"0aa075ff-9799-456e-b08e-5146d8e11c06\") " pod="openstack/dnsmasq-dns-b7c6b4c7-wk76m" Dec 05 20:36:52 crc kubenswrapper[4904]: I1205 20:36:52.871388 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0aa075ff-9799-456e-b08e-5146d8e11c06-openstack-edpm-ipam\") pod \"dnsmasq-dns-b7c6b4c7-wk76m\" (UID: \"0aa075ff-9799-456e-b08e-5146d8e11c06\") " pod="openstack/dnsmasq-dns-b7c6b4c7-wk76m" Dec 05 20:36:52 crc kubenswrapper[4904]: I1205 20:36:52.871571 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0aa075ff-9799-456e-b08e-5146d8e11c06-ovsdbserver-nb\") pod \"dnsmasq-dns-b7c6b4c7-wk76m\" (UID: \"0aa075ff-9799-456e-b08e-5146d8e11c06\") " pod="openstack/dnsmasq-dns-b7c6b4c7-wk76m" Dec 05 20:36:52 crc kubenswrapper[4904]: I1205 20:36:52.871690 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0aa075ff-9799-456e-b08e-5146d8e11c06-dns-swift-storage-0\") pod \"dnsmasq-dns-b7c6b4c7-wk76m\" (UID: \"0aa075ff-9799-456e-b08e-5146d8e11c06\") " pod="openstack/dnsmasq-dns-b7c6b4c7-wk76m" Dec 05 20:36:52 crc kubenswrapper[4904]: I1205 20:36:52.871774 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0aa075ff-9799-456e-b08e-5146d8e11c06-ovsdbserver-sb\") pod \"dnsmasq-dns-b7c6b4c7-wk76m\" (UID: \"0aa075ff-9799-456e-b08e-5146d8e11c06\") " pod="openstack/dnsmasq-dns-b7c6b4c7-wk76m" Dec 05 20:36:52 crc kubenswrapper[4904]: I1205 20:36:52.871921 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0aa075ff-9799-456e-b08e-5146d8e11c06-dns-svc\") pod \"dnsmasq-dns-b7c6b4c7-wk76m\" (UID: \"0aa075ff-9799-456e-b08e-5146d8e11c06\") " pod="openstack/dnsmasq-dns-b7c6b4c7-wk76m" Dec 05 20:36:52 crc kubenswrapper[4904]: I1205 20:36:52.974503 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0aa075ff-9799-456e-b08e-5146d8e11c06-ovsdbserver-sb\") pod \"dnsmasq-dns-b7c6b4c7-wk76m\" (UID: \"0aa075ff-9799-456e-b08e-5146d8e11c06\") " pod="openstack/dnsmasq-dns-b7c6b4c7-wk76m" Dec 05 20:36:52 crc kubenswrapper[4904]: I1205 20:36:52.974574 4904 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0aa075ff-9799-456e-b08e-5146d8e11c06-dns-svc\") pod \"dnsmasq-dns-b7c6b4c7-wk76m\" (UID: \"0aa075ff-9799-456e-b08e-5146d8e11c06\") " pod="openstack/dnsmasq-dns-b7c6b4c7-wk76m" Dec 05 20:36:52 crc kubenswrapper[4904]: I1205 20:36:52.974607 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zktfb\" (UniqueName: \"kubernetes.io/projected/0aa075ff-9799-456e-b08e-5146d8e11c06-kube-api-access-zktfb\") pod \"dnsmasq-dns-b7c6b4c7-wk76m\" (UID: \"0aa075ff-9799-456e-b08e-5146d8e11c06\") " pod="openstack/dnsmasq-dns-b7c6b4c7-wk76m" Dec 05 20:36:52 crc kubenswrapper[4904]: I1205 20:36:52.974660 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aa075ff-9799-456e-b08e-5146d8e11c06-config\") pod \"dnsmasq-dns-b7c6b4c7-wk76m\" (UID: \"0aa075ff-9799-456e-b08e-5146d8e11c06\") " pod="openstack/dnsmasq-dns-b7c6b4c7-wk76m" Dec 05 20:36:52 crc kubenswrapper[4904]: I1205 20:36:52.974710 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0aa075ff-9799-456e-b08e-5146d8e11c06-openstack-edpm-ipam\") pod \"dnsmasq-dns-b7c6b4c7-wk76m\" (UID: \"0aa075ff-9799-456e-b08e-5146d8e11c06\") " pod="openstack/dnsmasq-dns-b7c6b4c7-wk76m" Dec 05 20:36:52 crc kubenswrapper[4904]: I1205 20:36:52.974905 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0aa075ff-9799-456e-b08e-5146d8e11c06-ovsdbserver-nb\") pod \"dnsmasq-dns-b7c6b4c7-wk76m\" (UID: \"0aa075ff-9799-456e-b08e-5146d8e11c06\") " pod="openstack/dnsmasq-dns-b7c6b4c7-wk76m" Dec 05 20:36:52 crc kubenswrapper[4904]: I1205 20:36:52.974931 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0aa075ff-9799-456e-b08e-5146d8e11c06-dns-swift-storage-0\") pod \"dnsmasq-dns-b7c6b4c7-wk76m\" (UID: \"0aa075ff-9799-456e-b08e-5146d8e11c06\") " pod="openstack/dnsmasq-dns-b7c6b4c7-wk76m" Dec 05 20:36:52 crc kubenswrapper[4904]: I1205 20:36:52.975734 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0aa075ff-9799-456e-b08e-5146d8e11c06-dns-swift-storage-0\") pod \"dnsmasq-dns-b7c6b4c7-wk76m\" (UID: \"0aa075ff-9799-456e-b08e-5146d8e11c06\") " pod="openstack/dnsmasq-dns-b7c6b4c7-wk76m" Dec 05 20:36:52 crc kubenswrapper[4904]: I1205 20:36:52.976170 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0aa075ff-9799-456e-b08e-5146d8e11c06-ovsdbserver-sb\") pod \"dnsmasq-dns-b7c6b4c7-wk76m\" (UID: \"0aa075ff-9799-456e-b08e-5146d8e11c06\") " pod="openstack/dnsmasq-dns-b7c6b4c7-wk76m" Dec 05 20:36:52 crc kubenswrapper[4904]: I1205 20:36:52.976204 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0aa075ff-9799-456e-b08e-5146d8e11c06-dns-svc\") pod \"dnsmasq-dns-b7c6b4c7-wk76m\" (UID: \"0aa075ff-9799-456e-b08e-5146d8e11c06\") " pod="openstack/dnsmasq-dns-b7c6b4c7-wk76m" Dec 05 20:36:52 crc kubenswrapper[4904]: I1205 20:36:52.976343 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/configmap/0aa075ff-9799-456e-b08e-5146d8e11c06-openstack-edpm-ipam\") pod \"dnsmasq-dns-b7c6b4c7-wk76m\" (UID: \"0aa075ff-9799-456e-b08e-5146d8e11c06\") " pod="openstack/dnsmasq-dns-b7c6b4c7-wk76m" Dec 05 20:36:52 crc kubenswrapper[4904]: I1205 20:36:52.976722 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0aa075ff-9799-456e-b08e-5146d8e11c06-ovsdbserver-nb\") pod \"dnsmasq-dns-b7c6b4c7-wk76m\" (UID: \"0aa075ff-9799-456e-b08e-5146d8e11c06\") " pod="openstack/dnsmasq-dns-b7c6b4c7-wk76m" Dec 05 20:36:52 crc kubenswrapper[4904]: I1205 20:36:52.976924 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aa075ff-9799-456e-b08e-5146d8e11c06-config\") pod \"dnsmasq-dns-b7c6b4c7-wk76m\" (UID: \"0aa075ff-9799-456e-b08e-5146d8e11c06\") " pod="openstack/dnsmasq-dns-b7c6b4c7-wk76m" Dec 05 20:36:52 crc kubenswrapper[4904]: I1205 20:36:52.995644 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zktfb\" (UniqueName: \"kubernetes.io/projected/0aa075ff-9799-456e-b08e-5146d8e11c06-kube-api-access-zktfb\") pod \"dnsmasq-dns-b7c6b4c7-wk76m\" (UID: \"0aa075ff-9799-456e-b08e-5146d8e11c06\") " pod="openstack/dnsmasq-dns-b7c6b4c7-wk76m" Dec 05 20:36:53 crc kubenswrapper[4904]: I1205 20:36:53.081720 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b7c6b4c7-wk76m" Dec 05 20:36:53 crc kubenswrapper[4904]: I1205 20:36:53.375582 4904 generic.go:334] "Generic (PLEG): container finished" podID="a252ec50-71cf-4e22-b84a-8c247b695354" containerID="447a5335e06188527c763f74532686b32d439ef0e380513c3b2d6af75e0bf8d1" exitCode=0 Dec 05 20:36:53 crc kubenswrapper[4904]: I1205 20:36:53.375646 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7774fc8c79-ms4p7" event={"ID":"a252ec50-71cf-4e22-b84a-8c247b695354","Type":"ContainerDied","Data":"447a5335e06188527c763f74532686b32d439ef0e380513c3b2d6af75e0bf8d1"} Dec 05 20:36:53 crc kubenswrapper[4904]: I1205 20:36:53.601672 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b7c6b4c7-wk76m"] Dec 05 20:36:53 crc kubenswrapper[4904]: W1205 20:36:53.606337 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0aa075ff_9799_456e_b08e_5146d8e11c06.slice/crio-c9fb0c1570326a1ac22934bcba36520dc2b1fa168509a16cc8dd40257d1e6418 WatchSource:0}: Error finding container c9fb0c1570326a1ac22934bcba36520dc2b1fa168509a16cc8dd40257d1e6418: Status 404 returned error can't find the container with id c9fb0c1570326a1ac22934bcba36520dc2b1fa168509a16cc8dd40257d1e6418 Dec 05 20:36:54 crc kubenswrapper[4904]: I1205 20:36:54.146678 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7774fc8c79-ms4p7" Dec 05 20:36:54 crc kubenswrapper[4904]: I1205 20:36:54.207102 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a252ec50-71cf-4e22-b84a-8c247b695354-dns-swift-storage-0\") pod \"a252ec50-71cf-4e22-b84a-8c247b695354\" (UID: \"a252ec50-71cf-4e22-b84a-8c247b695354\") " Dec 05 20:36:54 crc kubenswrapper[4904]: I1205 20:36:54.207483 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a252ec50-71cf-4e22-b84a-8c247b695354-ovsdbserver-sb\") pod \"a252ec50-71cf-4e22-b84a-8c247b695354\" (UID: \"a252ec50-71cf-4e22-b84a-8c247b695354\") " Dec 05 20:36:54 crc kubenswrapper[4904]: I1205 20:36:54.207618 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vln5\" (UniqueName: \"kubernetes.io/projected/a252ec50-71cf-4e22-b84a-8c247b695354-kube-api-access-5vln5\") pod \"a252ec50-71cf-4e22-b84a-8c247b695354\" (UID: \"a252ec50-71cf-4e22-b84a-8c247b695354\") " Dec 05 20:36:54 crc kubenswrapper[4904]: I1205 20:36:54.207644 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a252ec50-71cf-4e22-b84a-8c247b695354-ovsdbserver-nb\") pod \"a252ec50-71cf-4e22-b84a-8c247b695354\" (UID: \"a252ec50-71cf-4e22-b84a-8c247b695354\") " Dec 05 20:36:54 crc kubenswrapper[4904]: I1205 20:36:54.207822 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a252ec50-71cf-4e22-b84a-8c247b695354-config\") pod \"a252ec50-71cf-4e22-b84a-8c247b695354\" (UID: \"a252ec50-71cf-4e22-b84a-8c247b695354\") " Dec 05 20:36:54 crc kubenswrapper[4904]: I1205 20:36:54.207866 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a252ec50-71cf-4e22-b84a-8c247b695354-dns-svc\") pod \"a252ec50-71cf-4e22-b84a-8c247b695354\" (UID: \"a252ec50-71cf-4e22-b84a-8c247b695354\") " Dec 05 20:36:54 crc kubenswrapper[4904]: I1205 20:36:54.222309 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a252ec50-71cf-4e22-b84a-8c247b695354-kube-api-access-5vln5" (OuterVolumeSpecName: "kube-api-access-5vln5") pod "a252ec50-71cf-4e22-b84a-8c247b695354" (UID: "a252ec50-71cf-4e22-b84a-8c247b695354"). InnerVolumeSpecName "kube-api-access-5vln5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:36:54 crc kubenswrapper[4904]: I1205 20:36:54.269828 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a252ec50-71cf-4e22-b84a-8c247b695354-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a252ec50-71cf-4e22-b84a-8c247b695354" (UID: "a252ec50-71cf-4e22-b84a-8c247b695354"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:36:54 crc kubenswrapper[4904]: I1205 20:36:54.275692 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a252ec50-71cf-4e22-b84a-8c247b695354-config" (OuterVolumeSpecName: "config") pod "a252ec50-71cf-4e22-b84a-8c247b695354" (UID: "a252ec50-71cf-4e22-b84a-8c247b695354"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:36:54 crc kubenswrapper[4904]: I1205 20:36:54.299939 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a252ec50-71cf-4e22-b84a-8c247b695354-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a252ec50-71cf-4e22-b84a-8c247b695354" (UID: "a252ec50-71cf-4e22-b84a-8c247b695354"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:36:54 crc kubenswrapper[4904]: I1205 20:36:54.302516 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a252ec50-71cf-4e22-b84a-8c247b695354-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a252ec50-71cf-4e22-b84a-8c247b695354" (UID: "a252ec50-71cf-4e22-b84a-8c247b695354"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:36:54 crc kubenswrapper[4904]: I1205 20:36:54.310755 4904 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a252ec50-71cf-4e22-b84a-8c247b695354-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:54 crc kubenswrapper[4904]: I1205 20:36:54.310797 4904 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a252ec50-71cf-4e22-b84a-8c247b695354-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:54 crc kubenswrapper[4904]: I1205 20:36:54.310808 4904 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a252ec50-71cf-4e22-b84a-8c247b695354-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:54 crc kubenswrapper[4904]: I1205 20:36:54.310817 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vln5\" (UniqueName: \"kubernetes.io/projected/a252ec50-71cf-4e22-b84a-8c247b695354-kube-api-access-5vln5\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:54 crc kubenswrapper[4904]: I1205 20:36:54.310827 4904 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a252ec50-71cf-4e22-b84a-8c247b695354-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:54 crc kubenswrapper[4904]: I1205 20:36:54.310967 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a252ec50-71cf-4e22-b84a-8c247b695354-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a252ec50-71cf-4e22-b84a-8c247b695354" (UID: "a252ec50-71cf-4e22-b84a-8c247b695354"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:36:54 crc kubenswrapper[4904]: I1205 20:36:54.385425 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7774fc8c79-ms4p7" Dec 05 20:36:54 crc kubenswrapper[4904]: I1205 20:36:54.385440 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7774fc8c79-ms4p7" event={"ID":"a252ec50-71cf-4e22-b84a-8c247b695354","Type":"ContainerDied","Data":"dc19ec09fbfd3ce1c468eae52471ba79dcb5f5ab12fdfb8c014905790306487c"} Dec 05 20:36:54 crc kubenswrapper[4904]: I1205 20:36:54.385512 4904 scope.go:117] "RemoveContainer" containerID="447a5335e06188527c763f74532686b32d439ef0e380513c3b2d6af75e0bf8d1" Dec 05 20:36:54 crc kubenswrapper[4904]: I1205 20:36:54.387891 4904 generic.go:334] "Generic (PLEG): container finished" podID="0aa075ff-9799-456e-b08e-5146d8e11c06" containerID="e712af119f985562247e8766ec6f8017143832e2f261a9e778192ebe24380e0e" exitCode=0 Dec 05 20:36:54 crc kubenswrapper[4904]: I1205 20:36:54.387920 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b7c6b4c7-wk76m" event={"ID":"0aa075ff-9799-456e-b08e-5146d8e11c06","Type":"ContainerDied","Data":"e712af119f985562247e8766ec6f8017143832e2f261a9e778192ebe24380e0e"} Dec 05 20:36:54 crc kubenswrapper[4904]: I1205 20:36:54.387938 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b7c6b4c7-wk76m" event={"ID":"0aa075ff-9799-456e-b08e-5146d8e11c06","Type":"ContainerStarted","Data":"c9fb0c1570326a1ac22934bcba36520dc2b1fa168509a16cc8dd40257d1e6418"} Dec 05 20:36:54 crc kubenswrapper[4904]: I1205 20:36:54.410781 4904 scope.go:117] "RemoveContainer" containerID="621247310fcf03b400878d476b417299fd649660f3a7d2ab3202740dad2e9dc2" Dec 05 20:36:54 crc kubenswrapper[4904]: I1205 20:36:54.412389 4904 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a252ec50-71cf-4e22-b84a-8c247b695354-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:54 crc kubenswrapper[4904]: I1205 20:36:54.439168 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7774fc8c79-ms4p7"] Dec 05 20:36:54 crc kubenswrapper[4904]: I1205 20:36:54.447955 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7774fc8c79-ms4p7"] Dec 05 20:36:55 crc kubenswrapper[4904]: I1205 20:36:55.401041 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b7c6b4c7-wk76m" event={"ID":"0aa075ff-9799-456e-b08e-5146d8e11c06","Type":"ContainerStarted","Data":"e4b2a15727efed17b3ab39e3df433919fda7601f1b75d421ad1fe335da3d6e95"} Dec 05 20:36:55 crc kubenswrapper[4904]: I1205 20:36:55.401417 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b7c6b4c7-wk76m" Dec 05 20:36:55 crc kubenswrapper[4904]: I1205 20:36:55.425999 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b7c6b4c7-wk76m" podStartSLOduration=3.425981411 podStartE2EDuration="3.425981411s" podCreationTimestamp="2025-12-05 20:36:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:36:55.423873733 +0000 UTC m=+1514.235089852" watchObservedRunningTime="2025-12-05 20:36:55.425981411 +0000 UTC m=+1514.237197510" Dec 05 20:36:55 crc kubenswrapper[4904]: I1205 20:36:55.694226 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a252ec50-71cf-4e22-b84a-8c247b695354" path="/var/lib/kubelet/pods/a252ec50-71cf-4e22-b84a-8c247b695354/volumes" Dec 05 20:37:03 crc kubenswrapper[4904]: I1205 
20:37:03.083593 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b7c6b4c7-wk76m" Dec 05 20:37:03 crc kubenswrapper[4904]: I1205 20:37:03.169596 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8fd85ff87-vg5t6"] Dec 05 20:37:03 crc kubenswrapper[4904]: I1205 20:37:03.169888 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8fd85ff87-vg5t6" podUID="2fe7e020-6a1e-4e4d-bef2-160791a1cdf2" containerName="dnsmasq-dns" containerID="cri-o://f3987befa90bcc576750edef5398b9307827d4866f5f2189d9f70223c9d4abbf" gracePeriod=10 Dec 05 20:37:03 crc kubenswrapper[4904]: I1205 20:37:03.511583 4904 generic.go:334] "Generic (PLEG): container finished" podID="2fe7e020-6a1e-4e4d-bef2-160791a1cdf2" containerID="f3987befa90bcc576750edef5398b9307827d4866f5f2189d9f70223c9d4abbf" exitCode=0 Dec 05 20:37:03 crc kubenswrapper[4904]: I1205 20:37:03.511699 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8fd85ff87-vg5t6" event={"ID":"2fe7e020-6a1e-4e4d-bef2-160791a1cdf2","Type":"ContainerDied","Data":"f3987befa90bcc576750edef5398b9307827d4866f5f2189d9f70223c9d4abbf"} Dec 05 20:37:03 crc kubenswrapper[4904]: I1205 20:37:03.687645 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8fd85ff87-vg5t6" Dec 05 20:37:03 crc kubenswrapper[4904]: I1205 20:37:03.718455 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2fe7e020-6a1e-4e4d-bef2-160791a1cdf2-ovsdbserver-sb\") pod \"2fe7e020-6a1e-4e4d-bef2-160791a1cdf2\" (UID: \"2fe7e020-6a1e-4e4d-bef2-160791a1cdf2\") " Dec 05 20:37:03 crc kubenswrapper[4904]: I1205 20:37:03.718513 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2fe7e020-6a1e-4e4d-bef2-160791a1cdf2-ovsdbserver-nb\") pod \"2fe7e020-6a1e-4e4d-bef2-160791a1cdf2\" (UID: \"2fe7e020-6a1e-4e4d-bef2-160791a1cdf2\") " Dec 05 20:37:03 crc kubenswrapper[4904]: I1205 20:37:03.784102 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fe7e020-6a1e-4e4d-bef2-160791a1cdf2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2fe7e020-6a1e-4e4d-bef2-160791a1cdf2" (UID: "2fe7e020-6a1e-4e4d-bef2-160791a1cdf2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:37:03 crc kubenswrapper[4904]: I1205 20:37:03.787140 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fe7e020-6a1e-4e4d-bef2-160791a1cdf2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2fe7e020-6a1e-4e4d-bef2-160791a1cdf2" (UID: "2fe7e020-6a1e-4e4d-bef2-160791a1cdf2"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:37:03 crc kubenswrapper[4904]: I1205 20:37:03.820207 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2fe7e020-6a1e-4e4d-bef2-160791a1cdf2-dns-swift-storage-0\") pod \"2fe7e020-6a1e-4e4d-bef2-160791a1cdf2\" (UID: \"2fe7e020-6a1e-4e4d-bef2-160791a1cdf2\") " Dec 05 20:37:03 crc kubenswrapper[4904]: I1205 20:37:03.820274 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2fe7e020-6a1e-4e4d-bef2-160791a1cdf2-openstack-edpm-ipam\") pod \"2fe7e020-6a1e-4e4d-bef2-160791a1cdf2\" (UID: \"2fe7e020-6a1e-4e4d-bef2-160791a1cdf2\") " Dec 05 20:37:03 crc kubenswrapper[4904]: I1205 20:37:03.820307 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fe7e020-6a1e-4e4d-bef2-160791a1cdf2-config\") pod \"2fe7e020-6a1e-4e4d-bef2-160791a1cdf2\" (UID: \"2fe7e020-6a1e-4e4d-bef2-160791a1cdf2\") " Dec 05 20:37:03 crc kubenswrapper[4904]: I1205 20:37:03.820396 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k695d\" (UniqueName: \"kubernetes.io/projected/2fe7e020-6a1e-4e4d-bef2-160791a1cdf2-kube-api-access-k695d\") pod \"2fe7e020-6a1e-4e4d-bef2-160791a1cdf2\" (UID: \"2fe7e020-6a1e-4e4d-bef2-160791a1cdf2\") " Dec 05 20:37:03 crc kubenswrapper[4904]: I1205 20:37:03.820485 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2fe7e020-6a1e-4e4d-bef2-160791a1cdf2-dns-svc\") pod \"2fe7e020-6a1e-4e4d-bef2-160791a1cdf2\" (UID: \"2fe7e020-6a1e-4e4d-bef2-160791a1cdf2\") " Dec 05 20:37:03 crc kubenswrapper[4904]: I1205 20:37:03.821032 4904 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2fe7e020-6a1e-4e4d-bef2-160791a1cdf2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:03 crc kubenswrapper[4904]: I1205 20:37:03.821073 4904 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2fe7e020-6a1e-4e4d-bef2-160791a1cdf2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:03 crc kubenswrapper[4904]: I1205 20:37:03.825586 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fe7e020-6a1e-4e4d-bef2-160791a1cdf2-kube-api-access-k695d" (OuterVolumeSpecName: "kube-api-access-k695d") pod "2fe7e020-6a1e-4e4d-bef2-160791a1cdf2" (UID: "2fe7e020-6a1e-4e4d-bef2-160791a1cdf2"). InnerVolumeSpecName "kube-api-access-k695d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:37:03 crc kubenswrapper[4904]: I1205 20:37:03.877604 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fe7e020-6a1e-4e4d-bef2-160791a1cdf2-config" (OuterVolumeSpecName: "config") pod "2fe7e020-6a1e-4e4d-bef2-160791a1cdf2" (UID: "2fe7e020-6a1e-4e4d-bef2-160791a1cdf2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:37:03 crc kubenswrapper[4904]: I1205 20:37:03.890115 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fe7e020-6a1e-4e4d-bef2-160791a1cdf2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2fe7e020-6a1e-4e4d-bef2-160791a1cdf2" (UID: "2fe7e020-6a1e-4e4d-bef2-160791a1cdf2"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:37:03 crc kubenswrapper[4904]: I1205 20:37:03.895027 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fe7e020-6a1e-4e4d-bef2-160791a1cdf2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2fe7e020-6a1e-4e4d-bef2-160791a1cdf2" (UID: "2fe7e020-6a1e-4e4d-bef2-160791a1cdf2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:37:03 crc kubenswrapper[4904]: I1205 20:37:03.897545 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fe7e020-6a1e-4e4d-bef2-160791a1cdf2-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "2fe7e020-6a1e-4e4d-bef2-160791a1cdf2" (UID: "2fe7e020-6a1e-4e4d-bef2-160791a1cdf2"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:37:03 crc kubenswrapper[4904]: I1205 20:37:03.922805 4904 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2fe7e020-6a1e-4e4d-bef2-160791a1cdf2-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:03 crc kubenswrapper[4904]: I1205 20:37:03.922843 4904 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fe7e020-6a1e-4e4d-bef2-160791a1cdf2-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:03 crc kubenswrapper[4904]: I1205 20:37:03.922854 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k695d\" (UniqueName: \"kubernetes.io/projected/2fe7e020-6a1e-4e4d-bef2-160791a1cdf2-kube-api-access-k695d\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:03 crc kubenswrapper[4904]: I1205 20:37:03.922864 4904 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2fe7e020-6a1e-4e4d-bef2-160791a1cdf2-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:03 crc kubenswrapper[4904]: I1205 20:37:03.922873 4904 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2fe7e020-6a1e-4e4d-bef2-160791a1cdf2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:04 crc kubenswrapper[4904]: I1205 20:37:04.524862 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8fd85ff87-vg5t6" event={"ID":"2fe7e020-6a1e-4e4d-bef2-160791a1cdf2","Type":"ContainerDied","Data":"08fad0b488fcd66eecf4d7945ba0f434bc04276ca9272a87105b27f101678827"} Dec 05 20:37:04 crc kubenswrapper[4904]: I1205 20:37:04.524963 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8fd85ff87-vg5t6" Dec 05 20:37:04 crc kubenswrapper[4904]: I1205 20:37:04.525218 4904 scope.go:117] "RemoveContainer" containerID="f3987befa90bcc576750edef5398b9307827d4866f5f2189d9f70223c9d4abbf" Dec 05 20:37:04 crc kubenswrapper[4904]: I1205 20:37:04.551114 4904 scope.go:117] "RemoveContainer" containerID="8c922c3ff26cd903060f421124cf6ff82f582e88deeb9965bb1e06cdfa91f2c7" Dec 05 20:37:04 crc kubenswrapper[4904]: I1205 20:37:04.571172 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8fd85ff87-vg5t6"] Dec 05 20:37:04 crc kubenswrapper[4904]: I1205 20:37:04.580869 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8fd85ff87-vg5t6"] Dec 05 20:37:05 crc kubenswrapper[4904]: I1205 20:37:05.700733 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fe7e020-6a1e-4e4d-bef2-160791a1cdf2" path="/var/lib/kubelet/pods/2fe7e020-6a1e-4e4d-bef2-160791a1cdf2/volumes" Dec 05 20:37:10 crc kubenswrapper[4904]: I1205 20:37:10.596585 4904 generic.go:334] "Generic (PLEG): container finished" podID="1e45678d-877d-4c34-a8f5-913d31a8b79d" containerID="e9ad7314523353714ce37e64ce7de2c3306ca6348c73f9e898bf143cb3c59b89" exitCode=0 Dec 05 20:37:10 crc kubenswrapper[4904]: I1205 20:37:10.596661 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1e45678d-877d-4c34-a8f5-913d31a8b79d","Type":"ContainerDied","Data":"e9ad7314523353714ce37e64ce7de2c3306ca6348c73f9e898bf143cb3c59b89"} Dec 05 20:37:10 crc kubenswrapper[4904]: I1205 20:37:10.599470 4904 generic.go:334] "Generic (PLEG): container finished" podID="69046049-50b5-4ced-8afa-5ef3405aad24" containerID="41c65f961697709c50b3d568be247edc22869de64234fc01ac4e219de2a02a2b" exitCode=0 Dec 05 20:37:10 crc kubenswrapper[4904]: I1205 20:37:10.599518 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"69046049-50b5-4ced-8afa-5ef3405aad24","Type":"ContainerDied","Data":"41c65f961697709c50b3d568be247edc22869de64234fc01ac4e219de2a02a2b"} Dec 05 20:37:11 crc kubenswrapper[4904]: I1205 20:37:11.610639 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1e45678d-877d-4c34-a8f5-913d31a8b79d","Type":"ContainerStarted","Data":"08590096a8e18a03d34fd2dc63df7d5e990bc97aac822c0a31ff8c946a999770"} Dec 05 20:37:11 crc kubenswrapper[4904]: I1205 20:37:11.611162 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 05 20:37:11 crc kubenswrapper[4904]: I1205 20:37:11.612702 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"69046049-50b5-4ced-8afa-5ef3405aad24","Type":"ContainerStarted","Data":"2d10959fdd5964403918443b037dc116a612ee2877934c7e8ec37671e587e98f"} Dec 05 20:37:11 crc kubenswrapper[4904]: I1205 20:37:11.612926 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:37:11 crc kubenswrapper[4904]: I1205 20:37:11.638791 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.638757102 podStartE2EDuration="37.638757102s" podCreationTimestamp="2025-12-05 20:36:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:37:11.630024963 +0000 UTC m=+1530.441241082" 
watchObservedRunningTime="2025-12-05 20:37:11.638757102 +0000 UTC m=+1530.449973211" Dec 05 20:37:11 crc kubenswrapper[4904]: I1205 20:37:11.669894 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.669818333 podStartE2EDuration="36.669818333s" podCreationTimestamp="2025-12-05 20:36:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:37:11.652356275 +0000 UTC m=+1530.463572404" watchObservedRunningTime="2025-12-05 20:37:11.669818333 +0000 UTC m=+1530.481034442" Dec 05 20:37:16 crc kubenswrapper[4904]: I1205 20:37:16.840352 4904 scope.go:117] "RemoveContainer" containerID="9bd6982949de74604d2b98a2e96167ee2081d2881e85a4cf5ec787dfedaa9751" Dec 05 20:37:16 crc kubenswrapper[4904]: I1205 20:37:16.874571 4904 scope.go:117] "RemoveContainer" containerID="bc6bd55a707cd7c462889940bc35e995374b4c3995de80745ca6bb1ed256e0d0" Dec 05 20:37:16 crc kubenswrapper[4904]: I1205 20:37:16.902467 4904 scope.go:117] "RemoveContainer" containerID="45a22d9de21cad981079d2e7659a670ac38e664200c113b9e41923eb1c83a7b9" Dec 05 20:37:16 crc kubenswrapper[4904]: I1205 20:37:16.922914 4904 scope.go:117] "RemoveContainer" containerID="724c237984bc633fedf69b88fe8220d15d3dbd1aa807fdf1295dd0b034d151eb" Dec 05 20:37:21 crc kubenswrapper[4904]: I1205 20:37:21.236335 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qlps8"] Dec 05 20:37:21 crc kubenswrapper[4904]: E1205 20:37:21.237218 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a252ec50-71cf-4e22-b84a-8c247b695354" containerName="dnsmasq-dns" Dec 05 20:37:21 crc kubenswrapper[4904]: I1205 20:37:21.237231 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="a252ec50-71cf-4e22-b84a-8c247b695354" containerName="dnsmasq-dns" Dec 05 20:37:21 crc kubenswrapper[4904]: E1205 20:37:21.237245 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fe7e020-6a1e-4e4d-bef2-160791a1cdf2" containerName="init" Dec 05 20:37:21 crc kubenswrapper[4904]: I1205 20:37:21.237251 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fe7e020-6a1e-4e4d-bef2-160791a1cdf2" containerName="init" Dec 05 20:37:21 crc kubenswrapper[4904]: E1205 20:37:21.237261 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fe7e020-6a1e-4e4d-bef2-160791a1cdf2" containerName="dnsmasq-dns" Dec 05 20:37:21 crc kubenswrapper[4904]: I1205 20:37:21.237267 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fe7e020-6a1e-4e4d-bef2-160791a1cdf2" containerName="dnsmasq-dns" Dec 05 20:37:21 crc kubenswrapper[4904]: E1205 20:37:21.237293 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a252ec50-71cf-4e22-b84a-8c247b695354" containerName="init" Dec 05 20:37:21 crc kubenswrapper[4904]: I1205 20:37:21.237299 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="a252ec50-71cf-4e22-b84a-8c247b695354" containerName="init" Dec 05 20:37:21 crc kubenswrapper[4904]: I1205 20:37:21.237478 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="a252ec50-71cf-4e22-b84a-8c247b695354" containerName="dnsmasq-dns" Dec 05 20:37:21 crc kubenswrapper[4904]: I1205 20:37:21.237496 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fe7e020-6a1e-4e4d-bef2-160791a1cdf2" containerName="dnsmasq-dns" Dec 05 20:37:21 crc kubenswrapper[4904]: I1205 20:37:21.238201 4904 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qlps8" Dec 05 20:37:21 crc kubenswrapper[4904]: I1205 20:37:21.241297 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pf9pw" Dec 05 20:37:21 crc kubenswrapper[4904]: I1205 20:37:21.241524 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 20:37:21 crc kubenswrapper[4904]: I1205 20:37:21.242992 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 20:37:21 crc kubenswrapper[4904]: I1205 20:37:21.243313 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 20:37:21 crc kubenswrapper[4904]: I1205 20:37:21.252905 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qlps8"] Dec 05 20:37:21 crc kubenswrapper[4904]: I1205 20:37:21.367079 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c12b173c-79c1-4bcf-a76a-b3bc84b9b556-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qlps8\" (UID: \"c12b173c-79c1-4bcf-a76a-b3bc84b9b556\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qlps8" Dec 05 20:37:21 crc kubenswrapper[4904]: I1205 20:37:21.367300 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c12b173c-79c1-4bcf-a76a-b3bc84b9b556-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qlps8\" (UID: \"c12b173c-79c1-4bcf-a76a-b3bc84b9b556\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qlps8" Dec 05 20:37:21 crc kubenswrapper[4904]: I1205 20:37:21.367417 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpzvh\" (UniqueName: \"kubernetes.io/projected/c12b173c-79c1-4bcf-a76a-b3bc84b9b556-kube-api-access-cpzvh\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qlps8\" (UID: \"c12b173c-79c1-4bcf-a76a-b3bc84b9b556\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qlps8" Dec 05 20:37:21 crc kubenswrapper[4904]: I1205 20:37:21.367448 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c12b173c-79c1-4bcf-a76a-b3bc84b9b556-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qlps8\" (UID: \"c12b173c-79c1-4bcf-a76a-b3bc84b9b556\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qlps8" Dec 05 20:37:21 crc kubenswrapper[4904]: I1205 20:37:21.469086 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpzvh\" (UniqueName: \"kubernetes.io/projected/c12b173c-79c1-4bcf-a76a-b3bc84b9b556-kube-api-access-cpzvh\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qlps8\" (UID: \"c12b173c-79c1-4bcf-a76a-b3bc84b9b556\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qlps8" Dec 05 20:37:21 crc kubenswrapper[4904]: I1205 20:37:21.469151 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/c12b173c-79c1-4bcf-a76a-b3bc84b9b556-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qlps8\" (UID: \"c12b173c-79c1-4bcf-a76a-b3bc84b9b556\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qlps8" Dec 05 20:37:21 crc kubenswrapper[4904]: I1205 20:37:21.469174 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c12b173c-79c1-4bcf-a76a-b3bc84b9b556-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qlps8\" (UID: \"c12b173c-79c1-4bcf-a76a-b3bc84b9b556\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qlps8" Dec 05 20:37:21 crc kubenswrapper[4904]: I1205 20:37:21.469282 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c12b173c-79c1-4bcf-a76a-b3bc84b9b556-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qlps8\" (UID: \"c12b173c-79c1-4bcf-a76a-b3bc84b9b556\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qlps8" Dec 05 20:37:21 crc kubenswrapper[4904]: I1205 20:37:21.480048 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c12b173c-79c1-4bcf-a76a-b3bc84b9b556-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qlps8\" (UID: \"c12b173c-79c1-4bcf-a76a-b3bc84b9b556\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qlps8" Dec 05 20:37:21 crc kubenswrapper[4904]: I1205 20:37:21.480161 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c12b173c-79c1-4bcf-a76a-b3bc84b9b556-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qlps8\" (UID: \"c12b173c-79c1-4bcf-a76a-b3bc84b9b556\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qlps8" Dec 05 20:37:21 crc kubenswrapper[4904]: I1205 20:37:21.480203 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c12b173c-79c1-4bcf-a76a-b3bc84b9b556-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qlps8\" (UID: \"c12b173c-79c1-4bcf-a76a-b3bc84b9b556\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qlps8" Dec 05 20:37:21 crc kubenswrapper[4904]: I1205 20:37:21.493698 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpzvh\" (UniqueName: \"kubernetes.io/projected/c12b173c-79c1-4bcf-a76a-b3bc84b9b556-kube-api-access-cpzvh\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qlps8\" (UID: \"c12b173c-79c1-4bcf-a76a-b3bc84b9b556\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qlps8" Dec 05 20:37:21 crc kubenswrapper[4904]: I1205 20:37:21.560281 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qlps8" Dec 05 20:37:22 crc kubenswrapper[4904]: I1205 20:37:22.158763 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qlps8"] Dec 05 20:37:22 crc kubenswrapper[4904]: W1205 20:37:22.167646 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc12b173c_79c1_4bcf_a76a_b3bc84b9b556.slice/crio-faff90dc91f4387928d8cf66efe4f86b22f7c9d3a644a4c26c4d05bd201b8454 WatchSource:0}: Error finding container faff90dc91f4387928d8cf66efe4f86b22f7c9d3a644a4c26c4d05bd201b8454: Status 404 returned error can't find the container with id faff90dc91f4387928d8cf66efe4f86b22f7c9d3a644a4c26c4d05bd201b8454 Dec 05 20:37:22 crc kubenswrapper[4904]: I1205 20:37:22.729151 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qlps8" event={"ID":"c12b173c-79c1-4bcf-a76a-b3bc84b9b556","Type":"ContainerStarted","Data":"faff90dc91f4387928d8cf66efe4f86b22f7c9d3a644a4c26c4d05bd201b8454"} Dec 05 20:37:24 crc kubenswrapper[4904]: I1205 20:37:24.739247 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 05 20:37:25 crc kubenswrapper[4904]: I1205 20:37:25.666292 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:37:32 crc kubenswrapper[4904]: I1205 20:37:32.858340 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qlps8" event={"ID":"c12b173c-79c1-4bcf-a76a-b3bc84b9b556","Type":"ContainerStarted","Data":"e8e90bc6bf87b5b6c5ac24d3b55ced74027f8bf468515dc83b3eebf46d8f7292"} Dec 05 20:37:32 crc kubenswrapper[4904]: I1205 20:37:32.876608 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qlps8" podStartSLOduration=2.278892995 podStartE2EDuration="11.876588122s" podCreationTimestamp="2025-12-05 20:37:21 +0000 UTC" firstStartedPulling="2025-12-05 20:37:22.172280167 +0000 UTC m=+1540.983496266" lastFinishedPulling="2025-12-05 20:37:31.769975284 +0000 UTC m=+1550.581191393" observedRunningTime="2025-12-05 20:37:32.87432702 +0000 UTC m=+1551.685543129" watchObservedRunningTime="2025-12-05 20:37:32.876588122 +0000 UTC m=+1551.687804231" Dec 05 20:37:42 crc kubenswrapper[4904]: I1205 20:37:42.987622 4904 generic.go:334] "Generic (PLEG): container finished" podID="c12b173c-79c1-4bcf-a76a-b3bc84b9b556" containerID="e8e90bc6bf87b5b6c5ac24d3b55ced74027f8bf468515dc83b3eebf46d8f7292" exitCode=0 Dec 05 20:37:42 crc kubenswrapper[4904]: I1205 20:37:42.987717 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qlps8" event={"ID":"c12b173c-79c1-4bcf-a76a-b3bc84b9b556","Type":"ContainerDied","Data":"e8e90bc6bf87b5b6c5ac24d3b55ced74027f8bf468515dc83b3eebf46d8f7292"} Dec 05 20:37:44 crc kubenswrapper[4904]: I1205 20:37:44.421749 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qlps8" Dec 05 20:37:44 crc kubenswrapper[4904]: I1205 20:37:44.503404 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c12b173c-79c1-4bcf-a76a-b3bc84b9b556-inventory\") pod \"c12b173c-79c1-4bcf-a76a-b3bc84b9b556\" (UID: \"c12b173c-79c1-4bcf-a76a-b3bc84b9b556\") " Dec 05 20:37:44 crc kubenswrapper[4904]: I1205 20:37:44.503571 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpzvh\" (UniqueName: \"kubernetes.io/projected/c12b173c-79c1-4bcf-a76a-b3bc84b9b556-kube-api-access-cpzvh\") pod \"c12b173c-79c1-4bcf-a76a-b3bc84b9b556\" (UID: \"c12b173c-79c1-4bcf-a76a-b3bc84b9b556\") " Dec 05 20:37:44 crc kubenswrapper[4904]: I1205 20:37:44.503647 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c12b173c-79c1-4bcf-a76a-b3bc84b9b556-ssh-key\") pod \"c12b173c-79c1-4bcf-a76a-b3bc84b9b556\" (UID: \"c12b173c-79c1-4bcf-a76a-b3bc84b9b556\") " Dec 05 20:37:44 crc kubenswrapper[4904]: I1205 20:37:44.503714 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c12b173c-79c1-4bcf-a76a-b3bc84b9b556-repo-setup-combined-ca-bundle\") pod \"c12b173c-79c1-4bcf-a76a-b3bc84b9b556\" (UID: \"c12b173c-79c1-4bcf-a76a-b3bc84b9b556\") " Dec 05 20:37:44 crc kubenswrapper[4904]: I1205 20:37:44.509965 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c12b173c-79c1-4bcf-a76a-b3bc84b9b556-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "c12b173c-79c1-4bcf-a76a-b3bc84b9b556" (UID: "c12b173c-79c1-4bcf-a76a-b3bc84b9b556"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:37:44 crc kubenswrapper[4904]: I1205 20:37:44.514487 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c12b173c-79c1-4bcf-a76a-b3bc84b9b556-kube-api-access-cpzvh" (OuterVolumeSpecName: "kube-api-access-cpzvh") pod "c12b173c-79c1-4bcf-a76a-b3bc84b9b556" (UID: "c12b173c-79c1-4bcf-a76a-b3bc84b9b556"). InnerVolumeSpecName "kube-api-access-cpzvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:37:44 crc kubenswrapper[4904]: I1205 20:37:44.539664 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c12b173c-79c1-4bcf-a76a-b3bc84b9b556-inventory" (OuterVolumeSpecName: "inventory") pod "c12b173c-79c1-4bcf-a76a-b3bc84b9b556" (UID: "c12b173c-79c1-4bcf-a76a-b3bc84b9b556"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:37:44 crc kubenswrapper[4904]: I1205 20:37:44.554886 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c12b173c-79c1-4bcf-a76a-b3bc84b9b556-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c12b173c-79c1-4bcf-a76a-b3bc84b9b556" (UID: "c12b173c-79c1-4bcf-a76a-b3bc84b9b556"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:37:44 crc kubenswrapper[4904]: I1205 20:37:44.605417 4904 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c12b173c-79c1-4bcf-a76a-b3bc84b9b556-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:44 crc kubenswrapper[4904]: I1205 20:37:44.605451 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpzvh\" (UniqueName: \"kubernetes.io/projected/c12b173c-79c1-4bcf-a76a-b3bc84b9b556-kube-api-access-cpzvh\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:44 crc kubenswrapper[4904]: I1205 20:37:44.605464 4904 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c12b173c-79c1-4bcf-a76a-b3bc84b9b556-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:44 crc kubenswrapper[4904]: I1205 20:37:44.605474 4904 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c12b173c-79c1-4bcf-a76a-b3bc84b9b556-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:45 crc kubenswrapper[4904]: I1205 20:37:45.012571 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qlps8" event={"ID":"c12b173c-79c1-4bcf-a76a-b3bc84b9b556","Type":"ContainerDied","Data":"faff90dc91f4387928d8cf66efe4f86b22f7c9d3a644a4c26c4d05bd201b8454"} Dec 05 20:37:45 crc kubenswrapper[4904]: I1205 20:37:45.012618 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="faff90dc91f4387928d8cf66efe4f86b22f7c9d3a644a4c26c4d05bd201b8454" Dec 05 20:37:45 crc kubenswrapper[4904]: I1205 20:37:45.012698 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qlps8" Dec 05 20:37:45 crc kubenswrapper[4904]: I1205 20:37:45.081287 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-tc59d"] Dec 05 20:37:45 crc kubenswrapper[4904]: E1205 20:37:45.081765 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c12b173c-79c1-4bcf-a76a-b3bc84b9b556" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 05 20:37:45 crc kubenswrapper[4904]: I1205 20:37:45.081784 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="c12b173c-79c1-4bcf-a76a-b3bc84b9b556" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 05 20:37:45 crc kubenswrapper[4904]: I1205 20:37:45.081979 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="c12b173c-79c1-4bcf-a76a-b3bc84b9b556" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 05 20:37:45 crc kubenswrapper[4904]: I1205 20:37:45.082710 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tc59d" Dec 05 20:37:45 crc kubenswrapper[4904]: I1205 20:37:45.085141 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 20:37:45 crc kubenswrapper[4904]: I1205 20:37:45.086651 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pf9pw" Dec 05 20:37:45 crc kubenswrapper[4904]: I1205 20:37:45.086701 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 20:37:45 crc kubenswrapper[4904]: I1205 20:37:45.087122 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 20:37:45 crc kubenswrapper[4904]: I1205 20:37:45.101230 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-tc59d"] Dec 05 20:37:45 crc kubenswrapper[4904]: I1205 20:37:45.228986 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cc55dc7a-28f4-47eb-8c57-58e949a98dcc-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-tc59d\" (UID: \"cc55dc7a-28f4-47eb-8c57-58e949a98dcc\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tc59d" Dec 05 20:37:45 crc kubenswrapper[4904]: I1205 20:37:45.229074 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc55dc7a-28f4-47eb-8c57-58e949a98dcc-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-tc59d\" (UID: \"cc55dc7a-28f4-47eb-8c57-58e949a98dcc\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tc59d" Dec 05 20:37:45 crc kubenswrapper[4904]: I1205 20:37:45.229236 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h88dh\" (UniqueName: \"kubernetes.io/projected/cc55dc7a-28f4-47eb-8c57-58e949a98dcc-kube-api-access-h88dh\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-tc59d\" (UID: \"cc55dc7a-28f4-47eb-8c57-58e949a98dcc\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tc59d" Dec 05 20:37:45 crc kubenswrapper[4904]: I1205 20:37:45.330721 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cc55dc7a-28f4-47eb-8c57-58e949a98dcc-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-tc59d\" (UID: \"cc55dc7a-28f4-47eb-8c57-58e949a98dcc\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tc59d" Dec 05 20:37:45 crc kubenswrapper[4904]: I1205 20:37:45.330790 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc55dc7a-28f4-47eb-8c57-58e949a98dcc-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-tc59d\" (UID: \"cc55dc7a-28f4-47eb-8c57-58e949a98dcc\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tc59d" Dec 05 20:37:45 crc kubenswrapper[4904]: I1205 20:37:45.330829 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h88dh\" (UniqueName: \"kubernetes.io/projected/cc55dc7a-28f4-47eb-8c57-58e949a98dcc-kube-api-access-h88dh\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-tc59d\" (UID: \"cc55dc7a-28f4-47eb-8c57-58e949a98dcc\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tc59d" Dec 05 20:37:45 crc kubenswrapper[4904]: I1205 20:37:45.335944 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc55dc7a-28f4-47eb-8c57-58e949a98dcc-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-tc59d\" (UID: \"cc55dc7a-28f4-47eb-8c57-58e949a98dcc\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tc59d" Dec 05 20:37:45 crc kubenswrapper[4904]: I1205 20:37:45.338607 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cc55dc7a-28f4-47eb-8c57-58e949a98dcc-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-tc59d\" (UID: \"cc55dc7a-28f4-47eb-8c57-58e949a98dcc\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tc59d" Dec 05 20:37:45 crc kubenswrapper[4904]: I1205 20:37:45.348254 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h88dh\" (UniqueName: \"kubernetes.io/projected/cc55dc7a-28f4-47eb-8c57-58e949a98dcc-kube-api-access-h88dh\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-tc59d\" (UID: \"cc55dc7a-28f4-47eb-8c57-58e949a98dcc\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tc59d" Dec 05 20:37:45 crc kubenswrapper[4904]: I1205 20:37:45.398086 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tc59d" Dec 05 20:37:45 crc kubenswrapper[4904]: I1205 20:37:45.906322 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-tc59d"] Dec 05 20:37:46 crc kubenswrapper[4904]: I1205 20:37:46.022177 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tc59d" event={"ID":"cc55dc7a-28f4-47eb-8c57-58e949a98dcc","Type":"ContainerStarted","Data":"bdd3df3466591b378de380e3e85e7b363efa43ec58318a6c65c47e036e847b7d"} Dec 05 20:37:47 crc kubenswrapper[4904]: I1205 20:37:47.031897 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tc59d" event={"ID":"cc55dc7a-28f4-47eb-8c57-58e949a98dcc","Type":"ContainerStarted","Data":"ae3b612c329acc6299eb3c399d4de9f68e724781568b0ece90d105f030c3a718"} Dec 05 20:37:47 crc kubenswrapper[4904]: I1205 20:37:47.053311 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tc59d" podStartSLOduration=1.643239156 podStartE2EDuration="2.053290473s" podCreationTimestamp="2025-12-05 20:37:45 +0000 UTC" firstStartedPulling="2025-12-05 20:37:45.915264975 +0000 UTC m=+1564.726481084" lastFinishedPulling="2025-12-05 20:37:46.325316282 +0000 UTC m=+1565.136532401" observedRunningTime="2025-12-05 20:37:47.050830046 +0000 UTC m=+1565.862046165" watchObservedRunningTime="2025-12-05 20:37:47.053290473 +0000 UTC m=+1565.864506602" Dec 05 20:37:50 crc kubenswrapper[4904]: I1205 20:37:50.078617 4904 generic.go:334] "Generic (PLEG): container finished" podID="cc55dc7a-28f4-47eb-8c57-58e949a98dcc" containerID="ae3b612c329acc6299eb3c399d4de9f68e724781568b0ece90d105f030c3a718" exitCode=0 Dec 05 20:37:50 crc kubenswrapper[4904]: I1205 20:37:50.078692 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tc59d" 
event={"ID":"cc55dc7a-28f4-47eb-8c57-58e949a98dcc","Type":"ContainerDied","Data":"ae3b612c329acc6299eb3c399d4de9f68e724781568b0ece90d105f030c3a718"} Dec 05 20:37:51 crc kubenswrapper[4904]: I1205 20:37:51.551399 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tc59d" Dec 05 20:37:51 crc kubenswrapper[4904]: I1205 20:37:51.665389 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc55dc7a-28f4-47eb-8c57-58e949a98dcc-inventory\") pod \"cc55dc7a-28f4-47eb-8c57-58e949a98dcc\" (UID: \"cc55dc7a-28f4-47eb-8c57-58e949a98dcc\") " Dec 05 20:37:51 crc kubenswrapper[4904]: I1205 20:37:51.665647 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h88dh\" (UniqueName: \"kubernetes.io/projected/cc55dc7a-28f4-47eb-8c57-58e949a98dcc-kube-api-access-h88dh\") pod \"cc55dc7a-28f4-47eb-8c57-58e949a98dcc\" (UID: \"cc55dc7a-28f4-47eb-8c57-58e949a98dcc\") " Dec 05 20:37:51 crc kubenswrapper[4904]: I1205 20:37:51.665693 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cc55dc7a-28f4-47eb-8c57-58e949a98dcc-ssh-key\") pod \"cc55dc7a-28f4-47eb-8c57-58e949a98dcc\" (UID: \"cc55dc7a-28f4-47eb-8c57-58e949a98dcc\") " Dec 05 20:37:51 crc kubenswrapper[4904]: I1205 20:37:51.671949 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc55dc7a-28f4-47eb-8c57-58e949a98dcc-kube-api-access-h88dh" (OuterVolumeSpecName: "kube-api-access-h88dh") pod "cc55dc7a-28f4-47eb-8c57-58e949a98dcc" (UID: "cc55dc7a-28f4-47eb-8c57-58e949a98dcc"). InnerVolumeSpecName "kube-api-access-h88dh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:37:51 crc kubenswrapper[4904]: I1205 20:37:51.695555 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc55dc7a-28f4-47eb-8c57-58e949a98dcc-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "cc55dc7a-28f4-47eb-8c57-58e949a98dcc" (UID: "cc55dc7a-28f4-47eb-8c57-58e949a98dcc"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:37:51 crc kubenswrapper[4904]: I1205 20:37:51.710655 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc55dc7a-28f4-47eb-8c57-58e949a98dcc-inventory" (OuterVolumeSpecName: "inventory") pod "cc55dc7a-28f4-47eb-8c57-58e949a98dcc" (UID: "cc55dc7a-28f4-47eb-8c57-58e949a98dcc"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:37:51 crc kubenswrapper[4904]: I1205 20:37:51.768600 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h88dh\" (UniqueName: \"kubernetes.io/projected/cc55dc7a-28f4-47eb-8c57-58e949a98dcc-kube-api-access-h88dh\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:51 crc kubenswrapper[4904]: I1205 20:37:51.768622 4904 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cc55dc7a-28f4-47eb-8c57-58e949a98dcc-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:51 crc kubenswrapper[4904]: I1205 20:37:51.768630 4904 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc55dc7a-28f4-47eb-8c57-58e949a98dcc-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:52 crc kubenswrapper[4904]: I1205 20:37:52.107488 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tc59d" event={"ID":"cc55dc7a-28f4-47eb-8c57-58e949a98dcc","Type":"ContainerDied","Data":"bdd3df3466591b378de380e3e85e7b363efa43ec58318a6c65c47e036e847b7d"} Dec 05 20:37:52 crc kubenswrapper[4904]: I1205 20:37:52.107530 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdd3df3466591b378de380e3e85e7b363efa43ec58318a6c65c47e036e847b7d" Dec 05 20:37:52 crc kubenswrapper[4904]: I1205 20:37:52.107832 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tc59d" Dec 05 20:37:52 crc kubenswrapper[4904]: I1205 20:37:52.158457 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-524nx"] Dec 05 20:37:52 crc kubenswrapper[4904]: E1205 20:37:52.159134 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc55dc7a-28f4-47eb-8c57-58e949a98dcc" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 05 20:37:52 crc kubenswrapper[4904]: I1205 20:37:52.159352 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc55dc7a-28f4-47eb-8c57-58e949a98dcc" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 05 20:37:52 crc kubenswrapper[4904]: I1205 20:37:52.159759 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc55dc7a-28f4-47eb-8c57-58e949a98dcc" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 05 20:37:52 crc kubenswrapper[4904]: I1205 20:37:52.161516 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-524nx" Dec 05 20:37:52 crc kubenswrapper[4904]: I1205 20:37:52.180563 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-524nx"] Dec 05 20:37:52 crc kubenswrapper[4904]: I1205 20:37:52.255447 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhb8"] Dec 05 20:37:52 crc kubenswrapper[4904]: I1205 20:37:52.257205 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhb8" Dec 05 20:37:52 crc kubenswrapper[4904]: I1205 20:37:52.259689 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 20:37:52 crc kubenswrapper[4904]: I1205 20:37:52.259978 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 20:37:52 crc kubenswrapper[4904]: I1205 20:37:52.262855 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 20:37:52 crc kubenswrapper[4904]: I1205 20:37:52.263253 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pf9pw" Dec 05 20:37:52 crc kubenswrapper[4904]: I1205 20:37:52.276447 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhb8"] Dec 05 20:37:52 crc kubenswrapper[4904]: I1205 20:37:52.277611 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt97b\" (UniqueName: \"kubernetes.io/projected/c8336d8e-3601-4530-84a3-fb35c1ad6ca2-kube-api-access-bt97b\") pod \"community-operators-524nx\" (UID: \"c8336d8e-3601-4530-84a3-fb35c1ad6ca2\") " pod="openshift-marketplace/community-operators-524nx" Dec 05 20:37:52 crc kubenswrapper[4904]: I1205 20:37:52.277741 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8336d8e-3601-4530-84a3-fb35c1ad6ca2-catalog-content\") pod \"community-operators-524nx\" (UID: \"c8336d8e-3601-4530-84a3-fb35c1ad6ca2\") " pod="openshift-marketplace/community-operators-524nx" Dec 05 20:37:52 crc kubenswrapper[4904]: I1205 20:37:52.277777 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8336d8e-3601-4530-84a3-fb35c1ad6ca2-utilities\") pod \"community-operators-524nx\" (UID: \"c8336d8e-3601-4530-84a3-fb35c1ad6ca2\") " pod="openshift-marketplace/community-operators-524nx" Dec 05 20:37:52 crc kubenswrapper[4904]: I1205 20:37:52.379759 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2277bc86-3475-44fd-a77d-9a2f552bb457-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4xhb8\" (UID: \"2277bc86-3475-44fd-a77d-9a2f552bb457\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhb8" Dec 05 20:37:52 crc kubenswrapper[4904]: I1205 20:37:52.379881 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt97b\" (UniqueName: \"kubernetes.io/projected/c8336d8e-3601-4530-84a3-fb35c1ad6ca2-kube-api-access-bt97b\") pod \"community-operators-524nx\" (UID: \"c8336d8e-3601-4530-84a3-fb35c1ad6ca2\") " pod="openshift-marketplace/community-operators-524nx" Dec 05 20:37:52 crc kubenswrapper[4904]: I1205 20:37:52.379923 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2277bc86-3475-44fd-a77d-9a2f552bb457-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4xhb8\" (UID: \"2277bc86-3475-44fd-a77d-9a2f552bb457\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhb8" Dec 05 20:37:52 crc 
kubenswrapper[4904]: I1205 20:37:52.379959 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2277bc86-3475-44fd-a77d-9a2f552bb457-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4xhb8\" (UID: \"2277bc86-3475-44fd-a77d-9a2f552bb457\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhb8" Dec 05 20:37:52 crc kubenswrapper[4904]: I1205 20:37:52.380027 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8336d8e-3601-4530-84a3-fb35c1ad6ca2-catalog-content\") pod \"community-operators-524nx\" (UID: \"c8336d8e-3601-4530-84a3-fb35c1ad6ca2\") " pod="openshift-marketplace/community-operators-524nx" Dec 05 20:37:52 crc kubenswrapper[4904]: I1205 20:37:52.380071 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk894\" (UniqueName: \"kubernetes.io/projected/2277bc86-3475-44fd-a77d-9a2f552bb457-kube-api-access-lk894\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4xhb8\" (UID: \"2277bc86-3475-44fd-a77d-9a2f552bb457\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhb8" Dec 05 20:37:52 crc kubenswrapper[4904]: I1205 20:37:52.380099 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8336d8e-3601-4530-84a3-fb35c1ad6ca2-utilities\") pod \"community-operators-524nx\" (UID: \"c8336d8e-3601-4530-84a3-fb35c1ad6ca2\") " pod="openshift-marketplace/community-operators-524nx" Dec 05 20:37:52 crc kubenswrapper[4904]: I1205 20:37:52.380724 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8336d8e-3601-4530-84a3-fb35c1ad6ca2-utilities\") pod \"community-operators-524nx\" (UID: \"c8336d8e-3601-4530-84a3-fb35c1ad6ca2\") " pod="openshift-marketplace/community-operators-524nx" Dec 05 20:37:52 crc kubenswrapper[4904]: I1205 20:37:52.381268 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8336d8e-3601-4530-84a3-fb35c1ad6ca2-catalog-content\") pod \"community-operators-524nx\" (UID: \"c8336d8e-3601-4530-84a3-fb35c1ad6ca2\") " pod="openshift-marketplace/community-operators-524nx" Dec 05 20:37:52 crc kubenswrapper[4904]: I1205 20:37:52.397460 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt97b\" (UniqueName: \"kubernetes.io/projected/c8336d8e-3601-4530-84a3-fb35c1ad6ca2-kube-api-access-bt97b\") pod \"community-operators-524nx\" (UID: \"c8336d8e-3601-4530-84a3-fb35c1ad6ca2\") " pod="openshift-marketplace/community-operators-524nx" Dec 05 20:37:52 crc kubenswrapper[4904]: I1205 20:37:52.481951 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2277bc86-3475-44fd-a77d-9a2f552bb457-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4xhb8\" (UID: \"2277bc86-3475-44fd-a77d-9a2f552bb457\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhb8" Dec 05 20:37:52 crc kubenswrapper[4904]: I1205 20:37:52.482101 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2277bc86-3475-44fd-a77d-9a2f552bb457-ssh-key\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-4xhb8\" (UID: \"2277bc86-3475-44fd-a77d-9a2f552bb457\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhb8" Dec 05 20:37:52 crc kubenswrapper[4904]: I1205 20:37:52.482136 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2277bc86-3475-44fd-a77d-9a2f552bb457-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4xhb8\" (UID: \"2277bc86-3475-44fd-a77d-9a2f552bb457\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhb8" Dec 05 20:37:52 crc kubenswrapper[4904]: I1205 20:37:52.482204 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk894\" (UniqueName: \"kubernetes.io/projected/2277bc86-3475-44fd-a77d-9a2f552bb457-kube-api-access-lk894\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4xhb8\" (UID: \"2277bc86-3475-44fd-a77d-9a2f552bb457\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhb8" Dec 05 20:37:52 crc kubenswrapper[4904]: I1205 20:37:52.482994 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-524nx" Dec 05 20:37:52 crc kubenswrapper[4904]: I1205 20:37:52.486238 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2277bc86-3475-44fd-a77d-9a2f552bb457-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4xhb8\" (UID: \"2277bc86-3475-44fd-a77d-9a2f552bb457\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhb8" Dec 05 20:37:52 crc kubenswrapper[4904]: I1205 20:37:52.487221 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2277bc86-3475-44fd-a77d-9a2f552bb457-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4xhb8\" (UID: \"2277bc86-3475-44fd-a77d-9a2f552bb457\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhb8" Dec 05 20:37:52 crc kubenswrapper[4904]: I1205 20:37:52.487860 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2277bc86-3475-44fd-a77d-9a2f552bb457-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4xhb8\" (UID: \"2277bc86-3475-44fd-a77d-9a2f552bb457\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhb8" Dec 05 20:37:52 crc kubenswrapper[4904]: I1205 20:37:52.511631 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk894\" (UniqueName: \"kubernetes.io/projected/2277bc86-3475-44fd-a77d-9a2f552bb457-kube-api-access-lk894\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4xhb8\" (UID: \"2277bc86-3475-44fd-a77d-9a2f552bb457\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhb8" Dec 05 20:37:52 crc kubenswrapper[4904]: I1205 20:37:52.574844 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhb8" Dec 05 20:37:53 crc kubenswrapper[4904]: I1205 20:37:53.106288 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-524nx"] Dec 05 20:37:53 crc kubenswrapper[4904]: I1205 20:37:53.119918 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-524nx" event={"ID":"c8336d8e-3601-4530-84a3-fb35c1ad6ca2","Type":"ContainerStarted","Data":"df7ba4579d8eddca8eb70ae7c9b1afdd41795fdf12ce507057af99fcac77553b"} Dec 05 20:37:53 crc kubenswrapper[4904]: I1205 20:37:53.289481 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhb8"] Dec 05 20:37:53 crc kubenswrapper[4904]: W1205 20:37:53.297878 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2277bc86_3475_44fd_a77d_9a2f552bb457.slice/crio-838ec81f57f99ed3e63488c04d0ac10a01bc6c81d2dc0a000e6e33331c27968d WatchSource:0}: Error finding container 838ec81f57f99ed3e63488c04d0ac10a01bc6c81d2dc0a000e6e33331c27968d: Status 404 returned error can't find the container with id 838ec81f57f99ed3e63488c04d0ac10a01bc6c81d2dc0a000e6e33331c27968d Dec 05 20:37:54 crc kubenswrapper[4904]: I1205 20:37:54.129704 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhb8" event={"ID":"2277bc86-3475-44fd-a77d-9a2f552bb457","Type":"ContainerStarted","Data":"dcf596a5d7e5720d87ab6c83417a571c52c3108c1025f3508f819a61f7132e6a"} Dec 05 20:37:54 crc kubenswrapper[4904]: I1205 20:37:54.130035 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhb8" event={"ID":"2277bc86-3475-44fd-a77d-9a2f552bb457","Type":"ContainerStarted","Data":"838ec81f57f99ed3e63488c04d0ac10a01bc6c81d2dc0a000e6e33331c27968d"} Dec 05 20:37:54 crc kubenswrapper[4904]: I1205 20:37:54.131636 4904 generic.go:334] "Generic (PLEG): container finished" podID="c8336d8e-3601-4530-84a3-fb35c1ad6ca2" containerID="a9a0bbbd74b0743586e15106021636b322e1331e19fc3a47c8a5368e1749c91f" exitCode=0 Dec 05 20:37:54 crc kubenswrapper[4904]: I1205 20:37:54.131675 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-524nx" event={"ID":"c8336d8e-3601-4530-84a3-fb35c1ad6ca2","Type":"ContainerDied","Data":"a9a0bbbd74b0743586e15106021636b322e1331e19fc3a47c8a5368e1749c91f"} Dec 05 20:37:54 crc kubenswrapper[4904]: I1205 20:37:54.150821 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhb8" podStartSLOduration=1.74289158 podStartE2EDuration="2.150801749s" podCreationTimestamp="2025-12-05 20:37:52 +0000 UTC" firstStartedPulling="2025-12-05 20:37:53.300136316 +0000 UTC m=+1572.111352425" lastFinishedPulling="2025-12-05 20:37:53.708046485 +0000 UTC m=+1572.519262594" observedRunningTime="2025-12-05 20:37:54.14317206 +0000 UTC m=+1572.954388189" watchObservedRunningTime="2025-12-05 20:37:54.150801749 +0000 UTC m=+1572.962017858" Dec 05 20:37:55 crc kubenswrapper[4904]: I1205 20:37:55.148487 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-524nx" event={"ID":"c8336d8e-3601-4530-84a3-fb35c1ad6ca2","Type":"ContainerStarted","Data":"bd3238cdd9ffaf4da652f7a745c001f89dab7d10a1b2d2d79e479addc208915e"} Dec 05 
20:37:56 crc kubenswrapper[4904]: I1205 20:37:56.162898 4904 generic.go:334] "Generic (PLEG): container finished" podID="c8336d8e-3601-4530-84a3-fb35c1ad6ca2" containerID="bd3238cdd9ffaf4da652f7a745c001f89dab7d10a1b2d2d79e479addc208915e" exitCode=0 Dec 05 20:37:56 crc kubenswrapper[4904]: I1205 20:37:56.162982 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-524nx" event={"ID":"c8336d8e-3601-4530-84a3-fb35c1ad6ca2","Type":"ContainerDied","Data":"bd3238cdd9ffaf4da652f7a745c001f89dab7d10a1b2d2d79e479addc208915e"} Dec 05 20:37:58 crc kubenswrapper[4904]: I1205 20:37:58.183220 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-524nx" event={"ID":"c8336d8e-3601-4530-84a3-fb35c1ad6ca2","Type":"ContainerStarted","Data":"50ee1b41c4012b09a72c8d67422d5e7503144426855d6e1390b4cd0a89c85841"} Dec 05 20:37:58 crc kubenswrapper[4904]: I1205 20:37:58.212012 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-524nx" podStartSLOduration=2.824689948 podStartE2EDuration="6.211989561s" podCreationTimestamp="2025-12-05 20:37:52 +0000 UTC" firstStartedPulling="2025-12-05 20:37:54.132899509 +0000 UTC m=+1572.944115618" lastFinishedPulling="2025-12-05 20:37:57.520199112 +0000 UTC m=+1576.331415231" observedRunningTime="2025-12-05 20:37:58.200691361 +0000 UTC m=+1577.011907500" watchObservedRunningTime="2025-12-05 20:37:58.211989561 +0000 UTC m=+1577.023205680" Dec 05 20:38:02 crc kubenswrapper[4904]: I1205 20:38:02.484620 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-524nx" Dec 05 20:38:02 crc kubenswrapper[4904]: I1205 20:38:02.485233 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-524nx" Dec 05 20:38:02 crc kubenswrapper[4904]: I1205 20:38:02.532956 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-524nx" Dec 05 20:38:03 crc kubenswrapper[4904]: I1205 20:38:03.312706 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-524nx" Dec 05 20:38:03 crc kubenswrapper[4904]: I1205 20:38:03.407999 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-524nx"] Dec 05 20:38:05 crc kubenswrapper[4904]: I1205 20:38:05.259494 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-524nx" podUID="c8336d8e-3601-4530-84a3-fb35c1ad6ca2" containerName="registry-server" containerID="cri-o://50ee1b41c4012b09a72c8d67422d5e7503144426855d6e1390b4cd0a89c85841" gracePeriod=2 Dec 05 20:38:05 crc kubenswrapper[4904]: I1205 20:38:05.749510 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-524nx" Dec 05 20:38:05 crc kubenswrapper[4904]: I1205 20:38:05.825036 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bt97b\" (UniqueName: \"kubernetes.io/projected/c8336d8e-3601-4530-84a3-fb35c1ad6ca2-kube-api-access-bt97b\") pod \"c8336d8e-3601-4530-84a3-fb35c1ad6ca2\" (UID: \"c8336d8e-3601-4530-84a3-fb35c1ad6ca2\") " Dec 05 20:38:05 crc kubenswrapper[4904]: I1205 20:38:05.825239 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8336d8e-3601-4530-84a3-fb35c1ad6ca2-utilities\") pod \"c8336d8e-3601-4530-84a3-fb35c1ad6ca2\" (UID: \"c8336d8e-3601-4530-84a3-fb35c1ad6ca2\") " Dec 05 20:38:05 crc kubenswrapper[4904]: I1205 20:38:05.825264 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8336d8e-3601-4530-84a3-fb35c1ad6ca2-catalog-content\") pod \"c8336d8e-3601-4530-84a3-fb35c1ad6ca2\" (UID: \"c8336d8e-3601-4530-84a3-fb35c1ad6ca2\") " Dec 05 20:38:05 crc kubenswrapper[4904]: I1205 20:38:05.825970 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8336d8e-3601-4530-84a3-fb35c1ad6ca2-utilities" (OuterVolumeSpecName: "utilities") pod "c8336d8e-3601-4530-84a3-fb35c1ad6ca2" (UID: "c8336d8e-3601-4530-84a3-fb35c1ad6ca2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:38:05 crc kubenswrapper[4904]: I1205 20:38:05.830314 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8336d8e-3601-4530-84a3-fb35c1ad6ca2-kube-api-access-bt97b" (OuterVolumeSpecName: "kube-api-access-bt97b") pod "c8336d8e-3601-4530-84a3-fb35c1ad6ca2" (UID: "c8336d8e-3601-4530-84a3-fb35c1ad6ca2"). InnerVolumeSpecName "kube-api-access-bt97b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:38:05 crc kubenswrapper[4904]: I1205 20:38:05.871593 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8336d8e-3601-4530-84a3-fb35c1ad6ca2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c8336d8e-3601-4530-84a3-fb35c1ad6ca2" (UID: "c8336d8e-3601-4530-84a3-fb35c1ad6ca2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:38:05 crc kubenswrapper[4904]: I1205 20:38:05.927462 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bt97b\" (UniqueName: \"kubernetes.io/projected/c8336d8e-3601-4530-84a3-fb35c1ad6ca2-kube-api-access-bt97b\") on node \"crc\" DevicePath \"\"" Dec 05 20:38:05 crc kubenswrapper[4904]: I1205 20:38:05.927776 4904 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8336d8e-3601-4530-84a3-fb35c1ad6ca2-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:38:05 crc kubenswrapper[4904]: I1205 20:38:05.927789 4904 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8336d8e-3601-4530-84a3-fb35c1ad6ca2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:38:06 crc kubenswrapper[4904]: I1205 20:38:06.270366 4904 generic.go:334] "Generic (PLEG): container finished" podID="c8336d8e-3601-4530-84a3-fb35c1ad6ca2" containerID="50ee1b41c4012b09a72c8d67422d5e7503144426855d6e1390b4cd0a89c85841" exitCode=0 Dec 05 20:38:06 crc kubenswrapper[4904]: I1205 20:38:06.270416 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-524nx" event={"ID":"c8336d8e-3601-4530-84a3-fb35c1ad6ca2","Type":"ContainerDied","Data":"50ee1b41c4012b09a72c8d67422d5e7503144426855d6e1390b4cd0a89c85841"} Dec 05 20:38:06 crc kubenswrapper[4904]: I1205 20:38:06.270432 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-524nx" Dec 05 20:38:06 crc kubenswrapper[4904]: I1205 20:38:06.270446 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-524nx" event={"ID":"c8336d8e-3601-4530-84a3-fb35c1ad6ca2","Type":"ContainerDied","Data":"df7ba4579d8eddca8eb70ae7c9b1afdd41795fdf12ce507057af99fcac77553b"} Dec 05 20:38:06 crc kubenswrapper[4904]: I1205 20:38:06.270495 4904 scope.go:117] "RemoveContainer" containerID="50ee1b41c4012b09a72c8d67422d5e7503144426855d6e1390b4cd0a89c85841" Dec 05 20:38:06 crc kubenswrapper[4904]: I1205 20:38:06.292355 4904 scope.go:117] "RemoveContainer" containerID="bd3238cdd9ffaf4da652f7a745c001f89dab7d10a1b2d2d79e479addc208915e" Dec 05 20:38:06 crc kubenswrapper[4904]: I1205 20:38:06.309603 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-524nx"] Dec 05 20:38:06 crc kubenswrapper[4904]: I1205 20:38:06.319295 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-524nx"] Dec 05 20:38:06 crc kubenswrapper[4904]: I1205 20:38:06.340486 4904 scope.go:117] "RemoveContainer" containerID="a9a0bbbd74b0743586e15106021636b322e1331e19fc3a47c8a5368e1749c91f" Dec 05 20:38:06 crc kubenswrapper[4904]: I1205 20:38:06.372632 4904 scope.go:117] "RemoveContainer" containerID="50ee1b41c4012b09a72c8d67422d5e7503144426855d6e1390b4cd0a89c85841" Dec 05 20:38:06 crc kubenswrapper[4904]: E1205 20:38:06.373220 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50ee1b41c4012b09a72c8d67422d5e7503144426855d6e1390b4cd0a89c85841\": container with ID starting with 50ee1b41c4012b09a72c8d67422d5e7503144426855d6e1390b4cd0a89c85841 not found: ID does not exist" containerID="50ee1b41c4012b09a72c8d67422d5e7503144426855d6e1390b4cd0a89c85841" Dec 05 20:38:06 crc kubenswrapper[4904]: I1205 20:38:06.373248 
4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50ee1b41c4012b09a72c8d67422d5e7503144426855d6e1390b4cd0a89c85841"} err="failed to get container status \"50ee1b41c4012b09a72c8d67422d5e7503144426855d6e1390b4cd0a89c85841\": rpc error: code = NotFound desc = could not find container \"50ee1b41c4012b09a72c8d67422d5e7503144426855d6e1390b4cd0a89c85841\": container with ID starting with 50ee1b41c4012b09a72c8d67422d5e7503144426855d6e1390b4cd0a89c85841 not found: ID does not exist" Dec 05 20:38:06 crc kubenswrapper[4904]: I1205 20:38:06.373283 4904 scope.go:117] "RemoveContainer" containerID="bd3238cdd9ffaf4da652f7a745c001f89dab7d10a1b2d2d79e479addc208915e" Dec 05 20:38:06 crc kubenswrapper[4904]: E1205 20:38:06.373608 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd3238cdd9ffaf4da652f7a745c001f89dab7d10a1b2d2d79e479addc208915e\": container with ID starting with bd3238cdd9ffaf4da652f7a745c001f89dab7d10a1b2d2d79e479addc208915e not found: ID does not exist" containerID="bd3238cdd9ffaf4da652f7a745c001f89dab7d10a1b2d2d79e479addc208915e" Dec 05 20:38:06 crc kubenswrapper[4904]: I1205 20:38:06.373659 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd3238cdd9ffaf4da652f7a745c001f89dab7d10a1b2d2d79e479addc208915e"} err="failed to get container status \"bd3238cdd9ffaf4da652f7a745c001f89dab7d10a1b2d2d79e479addc208915e\": rpc error: code = NotFound desc = could not find container \"bd3238cdd9ffaf4da652f7a745c001f89dab7d10a1b2d2d79e479addc208915e\": container with ID starting with bd3238cdd9ffaf4da652f7a745c001f89dab7d10a1b2d2d79e479addc208915e not found: ID does not exist" Dec 05 20:38:06 crc kubenswrapper[4904]: I1205 20:38:06.373673 4904 scope.go:117] "RemoveContainer" containerID="a9a0bbbd74b0743586e15106021636b322e1331e19fc3a47c8a5368e1749c91f" Dec 05 20:38:06 crc kubenswrapper[4904]: E1205 20:38:06.373944 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9a0bbbd74b0743586e15106021636b322e1331e19fc3a47c8a5368e1749c91f\": container with ID starting with a9a0bbbd74b0743586e15106021636b322e1331e19fc3a47c8a5368e1749c91f not found: ID does not exist" containerID="a9a0bbbd74b0743586e15106021636b322e1331e19fc3a47c8a5368e1749c91f" Dec 05 20:38:06 crc kubenswrapper[4904]: I1205 20:38:06.373967 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9a0bbbd74b0743586e15106021636b322e1331e19fc3a47c8a5368e1749c91f"} err="failed to get container status \"a9a0bbbd74b0743586e15106021636b322e1331e19fc3a47c8a5368e1749c91f\": rpc error: code = NotFound desc = could not find container \"a9a0bbbd74b0743586e15106021636b322e1331e19fc3a47c8a5368e1749c91f\": container with ID starting with a9a0bbbd74b0743586e15106021636b322e1331e19fc3a47c8a5368e1749c91f not found: ID does not exist" Dec 05 20:38:07 crc kubenswrapper[4904]: I1205 20:38:07.697112 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8336d8e-3601-4530-84a3-fb35c1ad6ca2" path="/var/lib/kubelet/pods/c8336d8e-3601-4530-84a3-fb35c1ad6ca2/volumes" Dec 05 20:38:17 crc kubenswrapper[4904]: I1205 20:38:17.082233 4904 scope.go:117] "RemoveContainer" containerID="cd624dd8dedee8ad675e84e8bb0260b665f03b36bb5f85caaf69b04d1a8bffcf" Dec 05 20:38:17 crc kubenswrapper[4904]: I1205 20:38:17.110074 4904 scope.go:117] "RemoveContainer" 
containerID="5c6250321ba72cb24e5f7c65d6de2e6e678f16ef9ef5b36c9bbe06fcd8b2073e" Dec 05 20:38:59 crc kubenswrapper[4904]: I1205 20:38:59.955843 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:38:59 crc kubenswrapper[4904]: I1205 20:38:59.957606 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:39:17 crc kubenswrapper[4904]: I1205 20:39:17.286990 4904 scope.go:117] "RemoveContainer" containerID="f2627d2e6a793ad96a31df3f93d55432e268348872a9265f0c6eb7c6070f7bfa" Dec 05 20:39:29 crc kubenswrapper[4904]: I1205 20:39:29.956129 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:39:29 crc kubenswrapper[4904]: I1205 20:39:29.956910 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:39:59 crc kubenswrapper[4904]: I1205 20:39:59.955476 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:39:59 crc kubenswrapper[4904]: I1205 20:39:59.957146 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:39:59 crc kubenswrapper[4904]: I1205 20:39:59.957251 4904 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" Dec 05 20:39:59 crc kubenswrapper[4904]: I1205 20:39:59.957988 4904 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dfcab6d899b1212bd971ed9361e63cd19363ddee7d6637335cdbac95f7926ba1"} pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 20:39:59 crc kubenswrapper[4904]: I1205 20:39:59.958136 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" containerID="cri-o://dfcab6d899b1212bd971ed9361e63cd19363ddee7d6637335cdbac95f7926ba1" gracePeriod=600 Dec 05 20:40:00 crc 
kubenswrapper[4904]: E1205 20:40:00.135918 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:40:00 crc kubenswrapper[4904]: I1205 20:40:00.580999 4904 generic.go:334] "Generic (PLEG): container finished" podID="1cc24b64-e25f-4b55-9123-295388685e7a" containerID="dfcab6d899b1212bd971ed9361e63cd19363ddee7d6637335cdbac95f7926ba1" exitCode=0 Dec 05 20:40:00 crc kubenswrapper[4904]: I1205 20:40:00.581071 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" event={"ID":"1cc24b64-e25f-4b55-9123-295388685e7a","Type":"ContainerDied","Data":"dfcab6d899b1212bd971ed9361e63cd19363ddee7d6637335cdbac95f7926ba1"} Dec 05 20:40:00 crc kubenswrapper[4904]: I1205 20:40:00.581327 4904 scope.go:117] "RemoveContainer" containerID="3a95f83a14a35574f3a4cccf8ae06571e374650e21ab79c52d42113e0e6de1c7" Dec 05 20:40:00 crc kubenswrapper[4904]: I1205 20:40:00.581996 4904 scope.go:117] "RemoveContainer" containerID="dfcab6d899b1212bd971ed9361e63cd19363ddee7d6637335cdbac95f7926ba1" Dec 05 20:40:00 crc kubenswrapper[4904]: E1205 20:40:00.582299 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:40:11 crc kubenswrapper[4904]: I1205 20:40:11.689023 4904 scope.go:117] "RemoveContainer" containerID="dfcab6d899b1212bd971ed9361e63cd19363ddee7d6637335cdbac95f7926ba1" Dec 05 20:40:11 crc kubenswrapper[4904]: E1205 20:40:11.689841 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:40:25 crc kubenswrapper[4904]: I1205 20:40:25.681751 4904 scope.go:117] "RemoveContainer" containerID="dfcab6d899b1212bd971ed9361e63cd19363ddee7d6637335cdbac95f7926ba1" Dec 05 20:40:25 crc kubenswrapper[4904]: E1205 20:40:25.682664 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:40:39 crc kubenswrapper[4904]: I1205 20:40:39.681649 4904 scope.go:117] "RemoveContainer" containerID="dfcab6d899b1212bd971ed9361e63cd19363ddee7d6637335cdbac95f7926ba1" Dec 05 20:40:39 crc kubenswrapper[4904]: E1205 20:40:39.682417 4904 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:40:53 crc kubenswrapper[4904]: I1205 20:40:53.681694 4904 scope.go:117] "RemoveContainer" containerID="dfcab6d899b1212bd971ed9361e63cd19363ddee7d6637335cdbac95f7926ba1" Dec 05 20:40:53 crc kubenswrapper[4904]: E1205 20:40:53.682834 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:41:08 crc kubenswrapper[4904]: I1205 20:41:08.683442 4904 scope.go:117] "RemoveContainer" containerID="dfcab6d899b1212bd971ed9361e63cd19363ddee7d6637335cdbac95f7926ba1" Dec 05 20:41:08 crc kubenswrapper[4904]: E1205 20:41:08.684228 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:41:14 crc kubenswrapper[4904]: I1205 20:41:14.439834 4904 generic.go:334] "Generic (PLEG): container finished" podID="2277bc86-3475-44fd-a77d-9a2f552bb457" containerID="dcf596a5d7e5720d87ab6c83417a571c52c3108c1025f3508f819a61f7132e6a" exitCode=0 Dec 05 20:41:14 crc kubenswrapper[4904]: I1205 20:41:14.439987 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhb8" event={"ID":"2277bc86-3475-44fd-a77d-9a2f552bb457","Type":"ContainerDied","Data":"dcf596a5d7e5720d87ab6c83417a571c52c3108c1025f3508f819a61f7132e6a"} Dec 05 20:41:15 crc kubenswrapper[4904]: I1205 20:41:15.889254 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhb8" Dec 05 20:41:16 crc kubenswrapper[4904]: I1205 20:41:16.003937 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2277bc86-3475-44fd-a77d-9a2f552bb457-bootstrap-combined-ca-bundle\") pod \"2277bc86-3475-44fd-a77d-9a2f552bb457\" (UID: \"2277bc86-3475-44fd-a77d-9a2f552bb457\") " Dec 05 20:41:16 crc kubenswrapper[4904]: I1205 20:41:16.004167 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2277bc86-3475-44fd-a77d-9a2f552bb457-inventory\") pod \"2277bc86-3475-44fd-a77d-9a2f552bb457\" (UID: \"2277bc86-3475-44fd-a77d-9a2f552bb457\") " Dec 05 20:41:16 crc kubenswrapper[4904]: I1205 20:41:16.004277 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2277bc86-3475-44fd-a77d-9a2f552bb457-ssh-key\") pod \"2277bc86-3475-44fd-a77d-9a2f552bb457\" (UID: \"2277bc86-3475-44fd-a77d-9a2f552bb457\") " Dec 05 20:41:16 crc kubenswrapper[4904]: I1205 20:41:16.004426 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lk894\" (UniqueName: \"kubernetes.io/projected/2277bc86-3475-44fd-a77d-9a2f552bb457-kube-api-access-lk894\") pod \"2277bc86-3475-44fd-a77d-9a2f552bb457\" (UID: \"2277bc86-3475-44fd-a77d-9a2f552bb457\") " Dec 05 20:41:16 crc kubenswrapper[4904]: I1205 20:41:16.010877 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2277bc86-3475-44fd-a77d-9a2f552bb457-kube-api-access-lk894" (OuterVolumeSpecName: "kube-api-access-lk894") pod "2277bc86-3475-44fd-a77d-9a2f552bb457" (UID: "2277bc86-3475-44fd-a77d-9a2f552bb457"). InnerVolumeSpecName "kube-api-access-lk894". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:41:16 crc kubenswrapper[4904]: I1205 20:41:16.011256 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2277bc86-3475-44fd-a77d-9a2f552bb457-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "2277bc86-3475-44fd-a77d-9a2f552bb457" (UID: "2277bc86-3475-44fd-a77d-9a2f552bb457"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:41:16 crc kubenswrapper[4904]: I1205 20:41:16.037697 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2277bc86-3475-44fd-a77d-9a2f552bb457-inventory" (OuterVolumeSpecName: "inventory") pod "2277bc86-3475-44fd-a77d-9a2f552bb457" (UID: "2277bc86-3475-44fd-a77d-9a2f552bb457"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:41:16 crc kubenswrapper[4904]: I1205 20:41:16.039824 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2277bc86-3475-44fd-a77d-9a2f552bb457-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2277bc86-3475-44fd-a77d-9a2f552bb457" (UID: "2277bc86-3475-44fd-a77d-9a2f552bb457"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:41:16 crc kubenswrapper[4904]: I1205 20:41:16.107943 4904 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2277bc86-3475-44fd-a77d-9a2f552bb457-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:16 crc kubenswrapper[4904]: I1205 20:41:16.107997 4904 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2277bc86-3475-44fd-a77d-9a2f552bb457-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:16 crc kubenswrapper[4904]: I1205 20:41:16.108016 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lk894\" (UniqueName: \"kubernetes.io/projected/2277bc86-3475-44fd-a77d-9a2f552bb457-kube-api-access-lk894\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:16 crc kubenswrapper[4904]: I1205 20:41:16.108036 4904 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2277bc86-3475-44fd-a77d-9a2f552bb457-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:16 crc kubenswrapper[4904]: I1205 20:41:16.465145 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhb8" event={"ID":"2277bc86-3475-44fd-a77d-9a2f552bb457","Type":"ContainerDied","Data":"838ec81f57f99ed3e63488c04d0ac10a01bc6c81d2dc0a000e6e33331c27968d"} Dec 05 20:41:16 crc kubenswrapper[4904]: I1205 20:41:16.465471 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="838ec81f57f99ed3e63488c04d0ac10a01bc6c81d2dc0a000e6e33331c27968d" Dec 05 20:41:16 crc kubenswrapper[4904]: I1205 20:41:16.465271 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhb8" Dec 05 20:41:16 crc kubenswrapper[4904]: I1205 20:41:16.596497 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s8jb4"] Dec 05 20:41:16 crc kubenswrapper[4904]: E1205 20:41:16.597435 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8336d8e-3601-4530-84a3-fb35c1ad6ca2" containerName="registry-server" Dec 05 20:41:16 crc kubenswrapper[4904]: I1205 20:41:16.597456 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8336d8e-3601-4530-84a3-fb35c1ad6ca2" containerName="registry-server" Dec 05 20:41:16 crc kubenswrapper[4904]: E1205 20:41:16.597474 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8336d8e-3601-4530-84a3-fb35c1ad6ca2" containerName="extract-utilities" Dec 05 20:41:16 crc kubenswrapper[4904]: I1205 20:41:16.597481 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8336d8e-3601-4530-84a3-fb35c1ad6ca2" containerName="extract-utilities" Dec 05 20:41:16 crc kubenswrapper[4904]: E1205 20:41:16.597494 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8336d8e-3601-4530-84a3-fb35c1ad6ca2" containerName="extract-content" Dec 05 20:41:16 crc kubenswrapper[4904]: I1205 20:41:16.597500 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8336d8e-3601-4530-84a3-fb35c1ad6ca2" containerName="extract-content" Dec 05 20:41:16 crc kubenswrapper[4904]: E1205 20:41:16.597515 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2277bc86-3475-44fd-a77d-9a2f552bb457" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 05 20:41:16 crc kubenswrapper[4904]: I1205 20:41:16.597523 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="2277bc86-3475-44fd-a77d-9a2f552bb457" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 05 20:41:16 crc kubenswrapper[4904]: I1205 20:41:16.597877 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="2277bc86-3475-44fd-a77d-9a2f552bb457" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 05 20:41:16 crc kubenswrapper[4904]: I1205 20:41:16.597905 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8336d8e-3601-4530-84a3-fb35c1ad6ca2" containerName="registry-server" Dec 05 20:41:16 crc kubenswrapper[4904]: I1205 20:41:16.598788 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s8jb4" Dec 05 20:41:16 crc kubenswrapper[4904]: I1205 20:41:16.602347 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 20:41:16 crc kubenswrapper[4904]: I1205 20:41:16.603695 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 20:41:16 crc kubenswrapper[4904]: I1205 20:41:16.603935 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pf9pw" Dec 05 20:41:16 crc kubenswrapper[4904]: I1205 20:41:16.605212 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 20:41:16 crc kubenswrapper[4904]: I1205 20:41:16.619569 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s8jb4"] Dec 05 20:41:16 crc kubenswrapper[4904]: I1205 20:41:16.721183 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2b8cd007-58c8-4fd2-924d-a8d24608ff6c-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-s8jb4\" (UID: \"2b8cd007-58c8-4fd2-924d-a8d24608ff6c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s8jb4" Dec 05 20:41:16 crc kubenswrapper[4904]: I1205 20:41:16.721333 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b8cd007-58c8-4fd2-924d-a8d24608ff6c-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-s8jb4\" (UID: \"2b8cd007-58c8-4fd2-924d-a8d24608ff6c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s8jb4" Dec 05 20:41:16 crc kubenswrapper[4904]: I1205 20:41:16.721370 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dwkd\" (UniqueName: \"kubernetes.io/projected/2b8cd007-58c8-4fd2-924d-a8d24608ff6c-kube-api-access-4dwkd\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-s8jb4\" (UID: \"2b8cd007-58c8-4fd2-924d-a8d24608ff6c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s8jb4" Dec 05 20:41:16 crc kubenswrapper[4904]: I1205 20:41:16.823660 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2b8cd007-58c8-4fd2-924d-a8d24608ff6c-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-s8jb4\" (UID: \"2b8cd007-58c8-4fd2-924d-a8d24608ff6c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s8jb4" Dec 05 20:41:16 crc kubenswrapper[4904]: I1205 20:41:16.823875 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b8cd007-58c8-4fd2-924d-a8d24608ff6c-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-s8jb4\" (UID: \"2b8cd007-58c8-4fd2-924d-a8d24608ff6c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s8jb4" Dec 05 20:41:16 crc kubenswrapper[4904]: I1205 20:41:16.823927 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dwkd\" (UniqueName: \"kubernetes.io/projected/2b8cd007-58c8-4fd2-924d-a8d24608ff6c-kube-api-access-4dwkd\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-s8jb4\" (UID: \"2b8cd007-58c8-4fd2-924d-a8d24608ff6c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s8jb4" Dec 05 20:41:16 crc kubenswrapper[4904]: I1205 20:41:16.830036 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2b8cd007-58c8-4fd2-924d-a8d24608ff6c-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-s8jb4\" (UID: \"2b8cd007-58c8-4fd2-924d-a8d24608ff6c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s8jb4" Dec 05 20:41:16 crc kubenswrapper[4904]: I1205 20:41:16.844618 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b8cd007-58c8-4fd2-924d-a8d24608ff6c-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-s8jb4\" (UID: \"2b8cd007-58c8-4fd2-924d-a8d24608ff6c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s8jb4" Dec 05 20:41:16 crc kubenswrapper[4904]: I1205 20:41:16.848972 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dwkd\" (UniqueName: \"kubernetes.io/projected/2b8cd007-58c8-4fd2-924d-a8d24608ff6c-kube-api-access-4dwkd\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-s8jb4\" (UID: \"2b8cd007-58c8-4fd2-924d-a8d24608ff6c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s8jb4" Dec 05 20:41:16 crc kubenswrapper[4904]: I1205 20:41:16.950258 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s8jb4" Dec 05 20:41:17 crc kubenswrapper[4904]: I1205 20:41:17.411726 4904 scope.go:117] "RemoveContainer" containerID="f0f9ae5c03aec2962370e8f71363d77b279f1c0ebd2e18773180fb5f37d9361d" Dec 05 20:41:17 crc kubenswrapper[4904]: I1205 20:41:17.441740 4904 scope.go:117] "RemoveContainer" containerID="d2f232c93cb3e4e4268e4b753b952516e49808cd8f34e03bdb66945b5d2f0dd6" Dec 05 20:41:17 crc kubenswrapper[4904]: I1205 20:41:17.467084 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s8jb4"] Dec 05 20:41:17 crc kubenswrapper[4904]: I1205 20:41:17.480255 4904 scope.go:117] "RemoveContainer" containerID="e20b1c5635af73ad3f022b2250a6d02a0f2beaac421118e9022ea3e53ce02e30" Dec 05 20:41:17 crc kubenswrapper[4904]: W1205 20:41:17.490588 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b8cd007_58c8_4fd2_924d_a8d24608ff6c.slice/crio-3323e767b54dc0e8746a206b4fb2adbcff61a78b1b591201220c618b428be536 WatchSource:0}: Error finding container 3323e767b54dc0e8746a206b4fb2adbcff61a78b1b591201220c618b428be536: Status 404 returned error can't find the container with id 3323e767b54dc0e8746a206b4fb2adbcff61a78b1b591201220c618b428be536 Dec 05 20:41:17 crc kubenswrapper[4904]: I1205 20:41:17.494282 4904 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 20:41:18 crc kubenswrapper[4904]: I1205 20:41:18.042741 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-6kxzb"] Dec 05 20:41:18 crc kubenswrapper[4904]: I1205 20:41:18.053074 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-pmt5n"] Dec 05 20:41:18 crc kubenswrapper[4904]: I1205 20:41:18.062726 4904 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/keystone-6a25-account-create-update-tn8bx"] Dec 05 20:41:18 crc kubenswrapper[4904]: I1205 20:41:18.073703 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-pmt5n"] Dec 05 20:41:18 crc kubenswrapper[4904]: I1205 20:41:18.081647 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-e624-account-create-update-x5jhz"] Dec 05 20:41:18 crc kubenswrapper[4904]: I1205 20:41:18.089758 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-6a25-account-create-update-tn8bx"] Dec 05 20:41:18 crc kubenswrapper[4904]: I1205 20:41:18.097492 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-6kxzb"] Dec 05 20:41:18 crc kubenswrapper[4904]: I1205 20:41:18.104524 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-e624-account-create-update-x5jhz"] Dec 05 20:41:18 crc kubenswrapper[4904]: I1205 20:41:18.495614 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s8jb4" event={"ID":"2b8cd007-58c8-4fd2-924d-a8d24608ff6c","Type":"ContainerStarted","Data":"f556567b4766f4e5d1078731b261a4839f89f8eebf37559fec9aae38aafdfa03"} Dec 05 20:41:18 crc kubenswrapper[4904]: I1205 20:41:18.495927 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s8jb4" event={"ID":"2b8cd007-58c8-4fd2-924d-a8d24608ff6c","Type":"ContainerStarted","Data":"3323e767b54dc0e8746a206b4fb2adbcff61a78b1b591201220c618b428be536"} Dec 05 20:41:18 crc kubenswrapper[4904]: I1205 20:41:18.525650 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s8jb4" podStartSLOduration=2.041928732 podStartE2EDuration="2.525592965s" podCreationTimestamp="2025-12-05 20:41:16 +0000 UTC" firstStartedPulling="2025-12-05 20:41:17.493873545 +0000 UTC m=+1776.305089664" lastFinishedPulling="2025-12-05 20:41:17.977537768 +0000 UTC m=+1776.788753897" observedRunningTime="2025-12-05 20:41:18.513001428 +0000 UTC m=+1777.324217547" watchObservedRunningTime="2025-12-05 20:41:18.525592965 +0000 UTC m=+1777.336809094" Dec 05 20:41:19 crc kubenswrapper[4904]: I1205 20:41:19.695337 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c5a51a5-3628-495b-a5c2-f29846ae9eb4" path="/var/lib/kubelet/pods/7c5a51a5-3628-495b-a5c2-f29846ae9eb4/volumes" Dec 05 20:41:19 crc kubenswrapper[4904]: I1205 20:41:19.696383 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fc955d4-a981-42e4-b327-558d95c0a9c0" path="/var/lib/kubelet/pods/8fc955d4-a981-42e4-b327-558d95c0a9c0/volumes" Dec 05 20:41:19 crc kubenswrapper[4904]: I1205 20:41:19.697365 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a294d371-9e50-48f4-9774-bfb2f8014ea6" path="/var/lib/kubelet/pods/a294d371-9e50-48f4-9774-bfb2f8014ea6/volumes" Dec 05 20:41:19 crc kubenswrapper[4904]: I1205 20:41:19.698007 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0e2f8b9-44dd-4a78-9ce8-8e9f401be67e" path="/var/lib/kubelet/pods/d0e2f8b9-44dd-4a78-9ce8-8e9f401be67e/volumes" Dec 05 20:41:21 crc kubenswrapper[4904]: I1205 20:41:21.041793 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-create-bg2sh"] Dec 05 20:41:21 crc kubenswrapper[4904]: I1205 20:41:21.054706 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/watcher-7ffc-account-create-update-wbqpp"] Dec 05 20:41:21 crc kubenswrapper[4904]: I1205 20:41:21.064081 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-7ffc-account-create-update-wbqpp"] Dec 05 20:41:21 crc kubenswrapper[4904]: I1205 20:41:21.073427 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-create-bg2sh"] Dec 05 20:41:21 crc kubenswrapper[4904]: I1205 20:41:21.694744 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="346e1889-b00c-462f-8b7f-dbcbc159f6ae" path="/var/lib/kubelet/pods/346e1889-b00c-462f-8b7f-dbcbc159f6ae/volumes" Dec 05 20:41:21 crc kubenswrapper[4904]: I1205 20:41:21.695466 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4852903-4c6d-47bf-a507-0c1f700d3470" path="/var/lib/kubelet/pods/c4852903-4c6d-47bf-a507-0c1f700d3470/volumes" Dec 05 20:41:23 crc kubenswrapper[4904]: I1205 20:41:23.681821 4904 scope.go:117] "RemoveContainer" containerID="dfcab6d899b1212bd971ed9361e63cd19363ddee7d6637335cdbac95f7926ba1" Dec 05 20:41:23 crc kubenswrapper[4904]: E1205 20:41:23.682362 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:41:37 crc kubenswrapper[4904]: I1205 20:41:37.681500 4904 scope.go:117] "RemoveContainer" containerID="dfcab6d899b1212bd971ed9361e63cd19363ddee7d6637335cdbac95f7926ba1" Dec 05 20:41:37 crc kubenswrapper[4904]: E1205 20:41:37.682604 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:41:48 crc kubenswrapper[4904]: I1205 20:41:48.681646 4904 scope.go:117] "RemoveContainer" containerID="dfcab6d899b1212bd971ed9361e63cd19363ddee7d6637335cdbac95f7926ba1" Dec 05 20:41:48 crc kubenswrapper[4904]: E1205 20:41:48.683407 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:41:59 crc kubenswrapper[4904]: I1205 20:41:59.682333 4904 scope.go:117] "RemoveContainer" containerID="dfcab6d899b1212bd971ed9361e63cd19363ddee7d6637335cdbac95f7926ba1" Dec 05 20:41:59 crc kubenswrapper[4904]: E1205 20:41:59.683198 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" 
podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:42:05 crc kubenswrapper[4904]: I1205 20:42:05.043540 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-6wgsf"] Dec 05 20:42:05 crc kubenswrapper[4904]: I1205 20:42:05.055840 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-6wgsf"] Dec 05 20:42:05 crc kubenswrapper[4904]: I1205 20:42:05.694343 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14b02093-4f8b-4c58-be23-1bcda58307c2" path="/var/lib/kubelet/pods/14b02093-4f8b-4c58-be23-1bcda58307c2/volumes" Dec 05 20:42:12 crc kubenswrapper[4904]: I1205 20:42:12.035419 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-npdbj"] Dec 05 20:42:12 crc kubenswrapper[4904]: I1205 20:42:12.048688 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-npdbj"] Dec 05 20:42:12 crc kubenswrapper[4904]: I1205 20:42:12.681357 4904 scope.go:117] "RemoveContainer" containerID="dfcab6d899b1212bd971ed9361e63cd19363ddee7d6637335cdbac95f7926ba1" Dec 05 20:42:12 crc kubenswrapper[4904]: E1205 20:42:12.681892 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:42:13 crc kubenswrapper[4904]: I1205 20:42:13.058024 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-hgvtd"] Dec 05 20:42:13 crc kubenswrapper[4904]: I1205 20:42:13.076241 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-7b45-account-create-update-dzcld"] Dec 05 20:42:13 crc kubenswrapper[4904]: I1205 20:42:13.096555 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-2bc2-account-create-update-f95zs"] Dec 05 20:42:13 crc kubenswrapper[4904]: I1205 20:42:13.113598 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-hgvtd"] Dec 05 20:42:13 crc kubenswrapper[4904]: I1205 20:42:13.122999 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-2bc2-account-create-update-f95zs"] Dec 05 20:42:13 crc kubenswrapper[4904]: I1205 20:42:13.132324 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-7b45-account-create-update-dzcld"] Dec 05 20:42:13 crc kubenswrapper[4904]: I1205 20:42:13.695249 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0451215c-c2fd-4113-b130-948cde7a8537" path="/var/lib/kubelet/pods/0451215c-c2fd-4113-b130-948cde7a8537/volumes" Dec 05 20:42:13 crc kubenswrapper[4904]: I1205 20:42:13.696039 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c42e4b0-e039-4945-8c9c-fe12766434bd" path="/var/lib/kubelet/pods/5c42e4b0-e039-4945-8c9c-fe12766434bd/volumes" Dec 05 20:42:13 crc kubenswrapper[4904]: I1205 20:42:13.696812 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78120578-1e01-44dd-b0ed-908e6d0103df" path="/var/lib/kubelet/pods/78120578-1e01-44dd-b0ed-908e6d0103df/volumes" Dec 05 20:42:13 crc kubenswrapper[4904]: I1205 20:42:13.697805 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6e3a4d4-d72b-49db-92c9-214e39f84632" 
path="/var/lib/kubelet/pods/b6e3a4d4-d72b-49db-92c9-214e39f84632/volumes" Dec 05 20:42:15 crc kubenswrapper[4904]: I1205 20:42:15.037372 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-g6b9b"] Dec 05 20:42:15 crc kubenswrapper[4904]: I1205 20:42:15.047463 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-dcc2-account-create-update-6b8b8"] Dec 05 20:42:15 crc kubenswrapper[4904]: I1205 20:42:15.058701 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-dcc2-account-create-update-6b8b8"] Dec 05 20:42:15 crc kubenswrapper[4904]: I1205 20:42:15.069216 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-g6b9b"] Dec 05 20:42:15 crc kubenswrapper[4904]: I1205 20:42:15.078218 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-9154-account-create-update-cmzk9"] Dec 05 20:42:15 crc kubenswrapper[4904]: I1205 20:42:15.086628 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-9154-account-create-update-cmzk9"] Dec 05 20:42:15 crc kubenswrapper[4904]: I1205 20:42:15.695994 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d40e91b3-9613-4d07-9830-5ec0279bfae3" path="/var/lib/kubelet/pods/d40e91b3-9613-4d07-9830-5ec0279bfae3/volumes" Dec 05 20:42:15 crc kubenswrapper[4904]: I1205 20:42:15.697132 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f23caf25-58e4-47ac-aa1f-a49e5df687c2" path="/var/lib/kubelet/pods/f23caf25-58e4-47ac-aa1f-a49e5df687c2/volumes" Dec 05 20:42:15 crc kubenswrapper[4904]: I1205 20:42:15.697700 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcc402d2-1cfe-4db2-a189-3f376e530162" path="/var/lib/kubelet/pods/fcc402d2-1cfe-4db2-a189-3f376e530162/volumes" Dec 05 20:42:17 crc kubenswrapper[4904]: I1205 20:42:17.579516 4904 scope.go:117] "RemoveContainer" containerID="14bf7a210112021292eda1a3b7e95d9a478e35bfab32dcb8b7d7695ebdfb3e88" Dec 05 20:42:17 crc kubenswrapper[4904]: I1205 20:42:17.618542 4904 scope.go:117] "RemoveContainer" containerID="34be13a802bce6e27c100c3085b8c02a27e009f51795636b75fdc8a933e5113e" Dec 05 20:42:17 crc kubenswrapper[4904]: I1205 20:42:17.677960 4904 scope.go:117] "RemoveContainer" containerID="74c92f7975ed00a4276d5fb9fa6535e661fa9f9e875ee6fc1892a0ff240b406c" Dec 05 20:42:17 crc kubenswrapper[4904]: I1205 20:42:17.726635 4904 scope.go:117] "RemoveContainer" containerID="45576a418e493916e0c8affdd647f8bc9548341ad9c5f68a7ec2631889949e6d" Dec 05 20:42:17 crc kubenswrapper[4904]: I1205 20:42:17.779089 4904 scope.go:117] "RemoveContainer" containerID="ebf0c792ade215724a97fd3925d07c9598ca6c0ff5d45f9a735449411bd4811f" Dec 05 20:42:17 crc kubenswrapper[4904]: I1205 20:42:17.821925 4904 scope.go:117] "RemoveContainer" containerID="4a36c3a7b160a8b5550aed14e2a223314e9fb58c9017b32feabd2fad3ebbc524" Dec 05 20:42:17 crc kubenswrapper[4904]: I1205 20:42:17.873650 4904 scope.go:117] "RemoveContainer" containerID="4ba2d394450df831d2dfaf242c743a8a1c320f6800383242e9a5c000be4e77a1" Dec 05 20:42:17 crc kubenswrapper[4904]: I1205 20:42:17.897449 4904 scope.go:117] "RemoveContainer" containerID="c58b4b306172ccd251ea711a6b0ed7c9e1e33fd1fe64512a2b185fae9f1733f4" Dec 05 20:42:17 crc kubenswrapper[4904]: I1205 20:42:17.918243 4904 scope.go:117] "RemoveContainer" containerID="18b66a1404cd78a9d1ffb1091d77c00d4e63a4385bf007a23b25c8661cf9e341" Dec 05 20:42:17 crc kubenswrapper[4904]: I1205 20:42:17.942838 4904 scope.go:117] "RemoveContainer" 
containerID="064a0fffb699f793417101d84460fa4df252d56d9f87f67532267d3d0adbade9" Dec 05 20:42:17 crc kubenswrapper[4904]: I1205 20:42:17.966233 4904 scope.go:117] "RemoveContainer" containerID="bfd86e47d10357a634ac3b5ae1cdb7655dbad01965dfd1a6d945f1b97e4c27e2" Dec 05 20:42:18 crc kubenswrapper[4904]: I1205 20:42:18.006416 4904 scope.go:117] "RemoveContainer" containerID="5c93fb6698dc26948b69efc50059e9b2b946fec830fbe9895bb1185c5bcc06ba" Dec 05 20:42:18 crc kubenswrapper[4904]: I1205 20:42:18.057721 4904 scope.go:117] "RemoveContainer" containerID="7c29d4eeb5a10031257ed9b236349bc927411c71caa97fc8e0a4f7875f7ef270" Dec 05 20:42:18 crc kubenswrapper[4904]: I1205 20:42:18.094829 4904 scope.go:117] "RemoveContainer" containerID="9111b5913060486a686f0e686fb8e869fc80f8121c854a0c00b4d9dd70c0de4c" Dec 05 20:42:20 crc kubenswrapper[4904]: I1205 20:42:20.027536 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-sync-dkcsq"] Dec 05 20:42:20 crc kubenswrapper[4904]: I1205 20:42:20.060398 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-sync-dkcsq"] Dec 05 20:42:21 crc kubenswrapper[4904]: I1205 20:42:21.032715 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-6474s"] Dec 05 20:42:21 crc kubenswrapper[4904]: I1205 20:42:21.041929 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-6474s"] Dec 05 20:42:21 crc kubenswrapper[4904]: I1205 20:42:21.692625 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73c8de0c-093e-49ca-b80a-cb990b546a50" path="/var/lib/kubelet/pods/73c8de0c-093e-49ca-b80a-cb990b546a50/volumes" Dec 05 20:42:21 crc kubenswrapper[4904]: I1205 20:42:21.693417 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e323a682-f130-41c5-b97b-b7cd6ab4aecf" path="/var/lib/kubelet/pods/e323a682-f130-41c5-b97b-b7cd6ab4aecf/volumes" Dec 05 20:42:23 crc kubenswrapper[4904]: I1205 20:42:23.682359 4904 scope.go:117] "RemoveContainer" containerID="dfcab6d899b1212bd971ed9361e63cd19363ddee7d6637335cdbac95f7926ba1" Dec 05 20:42:23 crc kubenswrapper[4904]: E1205 20:42:23.682627 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:42:37 crc kubenswrapper[4904]: I1205 20:42:37.681624 4904 scope.go:117] "RemoveContainer" containerID="dfcab6d899b1212bd971ed9361e63cd19363ddee7d6637335cdbac95f7926ba1" Dec 05 20:42:37 crc kubenswrapper[4904]: E1205 20:42:37.682375 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:42:50 crc kubenswrapper[4904]: I1205 20:42:50.681978 4904 scope.go:117] "RemoveContainer" containerID="dfcab6d899b1212bd971ed9361e63cd19363ddee7d6637335cdbac95f7926ba1" Dec 05 20:42:50 crc kubenswrapper[4904]: E1205 20:42:50.683176 4904 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:42:58 crc kubenswrapper[4904]: I1205 20:42:58.670032 4904 generic.go:334] "Generic (PLEG): container finished" podID="2b8cd007-58c8-4fd2-924d-a8d24608ff6c" containerID="f556567b4766f4e5d1078731b261a4839f89f8eebf37559fec9aae38aafdfa03" exitCode=0 Dec 05 20:42:58 crc kubenswrapper[4904]: I1205 20:42:58.670207 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s8jb4" event={"ID":"2b8cd007-58c8-4fd2-924d-a8d24608ff6c","Type":"ContainerDied","Data":"f556567b4766f4e5d1078731b261a4839f89f8eebf37559fec9aae38aafdfa03"} Dec 05 20:43:00 crc kubenswrapper[4904]: I1205 20:43:00.105796 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s8jb4" Dec 05 20:43:00 crc kubenswrapper[4904]: I1205 20:43:00.250429 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b8cd007-58c8-4fd2-924d-a8d24608ff6c-inventory\") pod \"2b8cd007-58c8-4fd2-924d-a8d24608ff6c\" (UID: \"2b8cd007-58c8-4fd2-924d-a8d24608ff6c\") " Dec 05 20:43:00 crc kubenswrapper[4904]: I1205 20:43:00.250545 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dwkd\" (UniqueName: \"kubernetes.io/projected/2b8cd007-58c8-4fd2-924d-a8d24608ff6c-kube-api-access-4dwkd\") pod \"2b8cd007-58c8-4fd2-924d-a8d24608ff6c\" (UID: \"2b8cd007-58c8-4fd2-924d-a8d24608ff6c\") " Dec 05 20:43:00 crc kubenswrapper[4904]: I1205 20:43:00.250596 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2b8cd007-58c8-4fd2-924d-a8d24608ff6c-ssh-key\") pod \"2b8cd007-58c8-4fd2-924d-a8d24608ff6c\" (UID: \"2b8cd007-58c8-4fd2-924d-a8d24608ff6c\") " Dec 05 20:43:00 crc kubenswrapper[4904]: I1205 20:43:00.255796 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b8cd007-58c8-4fd2-924d-a8d24608ff6c-kube-api-access-4dwkd" (OuterVolumeSpecName: "kube-api-access-4dwkd") pod "2b8cd007-58c8-4fd2-924d-a8d24608ff6c" (UID: "2b8cd007-58c8-4fd2-924d-a8d24608ff6c"). InnerVolumeSpecName "kube-api-access-4dwkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:43:00 crc kubenswrapper[4904]: I1205 20:43:00.280619 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b8cd007-58c8-4fd2-924d-a8d24608ff6c-inventory" (OuterVolumeSpecName: "inventory") pod "2b8cd007-58c8-4fd2-924d-a8d24608ff6c" (UID: "2b8cd007-58c8-4fd2-924d-a8d24608ff6c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:43:00 crc kubenswrapper[4904]: I1205 20:43:00.282891 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b8cd007-58c8-4fd2-924d-a8d24608ff6c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2b8cd007-58c8-4fd2-924d-a8d24608ff6c" (UID: "2b8cd007-58c8-4fd2-924d-a8d24608ff6c"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:43:00 crc kubenswrapper[4904]: I1205 20:43:00.352483 4904 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b8cd007-58c8-4fd2-924d-a8d24608ff6c-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 20:43:00 crc kubenswrapper[4904]: I1205 20:43:00.352542 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dwkd\" (UniqueName: \"kubernetes.io/projected/2b8cd007-58c8-4fd2-924d-a8d24608ff6c-kube-api-access-4dwkd\") on node \"crc\" DevicePath \"\"" Dec 05 20:43:00 crc kubenswrapper[4904]: I1205 20:43:00.352554 4904 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2b8cd007-58c8-4fd2-924d-a8d24608ff6c-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 20:43:00 crc kubenswrapper[4904]: I1205 20:43:00.693038 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s8jb4" event={"ID":"2b8cd007-58c8-4fd2-924d-a8d24608ff6c","Type":"ContainerDied","Data":"3323e767b54dc0e8746a206b4fb2adbcff61a78b1b591201220c618b428be536"} Dec 05 20:43:00 crc kubenswrapper[4904]: I1205 20:43:00.693599 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3323e767b54dc0e8746a206b4fb2adbcff61a78b1b591201220c618b428be536" Dec 05 20:43:00 crc kubenswrapper[4904]: I1205 20:43:00.693163 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s8jb4" Dec 05 20:43:00 crc kubenswrapper[4904]: I1205 20:43:00.779250 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jvqvx"] Dec 05 20:43:00 crc kubenswrapper[4904]: E1205 20:43:00.779809 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b8cd007-58c8-4fd2-924d-a8d24608ff6c" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 05 20:43:00 crc kubenswrapper[4904]: I1205 20:43:00.779825 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b8cd007-58c8-4fd2-924d-a8d24608ff6c" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 05 20:43:00 crc kubenswrapper[4904]: I1205 20:43:00.780050 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b8cd007-58c8-4fd2-924d-a8d24608ff6c" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 05 20:43:00 crc kubenswrapper[4904]: I1205 20:43:00.780763 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jvqvx" Dec 05 20:43:00 crc kubenswrapper[4904]: I1205 20:43:00.783796 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 20:43:00 crc kubenswrapper[4904]: I1205 20:43:00.783981 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pf9pw" Dec 05 20:43:00 crc kubenswrapper[4904]: I1205 20:43:00.783852 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 20:43:00 crc kubenswrapper[4904]: I1205 20:43:00.783900 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 20:43:00 crc kubenswrapper[4904]: I1205 20:43:00.793768 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jvqvx"] Dec 05 20:43:00 crc kubenswrapper[4904]: I1205 20:43:00.861520 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/676e3b5b-34d1-47bc-a1db-3bb15a83282b-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-jvqvx\" (UID: \"676e3b5b-34d1-47bc-a1db-3bb15a83282b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jvqvx" Dec 05 20:43:00 crc kubenswrapper[4904]: I1205 20:43:00.861585 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndhz7\" (UniqueName: \"kubernetes.io/projected/676e3b5b-34d1-47bc-a1db-3bb15a83282b-kube-api-access-ndhz7\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-jvqvx\" (UID: \"676e3b5b-34d1-47bc-a1db-3bb15a83282b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jvqvx" Dec 05 20:43:00 crc kubenswrapper[4904]: I1205 20:43:00.861742 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/676e3b5b-34d1-47bc-a1db-3bb15a83282b-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-jvqvx\" (UID: \"676e3b5b-34d1-47bc-a1db-3bb15a83282b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jvqvx" Dec 05 20:43:00 crc kubenswrapper[4904]: I1205 20:43:00.963964 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/676e3b5b-34d1-47bc-a1db-3bb15a83282b-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-jvqvx\" (UID: \"676e3b5b-34d1-47bc-a1db-3bb15a83282b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jvqvx" Dec 05 20:43:00 crc kubenswrapper[4904]: I1205 20:43:00.964044 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndhz7\" (UniqueName: \"kubernetes.io/projected/676e3b5b-34d1-47bc-a1db-3bb15a83282b-kube-api-access-ndhz7\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-jvqvx\" (UID: \"676e3b5b-34d1-47bc-a1db-3bb15a83282b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jvqvx" Dec 05 20:43:00 crc kubenswrapper[4904]: I1205 20:43:00.964114 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/676e3b5b-34d1-47bc-a1db-3bb15a83282b-inventory\") 
pod \"configure-network-edpm-deployment-openstack-edpm-ipam-jvqvx\" (UID: \"676e3b5b-34d1-47bc-a1db-3bb15a83282b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jvqvx" Dec 05 20:43:00 crc kubenswrapper[4904]: I1205 20:43:00.969196 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/676e3b5b-34d1-47bc-a1db-3bb15a83282b-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-jvqvx\" (UID: \"676e3b5b-34d1-47bc-a1db-3bb15a83282b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jvqvx" Dec 05 20:43:00 crc kubenswrapper[4904]: I1205 20:43:00.971799 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/676e3b5b-34d1-47bc-a1db-3bb15a83282b-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-jvqvx\" (UID: \"676e3b5b-34d1-47bc-a1db-3bb15a83282b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jvqvx" Dec 05 20:43:00 crc kubenswrapper[4904]: I1205 20:43:00.992760 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndhz7\" (UniqueName: \"kubernetes.io/projected/676e3b5b-34d1-47bc-a1db-3bb15a83282b-kube-api-access-ndhz7\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-jvqvx\" (UID: \"676e3b5b-34d1-47bc-a1db-3bb15a83282b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jvqvx" Dec 05 20:43:01 crc kubenswrapper[4904]: I1205 20:43:01.119351 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jvqvx" Dec 05 20:43:01 crc kubenswrapper[4904]: I1205 20:43:01.738587 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jvqvx"] Dec 05 20:43:01 crc kubenswrapper[4904]: W1205 20:43:01.741147 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod676e3b5b_34d1_47bc_a1db_3bb15a83282b.slice/crio-961c32f7dccabd10b7c5aa033354cd7905ab6692bd01d5368852a12ad32d0d8b WatchSource:0}: Error finding container 961c32f7dccabd10b7c5aa033354cd7905ab6692bd01d5368852a12ad32d0d8b: Status 404 returned error can't find the container with id 961c32f7dccabd10b7c5aa033354cd7905ab6692bd01d5368852a12ad32d0d8b Dec 05 20:43:02 crc kubenswrapper[4904]: I1205 20:43:02.709372 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jvqvx" event={"ID":"676e3b5b-34d1-47bc-a1db-3bb15a83282b","Type":"ContainerStarted","Data":"5ea2f88f221eb477a4f92871fe3edbdd781c1c0cb39cc93b199d093b8ff99541"} Dec 05 20:43:02 crc kubenswrapper[4904]: I1205 20:43:02.709949 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jvqvx" event={"ID":"676e3b5b-34d1-47bc-a1db-3bb15a83282b","Type":"ContainerStarted","Data":"961c32f7dccabd10b7c5aa033354cd7905ab6692bd01d5368852a12ad32d0d8b"} Dec 05 20:43:02 crc kubenswrapper[4904]: I1205 20:43:02.739967 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jvqvx" podStartSLOduration=2.339743544 podStartE2EDuration="2.739915329s" podCreationTimestamp="2025-12-05 20:43:00 +0000 UTC" firstStartedPulling="2025-12-05 20:43:01.743614848 +0000 UTC 
m=+1880.554830957" lastFinishedPulling="2025-12-05 20:43:02.143786603 +0000 UTC m=+1880.955002742" observedRunningTime="2025-12-05 20:43:02.727310691 +0000 UTC m=+1881.538526810" watchObservedRunningTime="2025-12-05 20:43:02.739915329 +0000 UTC m=+1881.551131448" Dec 05 20:43:04 crc kubenswrapper[4904]: I1205 20:43:04.682312 4904 scope.go:117] "RemoveContainer" containerID="dfcab6d899b1212bd971ed9361e63cd19363ddee7d6637335cdbac95f7926ba1" Dec 05 20:43:04 crc kubenswrapper[4904]: E1205 20:43:04.683237 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:43:07 crc kubenswrapper[4904]: I1205 20:43:07.052040 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-pjlzn"] Dec 05 20:43:07 crc kubenswrapper[4904]: I1205 20:43:07.065218 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-pjlzn"] Dec 05 20:43:07 crc kubenswrapper[4904]: I1205 20:43:07.700345 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76209519-a745-4eab-9d5d-c330ffb29191" path="/var/lib/kubelet/pods/76209519-a745-4eab-9d5d-c330ffb29191/volumes" Dec 05 20:43:18 crc kubenswrapper[4904]: I1205 20:43:18.405340 4904 scope.go:117] "RemoveContainer" containerID="fc7821d1b2305b566cab8bfc36fd3d6f720db61bdbdb4f4b42ac6e8dc25c2159" Dec 05 20:43:18 crc kubenswrapper[4904]: I1205 20:43:18.449588 4904 scope.go:117] "RemoveContainer" containerID="3d5bca8cafc334fa0ef45643833874577b34af2e5400587d21e0a38e326bc937" Dec 05 20:43:18 crc kubenswrapper[4904]: I1205 20:43:18.497203 4904 scope.go:117] "RemoveContainer" containerID="25a406ebf7dbf4413fbcbc4dbbf6abc7b563f1092458be1a59a7f5bafa31ac27" Dec 05 20:43:18 crc kubenswrapper[4904]: I1205 20:43:18.681882 4904 scope.go:117] "RemoveContainer" containerID="dfcab6d899b1212bd971ed9361e63cd19363ddee7d6637335cdbac95f7926ba1" Dec 05 20:43:18 crc kubenswrapper[4904]: E1205 20:43:18.682367 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:43:21 crc kubenswrapper[4904]: I1205 20:43:21.054048 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-4zhc6"] Dec 05 20:43:21 crc kubenswrapper[4904]: I1205 20:43:21.074372 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-4zhc6"] Dec 05 20:43:21 crc kubenswrapper[4904]: I1205 20:43:21.082871 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-ft95f"] Dec 05 20:43:21 crc kubenswrapper[4904]: I1205 20:43:21.090435 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-ft95f"] Dec 05 20:43:21 crc kubenswrapper[4904]: I1205 20:43:21.702378 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8" 
path="/var/lib/kubelet/pods/2c7facdc-9b2e-435b-a1bc-f57d0fb2b5a8/volumes" Dec 05 20:43:21 crc kubenswrapper[4904]: I1205 20:43:21.703871 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e046612d-8016-4321-8a7a-a14b14f68e91" path="/var/lib/kubelet/pods/e046612d-8016-4321-8a7a-a14b14f68e91/volumes" Dec 05 20:43:23 crc kubenswrapper[4904]: I1205 20:43:23.040607 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-wh6jz"] Dec 05 20:43:23 crc kubenswrapper[4904]: I1205 20:43:23.053795 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-wh6jz"] Dec 05 20:43:23 crc kubenswrapper[4904]: I1205 20:43:23.694471 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0a41473-f3de-440f-89be-9fddf77f6148" path="/var/lib/kubelet/pods/f0a41473-f3de-440f-89be-9fddf77f6148/volumes" Dec 05 20:43:33 crc kubenswrapper[4904]: I1205 20:43:33.681864 4904 scope.go:117] "RemoveContainer" containerID="dfcab6d899b1212bd971ed9361e63cd19363ddee7d6637335cdbac95f7926ba1" Dec 05 20:43:33 crc kubenswrapper[4904]: E1205 20:43:33.682619 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:43:38 crc kubenswrapper[4904]: I1205 20:43:38.054920 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-k6l4d"] Dec 05 20:43:38 crc kubenswrapper[4904]: I1205 20:43:38.067728 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-k6l4d"] Dec 05 20:43:39 crc kubenswrapper[4904]: I1205 20:43:39.692814 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b72d1aa8-3933-4153-89ac-a4ffe0667268" path="/var/lib/kubelet/pods/b72d1aa8-3933-4153-89ac-a4ffe0667268/volumes" Dec 05 20:43:48 crc kubenswrapper[4904]: I1205 20:43:48.493249 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x8zhg"] Dec 05 20:43:48 crc kubenswrapper[4904]: I1205 20:43:48.505967 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x8zhg" Dec 05 20:43:48 crc kubenswrapper[4904]: I1205 20:43:48.541017 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x8zhg"] Dec 05 20:43:48 crc kubenswrapper[4904]: I1205 20:43:48.682532 4904 scope.go:117] "RemoveContainer" containerID="dfcab6d899b1212bd971ed9361e63cd19363ddee7d6637335cdbac95f7926ba1" Dec 05 20:43:48 crc kubenswrapper[4904]: E1205 20:43:48.682779 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:43:48 crc kubenswrapper[4904]: I1205 20:43:48.682908 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25e1e96e-be76-42b3-8a7a-e091cd3173e5-utilities\") pod \"redhat-marketplace-x8zhg\" (UID: \"25e1e96e-be76-42b3-8a7a-e091cd3173e5\") " pod="openshift-marketplace/redhat-marketplace-x8zhg" Dec 05 20:43:48 crc kubenswrapper[4904]: I1205 20:43:48.683226 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25e1e96e-be76-42b3-8a7a-e091cd3173e5-catalog-content\") pod \"redhat-marketplace-x8zhg\" (UID: \"25e1e96e-be76-42b3-8a7a-e091cd3173e5\") " pod="openshift-marketplace/redhat-marketplace-x8zhg" Dec 05 20:43:48 crc kubenswrapper[4904]: I1205 20:43:48.683262 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h65jp\" (UniqueName: \"kubernetes.io/projected/25e1e96e-be76-42b3-8a7a-e091cd3173e5-kube-api-access-h65jp\") pod \"redhat-marketplace-x8zhg\" (UID: \"25e1e96e-be76-42b3-8a7a-e091cd3173e5\") " pod="openshift-marketplace/redhat-marketplace-x8zhg" Dec 05 20:43:48 crc kubenswrapper[4904]: I1205 20:43:48.784490 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25e1e96e-be76-42b3-8a7a-e091cd3173e5-utilities\") pod \"redhat-marketplace-x8zhg\" (UID: \"25e1e96e-be76-42b3-8a7a-e091cd3173e5\") " pod="openshift-marketplace/redhat-marketplace-x8zhg" Dec 05 20:43:48 crc kubenswrapper[4904]: I1205 20:43:48.784594 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25e1e96e-be76-42b3-8a7a-e091cd3173e5-catalog-content\") pod \"redhat-marketplace-x8zhg\" (UID: \"25e1e96e-be76-42b3-8a7a-e091cd3173e5\") " pod="openshift-marketplace/redhat-marketplace-x8zhg" Dec 05 20:43:48 crc kubenswrapper[4904]: I1205 20:43:48.784617 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h65jp\" (UniqueName: \"kubernetes.io/projected/25e1e96e-be76-42b3-8a7a-e091cd3173e5-kube-api-access-h65jp\") pod \"redhat-marketplace-x8zhg\" (UID: \"25e1e96e-be76-42b3-8a7a-e091cd3173e5\") " pod="openshift-marketplace/redhat-marketplace-x8zhg" Dec 05 20:43:48 crc kubenswrapper[4904]: I1205 20:43:48.786326 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/25e1e96e-be76-42b3-8a7a-e091cd3173e5-catalog-content\") pod \"redhat-marketplace-x8zhg\" (UID: \"25e1e96e-be76-42b3-8a7a-e091cd3173e5\") " pod="openshift-marketplace/redhat-marketplace-x8zhg" Dec 05 20:43:48 crc kubenswrapper[4904]: I1205 20:43:48.786507 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25e1e96e-be76-42b3-8a7a-e091cd3173e5-utilities\") pod \"redhat-marketplace-x8zhg\" (UID: \"25e1e96e-be76-42b3-8a7a-e091cd3173e5\") " pod="openshift-marketplace/redhat-marketplace-x8zhg" Dec 05 20:43:48 crc kubenswrapper[4904]: I1205 20:43:48.815265 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h65jp\" (UniqueName: \"kubernetes.io/projected/25e1e96e-be76-42b3-8a7a-e091cd3173e5-kube-api-access-h65jp\") pod \"redhat-marketplace-x8zhg\" (UID: \"25e1e96e-be76-42b3-8a7a-e091cd3173e5\") " pod="openshift-marketplace/redhat-marketplace-x8zhg" Dec 05 20:43:48 crc kubenswrapper[4904]: I1205 20:43:48.849076 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x8zhg" Dec 05 20:43:49 crc kubenswrapper[4904]: I1205 20:43:49.376123 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x8zhg"] Dec 05 20:43:49 crc kubenswrapper[4904]: I1205 20:43:49.505162 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bh9kz"] Dec 05 20:43:49 crc kubenswrapper[4904]: I1205 20:43:49.508433 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bh9kz" Dec 05 20:43:49 crc kubenswrapper[4904]: I1205 20:43:49.519955 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bh9kz"] Dec 05 20:43:49 crc kubenswrapper[4904]: I1205 20:43:49.607851 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/384a54d7-4a0f-4af9-b250-07f6302c4d39-catalog-content\") pod \"certified-operators-bh9kz\" (UID: \"384a54d7-4a0f-4af9-b250-07f6302c4d39\") " pod="openshift-marketplace/certified-operators-bh9kz" Dec 05 20:43:49 crc kubenswrapper[4904]: I1205 20:43:49.607917 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgbhp\" (UniqueName: \"kubernetes.io/projected/384a54d7-4a0f-4af9-b250-07f6302c4d39-kube-api-access-rgbhp\") pod \"certified-operators-bh9kz\" (UID: \"384a54d7-4a0f-4af9-b250-07f6302c4d39\") " pod="openshift-marketplace/certified-operators-bh9kz" Dec 05 20:43:49 crc kubenswrapper[4904]: I1205 20:43:49.607992 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/384a54d7-4a0f-4af9-b250-07f6302c4d39-utilities\") pod \"certified-operators-bh9kz\" (UID: \"384a54d7-4a0f-4af9-b250-07f6302c4d39\") " pod="openshift-marketplace/certified-operators-bh9kz" Dec 05 20:43:49 crc kubenswrapper[4904]: I1205 20:43:49.710375 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgbhp\" (UniqueName: \"kubernetes.io/projected/384a54d7-4a0f-4af9-b250-07f6302c4d39-kube-api-access-rgbhp\") pod \"certified-operators-bh9kz\" (UID: \"384a54d7-4a0f-4af9-b250-07f6302c4d39\") " pod="openshift-marketplace/certified-operators-bh9kz" Dec 
05 20:43:49 crc kubenswrapper[4904]: I1205 20:43:49.710486 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/384a54d7-4a0f-4af9-b250-07f6302c4d39-utilities\") pod \"certified-operators-bh9kz\" (UID: \"384a54d7-4a0f-4af9-b250-07f6302c4d39\") " pod="openshift-marketplace/certified-operators-bh9kz" Dec 05 20:43:49 crc kubenswrapper[4904]: I1205 20:43:49.710706 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/384a54d7-4a0f-4af9-b250-07f6302c4d39-catalog-content\") pod \"certified-operators-bh9kz\" (UID: \"384a54d7-4a0f-4af9-b250-07f6302c4d39\") " pod="openshift-marketplace/certified-operators-bh9kz" Dec 05 20:43:49 crc kubenswrapper[4904]: I1205 20:43:49.711271 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/384a54d7-4a0f-4af9-b250-07f6302c4d39-utilities\") pod \"certified-operators-bh9kz\" (UID: \"384a54d7-4a0f-4af9-b250-07f6302c4d39\") " pod="openshift-marketplace/certified-operators-bh9kz" Dec 05 20:43:49 crc kubenswrapper[4904]: I1205 20:43:49.711311 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/384a54d7-4a0f-4af9-b250-07f6302c4d39-catalog-content\") pod \"certified-operators-bh9kz\" (UID: \"384a54d7-4a0f-4af9-b250-07f6302c4d39\") " pod="openshift-marketplace/certified-operators-bh9kz" Dec 05 20:43:49 crc kubenswrapper[4904]: I1205 20:43:49.731548 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgbhp\" (UniqueName: \"kubernetes.io/projected/384a54d7-4a0f-4af9-b250-07f6302c4d39-kube-api-access-rgbhp\") pod \"certified-operators-bh9kz\" (UID: \"384a54d7-4a0f-4af9-b250-07f6302c4d39\") " pod="openshift-marketplace/certified-operators-bh9kz" Dec 05 20:43:49 crc kubenswrapper[4904]: I1205 20:43:49.853170 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bh9kz" Dec 05 20:43:50 crc kubenswrapper[4904]: I1205 20:43:50.055131 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-gtfpr"] Dec 05 20:43:50 crc kubenswrapper[4904]: I1205 20:43:50.064647 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-gtfpr"] Dec 05 20:43:50 crc kubenswrapper[4904]: I1205 20:43:50.200953 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bh9kz"] Dec 05 20:43:50 crc kubenswrapper[4904]: W1205 20:43:50.202147 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod384a54d7_4a0f_4af9_b250_07f6302c4d39.slice/crio-d0b3dd3ab0be28087850f0dd768e6f1d2c7c453620538965829d86c998f39409 WatchSource:0}: Error finding container d0b3dd3ab0be28087850f0dd768e6f1d2c7c453620538965829d86c998f39409: Status 404 returned error can't find the container with id d0b3dd3ab0be28087850f0dd768e6f1d2c7c453620538965829d86c998f39409 Dec 05 20:43:50 crc kubenswrapper[4904]: I1205 20:43:50.290686 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bh9kz" event={"ID":"384a54d7-4a0f-4af9-b250-07f6302c4d39","Type":"ContainerStarted","Data":"d0b3dd3ab0be28087850f0dd768e6f1d2c7c453620538965829d86c998f39409"} Dec 05 20:43:50 crc kubenswrapper[4904]: I1205 20:43:50.298910 4904 generic.go:334] "Generic (PLEG): container finished" podID="25e1e96e-be76-42b3-8a7a-e091cd3173e5" containerID="b5ff5e45c9b6660c6e4a616dc924455099dd7b8f543e6972a31319bbc75bc066" exitCode=0 Dec 05 20:43:50 crc kubenswrapper[4904]: I1205 20:43:50.298964 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x8zhg" event={"ID":"25e1e96e-be76-42b3-8a7a-e091cd3173e5","Type":"ContainerDied","Data":"b5ff5e45c9b6660c6e4a616dc924455099dd7b8f543e6972a31319bbc75bc066"} Dec 05 20:43:50 crc kubenswrapper[4904]: I1205 20:43:50.299071 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x8zhg" event={"ID":"25e1e96e-be76-42b3-8a7a-e091cd3173e5","Type":"ContainerStarted","Data":"9a06c7f34b86114e1de9ad886398e708058f8df5b0c0c2b58ef10f5a8243ab58"} Dec 05 20:43:51 crc kubenswrapper[4904]: I1205 20:43:51.311130 4904 generic.go:334] "Generic (PLEG): container finished" podID="384a54d7-4a0f-4af9-b250-07f6302c4d39" containerID="ceb938a5c2c76621d5d2a6a84cb6fe6e5694d4d179ccc8a7504e9035f791b4f4" exitCode=0 Dec 05 20:43:51 crc kubenswrapper[4904]: I1205 20:43:51.311188 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bh9kz" event={"ID":"384a54d7-4a0f-4af9-b250-07f6302c4d39","Type":"ContainerDied","Data":"ceb938a5c2c76621d5d2a6a84cb6fe6e5694d4d179ccc8a7504e9035f791b4f4"} Dec 05 20:43:51 crc kubenswrapper[4904]: I1205 20:43:51.693299 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b" path="/var/lib/kubelet/pods/5c9a0e3b-f8c9-49d6-8bd3-d22c256c2e4b/volumes" Dec 05 20:43:52 crc kubenswrapper[4904]: I1205 20:43:52.322666 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bh9kz" event={"ID":"384a54d7-4a0f-4af9-b250-07f6302c4d39","Type":"ContainerStarted","Data":"b3db1c4b756e772c8e50dbe0336a10633c83f7bf65193c363fec9447d605bd80"} Dec 05 20:43:52 crc kubenswrapper[4904]: I1205 20:43:52.324882 4904 
generic.go:334] "Generic (PLEG): container finished" podID="25e1e96e-be76-42b3-8a7a-e091cd3173e5" containerID="7369a5cf2e284e19c674270a97c6c3439bb4ffce2ffb4a8a52bddce6c43a858e" exitCode=0 Dec 05 20:43:52 crc kubenswrapper[4904]: I1205 20:43:52.324926 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x8zhg" event={"ID":"25e1e96e-be76-42b3-8a7a-e091cd3173e5","Type":"ContainerDied","Data":"7369a5cf2e284e19c674270a97c6c3439bb4ffce2ffb4a8a52bddce6c43a858e"} Dec 05 20:43:53 crc kubenswrapper[4904]: I1205 20:43:53.335417 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x8zhg" event={"ID":"25e1e96e-be76-42b3-8a7a-e091cd3173e5","Type":"ContainerStarted","Data":"144b48acf4b7f1363dd13a69f3796b46d534d1e9b88bffda621ad966ac3e0667"} Dec 05 20:43:53 crc kubenswrapper[4904]: I1205 20:43:53.337969 4904 generic.go:334] "Generic (PLEG): container finished" podID="384a54d7-4a0f-4af9-b250-07f6302c4d39" containerID="b3db1c4b756e772c8e50dbe0336a10633c83f7bf65193c363fec9447d605bd80" exitCode=0 Dec 05 20:43:53 crc kubenswrapper[4904]: I1205 20:43:53.338009 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bh9kz" event={"ID":"384a54d7-4a0f-4af9-b250-07f6302c4d39","Type":"ContainerDied","Data":"b3db1c4b756e772c8e50dbe0336a10633c83f7bf65193c363fec9447d605bd80"} Dec 05 20:43:53 crc kubenswrapper[4904]: I1205 20:43:53.366207 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x8zhg" podStartSLOduration=2.9313194559999998 podStartE2EDuration="5.366178544s" podCreationTimestamp="2025-12-05 20:43:48 +0000 UTC" firstStartedPulling="2025-12-05 20:43:50.301263159 +0000 UTC m=+1929.112479268" lastFinishedPulling="2025-12-05 20:43:52.736122237 +0000 UTC m=+1931.547338356" observedRunningTime="2025-12-05 20:43:53.356725386 +0000 UTC m=+1932.167941515" watchObservedRunningTime="2025-12-05 20:43:53.366178544 +0000 UTC m=+1932.177394653" Dec 05 20:43:57 crc kubenswrapper[4904]: I1205 20:43:57.374619 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bh9kz" event={"ID":"384a54d7-4a0f-4af9-b250-07f6302c4d39","Type":"ContainerStarted","Data":"fe89727c790d3d4dc7cc50c6117e93e4bee04d3be796bd1218afc6af686421ae"} Dec 05 20:43:57 crc kubenswrapper[4904]: I1205 20:43:57.392401 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bh9kz" podStartSLOduration=3.939280992 podStartE2EDuration="8.392373028s" podCreationTimestamp="2025-12-05 20:43:49 +0000 UTC" firstStartedPulling="2025-12-05 20:43:51.313294382 +0000 UTC m=+1930.124510491" lastFinishedPulling="2025-12-05 20:43:55.766386418 +0000 UTC m=+1934.577602527" observedRunningTime="2025-12-05 20:43:57.391611927 +0000 UTC m=+1936.202828046" watchObservedRunningTime="2025-12-05 20:43:57.392373028 +0000 UTC m=+1936.203589137" Dec 05 20:43:58 crc kubenswrapper[4904]: I1205 20:43:58.850043 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-x8zhg" Dec 05 20:43:58 crc kubenswrapper[4904]: I1205 20:43:58.850352 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x8zhg" Dec 05 20:43:58 crc kubenswrapper[4904]: I1205 20:43:58.910703 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-x8zhg" Dec 05 20:43:59 crc kubenswrapper[4904]: I1205 20:43:59.443196 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-x8zhg" Dec 05 20:43:59 crc kubenswrapper[4904]: I1205 20:43:59.854985 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bh9kz" Dec 05 20:43:59 crc kubenswrapper[4904]: I1205 20:43:59.855038 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bh9kz" Dec 05 20:43:59 crc kubenswrapper[4904]: I1205 20:43:59.919726 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bh9kz" Dec 05 20:44:01 crc kubenswrapper[4904]: I1205 20:44:01.285239 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x8zhg"] Dec 05 20:44:01 crc kubenswrapper[4904]: I1205 20:44:01.414817 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x8zhg" podUID="25e1e96e-be76-42b3-8a7a-e091cd3173e5" containerName="registry-server" containerID="cri-o://144b48acf4b7f1363dd13a69f3796b46d534d1e9b88bffda621ad966ac3e0667" gracePeriod=2 Dec 05 20:44:01 crc kubenswrapper[4904]: I1205 20:44:01.891595 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x8zhg" Dec 05 20:44:01 crc kubenswrapper[4904]: I1205 20:44:01.976106 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25e1e96e-be76-42b3-8a7a-e091cd3173e5-utilities\") pod \"25e1e96e-be76-42b3-8a7a-e091cd3173e5\" (UID: \"25e1e96e-be76-42b3-8a7a-e091cd3173e5\") " Dec 05 20:44:01 crc kubenswrapper[4904]: I1205 20:44:01.976167 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25e1e96e-be76-42b3-8a7a-e091cd3173e5-catalog-content\") pod \"25e1e96e-be76-42b3-8a7a-e091cd3173e5\" (UID: \"25e1e96e-be76-42b3-8a7a-e091cd3173e5\") " Dec 05 20:44:01 crc kubenswrapper[4904]: I1205 20:44:01.976192 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h65jp\" (UniqueName: \"kubernetes.io/projected/25e1e96e-be76-42b3-8a7a-e091cd3173e5-kube-api-access-h65jp\") pod \"25e1e96e-be76-42b3-8a7a-e091cd3173e5\" (UID: \"25e1e96e-be76-42b3-8a7a-e091cd3173e5\") " Dec 05 20:44:01 crc kubenswrapper[4904]: I1205 20:44:01.977013 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25e1e96e-be76-42b3-8a7a-e091cd3173e5-utilities" (OuterVolumeSpecName: "utilities") pod "25e1e96e-be76-42b3-8a7a-e091cd3173e5" (UID: "25e1e96e-be76-42b3-8a7a-e091cd3173e5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:44:01 crc kubenswrapper[4904]: I1205 20:44:01.995836 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25e1e96e-be76-42b3-8a7a-e091cd3173e5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "25e1e96e-be76-42b3-8a7a-e091cd3173e5" (UID: "25e1e96e-be76-42b3-8a7a-e091cd3173e5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:44:01 crc kubenswrapper[4904]: I1205 20:44:01.995944 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e1e96e-be76-42b3-8a7a-e091cd3173e5-kube-api-access-h65jp" (OuterVolumeSpecName: "kube-api-access-h65jp") pod "25e1e96e-be76-42b3-8a7a-e091cd3173e5" (UID: "25e1e96e-be76-42b3-8a7a-e091cd3173e5"). InnerVolumeSpecName "kube-api-access-h65jp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:44:02 crc kubenswrapper[4904]: I1205 20:44:02.078595 4904 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25e1e96e-be76-42b3-8a7a-e091cd3173e5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:44:02 crc kubenswrapper[4904]: I1205 20:44:02.078626 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h65jp\" (UniqueName: \"kubernetes.io/projected/25e1e96e-be76-42b3-8a7a-e091cd3173e5-kube-api-access-h65jp\") on node \"crc\" DevicePath \"\"" Dec 05 20:44:02 crc kubenswrapper[4904]: I1205 20:44:02.078635 4904 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25e1e96e-be76-42b3-8a7a-e091cd3173e5-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:44:02 crc kubenswrapper[4904]: I1205 20:44:02.425218 4904 generic.go:334] "Generic (PLEG): container finished" podID="25e1e96e-be76-42b3-8a7a-e091cd3173e5" containerID="144b48acf4b7f1363dd13a69f3796b46d534d1e9b88bffda621ad966ac3e0667" exitCode=0 Dec 05 20:44:02 crc kubenswrapper[4904]: I1205 20:44:02.425269 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x8zhg" event={"ID":"25e1e96e-be76-42b3-8a7a-e091cd3173e5","Type":"ContainerDied","Data":"144b48acf4b7f1363dd13a69f3796b46d534d1e9b88bffda621ad966ac3e0667"} Dec 05 20:44:02 crc kubenswrapper[4904]: I1205 20:44:02.425301 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x8zhg" event={"ID":"25e1e96e-be76-42b3-8a7a-e091cd3173e5","Type":"ContainerDied","Data":"9a06c7f34b86114e1de9ad886398e708058f8df5b0c0c2b58ef10f5a8243ab58"} Dec 05 20:44:02 crc kubenswrapper[4904]: I1205 20:44:02.425324 4904 scope.go:117] "RemoveContainer" containerID="144b48acf4b7f1363dd13a69f3796b46d534d1e9b88bffda621ad966ac3e0667" Dec 05 20:44:02 crc kubenswrapper[4904]: I1205 20:44:02.425375 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x8zhg" Dec 05 20:44:02 crc kubenswrapper[4904]: I1205 20:44:02.451939 4904 scope.go:117] "RemoveContainer" containerID="7369a5cf2e284e19c674270a97c6c3439bb4ffce2ffb4a8a52bddce6c43a858e" Dec 05 20:44:02 crc kubenswrapper[4904]: I1205 20:44:02.484488 4904 scope.go:117] "RemoveContainer" containerID="b5ff5e45c9b6660c6e4a616dc924455099dd7b8f543e6972a31319bbc75bc066" Dec 05 20:44:02 crc kubenswrapper[4904]: I1205 20:44:02.484999 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x8zhg"] Dec 05 20:44:02 crc kubenswrapper[4904]: I1205 20:44:02.496201 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x8zhg"] Dec 05 20:44:02 crc kubenswrapper[4904]: I1205 20:44:02.528995 4904 scope.go:117] "RemoveContainer" containerID="144b48acf4b7f1363dd13a69f3796b46d534d1e9b88bffda621ad966ac3e0667" Dec 05 20:44:02 crc kubenswrapper[4904]: E1205 20:44:02.529516 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"144b48acf4b7f1363dd13a69f3796b46d534d1e9b88bffda621ad966ac3e0667\": container with ID starting with 144b48acf4b7f1363dd13a69f3796b46d534d1e9b88bffda621ad966ac3e0667 not found: ID does not exist" containerID="144b48acf4b7f1363dd13a69f3796b46d534d1e9b88bffda621ad966ac3e0667" Dec 05 20:44:02 crc kubenswrapper[4904]: I1205 20:44:02.529542 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"144b48acf4b7f1363dd13a69f3796b46d534d1e9b88bffda621ad966ac3e0667"} err="failed to get container status \"144b48acf4b7f1363dd13a69f3796b46d534d1e9b88bffda621ad966ac3e0667\": rpc error: code = NotFound desc = could not find container \"144b48acf4b7f1363dd13a69f3796b46d534d1e9b88bffda621ad966ac3e0667\": container with ID starting with 144b48acf4b7f1363dd13a69f3796b46d534d1e9b88bffda621ad966ac3e0667 not found: ID does not exist" Dec 05 20:44:02 crc kubenswrapper[4904]: I1205 20:44:02.529562 4904 scope.go:117] "RemoveContainer" containerID="7369a5cf2e284e19c674270a97c6c3439bb4ffce2ffb4a8a52bddce6c43a858e" Dec 05 20:44:02 crc kubenswrapper[4904]: E1205 20:44:02.529831 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7369a5cf2e284e19c674270a97c6c3439bb4ffce2ffb4a8a52bddce6c43a858e\": container with ID starting with 7369a5cf2e284e19c674270a97c6c3439bb4ffce2ffb4a8a52bddce6c43a858e not found: ID does not exist" containerID="7369a5cf2e284e19c674270a97c6c3439bb4ffce2ffb4a8a52bddce6c43a858e" Dec 05 20:44:02 crc kubenswrapper[4904]: I1205 20:44:02.529890 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7369a5cf2e284e19c674270a97c6c3439bb4ffce2ffb4a8a52bddce6c43a858e"} err="failed to get container status \"7369a5cf2e284e19c674270a97c6c3439bb4ffce2ffb4a8a52bddce6c43a858e\": rpc error: code = NotFound desc = could not find container \"7369a5cf2e284e19c674270a97c6c3439bb4ffce2ffb4a8a52bddce6c43a858e\": container with ID starting with 7369a5cf2e284e19c674270a97c6c3439bb4ffce2ffb4a8a52bddce6c43a858e not found: ID does not exist" Dec 05 20:44:02 crc kubenswrapper[4904]: I1205 20:44:02.529904 4904 scope.go:117] "RemoveContainer" containerID="b5ff5e45c9b6660c6e4a616dc924455099dd7b8f543e6972a31319bbc75bc066" Dec 05 20:44:02 crc kubenswrapper[4904]: E1205 20:44:02.530215 4904 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"b5ff5e45c9b6660c6e4a616dc924455099dd7b8f543e6972a31319bbc75bc066\": container with ID starting with b5ff5e45c9b6660c6e4a616dc924455099dd7b8f543e6972a31319bbc75bc066 not found: ID does not exist" containerID="b5ff5e45c9b6660c6e4a616dc924455099dd7b8f543e6972a31319bbc75bc066" Dec 05 20:44:02 crc kubenswrapper[4904]: I1205 20:44:02.530240 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5ff5e45c9b6660c6e4a616dc924455099dd7b8f543e6972a31319bbc75bc066"} err="failed to get container status \"b5ff5e45c9b6660c6e4a616dc924455099dd7b8f543e6972a31319bbc75bc066\": rpc error: code = NotFound desc = could not find container \"b5ff5e45c9b6660c6e4a616dc924455099dd7b8f543e6972a31319bbc75bc066\": container with ID starting with b5ff5e45c9b6660c6e4a616dc924455099dd7b8f543e6972a31319bbc75bc066 not found: ID does not exist" Dec 05 20:44:02 crc kubenswrapper[4904]: I1205 20:44:02.682776 4904 scope.go:117] "RemoveContainer" containerID="dfcab6d899b1212bd971ed9361e63cd19363ddee7d6637335cdbac95f7926ba1" Dec 05 20:44:02 crc kubenswrapper[4904]: E1205 20:44:02.683181 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:44:03 crc kubenswrapper[4904]: I1205 20:44:03.703695 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e1e96e-be76-42b3-8a7a-e091cd3173e5" path="/var/lib/kubelet/pods/25e1e96e-be76-42b3-8a7a-e091cd3173e5/volumes" Dec 05 20:44:09 crc kubenswrapper[4904]: I1205 20:44:09.930162 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bh9kz" Dec 05 20:44:09 crc kubenswrapper[4904]: I1205 20:44:09.974939 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bh9kz"] Dec 05 20:44:10 crc kubenswrapper[4904]: I1205 20:44:10.517833 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bh9kz" podUID="384a54d7-4a0f-4af9-b250-07f6302c4d39" containerName="registry-server" containerID="cri-o://fe89727c790d3d4dc7cc50c6117e93e4bee04d3be796bd1218afc6af686421ae" gracePeriod=2 Dec 05 20:44:10 crc kubenswrapper[4904]: I1205 20:44:10.985947 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bh9kz" Dec 05 20:44:11 crc kubenswrapper[4904]: I1205 20:44:11.051122 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/384a54d7-4a0f-4af9-b250-07f6302c4d39-utilities\") pod \"384a54d7-4a0f-4af9-b250-07f6302c4d39\" (UID: \"384a54d7-4a0f-4af9-b250-07f6302c4d39\") " Dec 05 20:44:11 crc kubenswrapper[4904]: I1205 20:44:11.051254 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgbhp\" (UniqueName: \"kubernetes.io/projected/384a54d7-4a0f-4af9-b250-07f6302c4d39-kube-api-access-rgbhp\") pod \"384a54d7-4a0f-4af9-b250-07f6302c4d39\" (UID: \"384a54d7-4a0f-4af9-b250-07f6302c4d39\") " Dec 05 20:44:11 crc kubenswrapper[4904]: I1205 20:44:11.051325 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/384a54d7-4a0f-4af9-b250-07f6302c4d39-catalog-content\") pod \"384a54d7-4a0f-4af9-b250-07f6302c4d39\" (UID: \"384a54d7-4a0f-4af9-b250-07f6302c4d39\") " Dec 05 20:44:11 crc kubenswrapper[4904]: I1205 20:44:11.052184 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/384a54d7-4a0f-4af9-b250-07f6302c4d39-utilities" (OuterVolumeSpecName: "utilities") pod "384a54d7-4a0f-4af9-b250-07f6302c4d39" (UID: "384a54d7-4a0f-4af9-b250-07f6302c4d39"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:44:11 crc kubenswrapper[4904]: I1205 20:44:11.060282 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/384a54d7-4a0f-4af9-b250-07f6302c4d39-kube-api-access-rgbhp" (OuterVolumeSpecName: "kube-api-access-rgbhp") pod "384a54d7-4a0f-4af9-b250-07f6302c4d39" (UID: "384a54d7-4a0f-4af9-b250-07f6302c4d39"). InnerVolumeSpecName "kube-api-access-rgbhp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:44:11 crc kubenswrapper[4904]: I1205 20:44:11.097596 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/384a54d7-4a0f-4af9-b250-07f6302c4d39-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "384a54d7-4a0f-4af9-b250-07f6302c4d39" (UID: "384a54d7-4a0f-4af9-b250-07f6302c4d39"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:44:11 crc kubenswrapper[4904]: I1205 20:44:11.153604 4904 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/384a54d7-4a0f-4af9-b250-07f6302c4d39-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:44:11 crc kubenswrapper[4904]: I1205 20:44:11.153639 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgbhp\" (UniqueName: \"kubernetes.io/projected/384a54d7-4a0f-4af9-b250-07f6302c4d39-kube-api-access-rgbhp\") on node \"crc\" DevicePath \"\"" Dec 05 20:44:11 crc kubenswrapper[4904]: I1205 20:44:11.153652 4904 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/384a54d7-4a0f-4af9-b250-07f6302c4d39-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:44:11 crc kubenswrapper[4904]: I1205 20:44:11.531776 4904 generic.go:334] "Generic (PLEG): container finished" podID="384a54d7-4a0f-4af9-b250-07f6302c4d39" containerID="fe89727c790d3d4dc7cc50c6117e93e4bee04d3be796bd1218afc6af686421ae" exitCode=0 Dec 05 20:44:11 crc kubenswrapper[4904]: I1205 20:44:11.531855 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bh9kz" event={"ID":"384a54d7-4a0f-4af9-b250-07f6302c4d39","Type":"ContainerDied","Data":"fe89727c790d3d4dc7cc50c6117e93e4bee04d3be796bd1218afc6af686421ae"} Dec 05 20:44:11 crc kubenswrapper[4904]: I1205 20:44:11.531899 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bh9kz" event={"ID":"384a54d7-4a0f-4af9-b250-07f6302c4d39","Type":"ContainerDied","Data":"d0b3dd3ab0be28087850f0dd768e6f1d2c7c453620538965829d86c998f39409"} Dec 05 20:44:11 crc kubenswrapper[4904]: I1205 20:44:11.531930 4904 scope.go:117] "RemoveContainer" containerID="fe89727c790d3d4dc7cc50c6117e93e4bee04d3be796bd1218afc6af686421ae" Dec 05 20:44:11 crc kubenswrapper[4904]: I1205 20:44:11.532189 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bh9kz" Dec 05 20:44:11 crc kubenswrapper[4904]: I1205 20:44:11.562515 4904 scope.go:117] "RemoveContainer" containerID="b3db1c4b756e772c8e50dbe0336a10633c83f7bf65193c363fec9447d605bd80" Dec 05 20:44:11 crc kubenswrapper[4904]: I1205 20:44:11.579607 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bh9kz"] Dec 05 20:44:11 crc kubenswrapper[4904]: I1205 20:44:11.594499 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bh9kz"] Dec 05 20:44:11 crc kubenswrapper[4904]: I1205 20:44:11.604536 4904 scope.go:117] "RemoveContainer" containerID="ceb938a5c2c76621d5d2a6a84cb6fe6e5694d4d179ccc8a7504e9035f791b4f4" Dec 05 20:44:11 crc kubenswrapper[4904]: I1205 20:44:11.662812 4904 scope.go:117] "RemoveContainer" containerID="fe89727c790d3d4dc7cc50c6117e93e4bee04d3be796bd1218afc6af686421ae" Dec 05 20:44:11 crc kubenswrapper[4904]: E1205 20:44:11.663463 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe89727c790d3d4dc7cc50c6117e93e4bee04d3be796bd1218afc6af686421ae\": container with ID starting with fe89727c790d3d4dc7cc50c6117e93e4bee04d3be796bd1218afc6af686421ae not found: ID does not exist" containerID="fe89727c790d3d4dc7cc50c6117e93e4bee04d3be796bd1218afc6af686421ae" Dec 05 20:44:11 crc kubenswrapper[4904]: I1205 20:44:11.663610 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe89727c790d3d4dc7cc50c6117e93e4bee04d3be796bd1218afc6af686421ae"} err="failed to get container status \"fe89727c790d3d4dc7cc50c6117e93e4bee04d3be796bd1218afc6af686421ae\": rpc error: code = NotFound desc = could not find container \"fe89727c790d3d4dc7cc50c6117e93e4bee04d3be796bd1218afc6af686421ae\": container with ID starting with fe89727c790d3d4dc7cc50c6117e93e4bee04d3be796bd1218afc6af686421ae not found: ID does not exist" Dec 05 20:44:11 crc kubenswrapper[4904]: I1205 20:44:11.663959 4904 scope.go:117] "RemoveContainer" containerID="b3db1c4b756e772c8e50dbe0336a10633c83f7bf65193c363fec9447d605bd80" Dec 05 20:44:11 crc kubenswrapper[4904]: E1205 20:44:11.664400 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3db1c4b756e772c8e50dbe0336a10633c83f7bf65193c363fec9447d605bd80\": container with ID starting with b3db1c4b756e772c8e50dbe0336a10633c83f7bf65193c363fec9447d605bd80 not found: ID does not exist" containerID="b3db1c4b756e772c8e50dbe0336a10633c83f7bf65193c363fec9447d605bd80" Dec 05 20:44:11 crc kubenswrapper[4904]: I1205 20:44:11.664426 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3db1c4b756e772c8e50dbe0336a10633c83f7bf65193c363fec9447d605bd80"} err="failed to get container status \"b3db1c4b756e772c8e50dbe0336a10633c83f7bf65193c363fec9447d605bd80\": rpc error: code = NotFound desc = could not find container \"b3db1c4b756e772c8e50dbe0336a10633c83f7bf65193c363fec9447d605bd80\": container with ID starting with b3db1c4b756e772c8e50dbe0336a10633c83f7bf65193c363fec9447d605bd80 not found: ID does not exist" Dec 05 20:44:11 crc kubenswrapper[4904]: I1205 20:44:11.664440 4904 scope.go:117] "RemoveContainer" containerID="ceb938a5c2c76621d5d2a6a84cb6fe6e5694d4d179ccc8a7504e9035f791b4f4" Dec 05 20:44:11 crc kubenswrapper[4904]: E1205 20:44:11.664712 4904 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"ceb938a5c2c76621d5d2a6a84cb6fe6e5694d4d179ccc8a7504e9035f791b4f4\": container with ID starting with ceb938a5c2c76621d5d2a6a84cb6fe6e5694d4d179ccc8a7504e9035f791b4f4 not found: ID does not exist" containerID="ceb938a5c2c76621d5d2a6a84cb6fe6e5694d4d179ccc8a7504e9035f791b4f4" Dec 05 20:44:11 crc kubenswrapper[4904]: I1205 20:44:11.664802 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ceb938a5c2c76621d5d2a6a84cb6fe6e5694d4d179ccc8a7504e9035f791b4f4"} err="failed to get container status \"ceb938a5c2c76621d5d2a6a84cb6fe6e5694d4d179ccc8a7504e9035f791b4f4\": rpc error: code = NotFound desc = could not find container \"ceb938a5c2c76621d5d2a6a84cb6fe6e5694d4d179ccc8a7504e9035f791b4f4\": container with ID starting with ceb938a5c2c76621d5d2a6a84cb6fe6e5694d4d179ccc8a7504e9035f791b4f4 not found: ID does not exist" Dec 05 20:44:11 crc kubenswrapper[4904]: I1205 20:44:11.692483 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="384a54d7-4a0f-4af9-b250-07f6302c4d39" path="/var/lib/kubelet/pods/384a54d7-4a0f-4af9-b250-07f6302c4d39/volumes" Dec 05 20:44:13 crc kubenswrapper[4904]: I1205 20:44:13.681813 4904 scope.go:117] "RemoveContainer" containerID="dfcab6d899b1212bd971ed9361e63cd19363ddee7d6637335cdbac95f7926ba1" Dec 05 20:44:13 crc kubenswrapper[4904]: E1205 20:44:13.682370 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:44:18 crc kubenswrapper[4904]: I1205 20:44:18.643425 4904 scope.go:117] "RemoveContainer" containerID="6a062818e034002eee32678ff30187d7a38d293c79de2a5c2b33c719e5f5af70" Dec 05 20:44:18 crc kubenswrapper[4904]: I1205 20:44:18.686963 4904 scope.go:117] "RemoveContainer" containerID="588bb01d56b63e882f961e9f854e18ef40bec7cc7f939dfb4704961b67fcc6f5" Dec 05 20:44:18 crc kubenswrapper[4904]: I1205 20:44:18.736454 4904 scope.go:117] "RemoveContainer" containerID="39156239472b92dabca22f1a6012db800be8051e59e0bc036717ecd11904ceab" Dec 05 20:44:18 crc kubenswrapper[4904]: I1205 20:44:18.788209 4904 scope.go:117] "RemoveContainer" containerID="ba7b4df22e89a35b4c6557fd502623f1c46ac646b4218b8e0d233117e64ff38e" Dec 05 20:44:18 crc kubenswrapper[4904]: I1205 20:44:18.829420 4904 scope.go:117] "RemoveContainer" containerID="dc978a68af7766b44df471d6f7e4d8f475abe5d11585e4c2d2e4b6ab1839a094" Dec 05 20:44:21 crc kubenswrapper[4904]: I1205 20:44:21.064119 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-mq95x"] Dec 05 20:44:21 crc kubenswrapper[4904]: I1205 20:44:21.076490 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-8dd6-account-create-update-t2wb6"] Dec 05 20:44:21 crc kubenswrapper[4904]: I1205 20:44:21.083928 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-m5z4d"] Dec 05 20:44:21 crc kubenswrapper[4904]: I1205 20:44:21.091981 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-8ea0-account-create-update-rvd2d"] Dec 05 20:44:21 crc kubenswrapper[4904]: I1205 20:44:21.100625 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-api-8dd6-account-create-update-t2wb6"] Dec 05 20:44:21 crc kubenswrapper[4904]: I1205 20:44:21.108687 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-m5z4d"] Dec 05 20:44:21 crc kubenswrapper[4904]: I1205 20:44:21.116048 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-mq95x"] Dec 05 20:44:21 crc kubenswrapper[4904]: I1205 20:44:21.126252 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-8ea0-account-create-update-rvd2d"] Dec 05 20:44:21 crc kubenswrapper[4904]: I1205 20:44:21.634197 4904 generic.go:334] "Generic (PLEG): container finished" podID="676e3b5b-34d1-47bc-a1db-3bb15a83282b" containerID="5ea2f88f221eb477a4f92871fe3edbdd781c1c0cb39cc93b199d093b8ff99541" exitCode=0 Dec 05 20:44:21 crc kubenswrapper[4904]: I1205 20:44:21.634245 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jvqvx" event={"ID":"676e3b5b-34d1-47bc-a1db-3bb15a83282b","Type":"ContainerDied","Data":"5ea2f88f221eb477a4f92871fe3edbdd781c1c0cb39cc93b199d093b8ff99541"} Dec 05 20:44:21 crc kubenswrapper[4904]: I1205 20:44:21.695245 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="097e7395-bc3a-4034-bc78-bc3d86757a70" path="/var/lib/kubelet/pods/097e7395-bc3a-4034-bc78-bc3d86757a70/volumes" Dec 05 20:44:21 crc kubenswrapper[4904]: I1205 20:44:21.697102 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a21e14cd-7505-4d92-8de4-a983351c6d9a" path="/var/lib/kubelet/pods/a21e14cd-7505-4d92-8de4-a983351c6d9a/volumes" Dec 05 20:44:21 crc kubenswrapper[4904]: I1205 20:44:21.697833 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be427ec6-7dd9-4285-b52c-3a797793ca88" path="/var/lib/kubelet/pods/be427ec6-7dd9-4285-b52c-3a797793ca88/volumes" Dec 05 20:44:21 crc kubenswrapper[4904]: I1205 20:44:21.698605 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf88911a-3e40-4dbe-9cab-11ac4077b33b" path="/var/lib/kubelet/pods/bf88911a-3e40-4dbe-9cab-11ac4077b33b/volumes" Dec 05 20:44:22 crc kubenswrapper[4904]: I1205 20:44:22.045459 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-fb4d-account-create-update-fzf7v"] Dec 05 20:44:22 crc kubenswrapper[4904]: I1205 20:44:22.060050 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-z45b2"] Dec 05 20:44:22 crc kubenswrapper[4904]: I1205 20:44:22.072441 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-z45b2"] Dec 05 20:44:22 crc kubenswrapper[4904]: I1205 20:44:22.085958 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-fb4d-account-create-update-fzf7v"] Dec 05 20:44:23 crc kubenswrapper[4904]: I1205 20:44:23.096988 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jvqvx" Dec 05 20:44:23 crc kubenswrapper[4904]: I1205 20:44:23.298020 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/676e3b5b-34d1-47bc-a1db-3bb15a83282b-ssh-key\") pod \"676e3b5b-34d1-47bc-a1db-3bb15a83282b\" (UID: \"676e3b5b-34d1-47bc-a1db-3bb15a83282b\") " Dec 05 20:44:23 crc kubenswrapper[4904]: I1205 20:44:23.298158 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/676e3b5b-34d1-47bc-a1db-3bb15a83282b-inventory\") pod \"676e3b5b-34d1-47bc-a1db-3bb15a83282b\" (UID: \"676e3b5b-34d1-47bc-a1db-3bb15a83282b\") " Dec 05 20:44:23 crc kubenswrapper[4904]: I1205 20:44:23.298212 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndhz7\" (UniqueName: \"kubernetes.io/projected/676e3b5b-34d1-47bc-a1db-3bb15a83282b-kube-api-access-ndhz7\") pod \"676e3b5b-34d1-47bc-a1db-3bb15a83282b\" (UID: \"676e3b5b-34d1-47bc-a1db-3bb15a83282b\") " Dec 05 20:44:23 crc kubenswrapper[4904]: I1205 20:44:23.307865 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/676e3b5b-34d1-47bc-a1db-3bb15a83282b-kube-api-access-ndhz7" (OuterVolumeSpecName: "kube-api-access-ndhz7") pod "676e3b5b-34d1-47bc-a1db-3bb15a83282b" (UID: "676e3b5b-34d1-47bc-a1db-3bb15a83282b"). InnerVolumeSpecName "kube-api-access-ndhz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:44:23 crc kubenswrapper[4904]: I1205 20:44:23.326531 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/676e3b5b-34d1-47bc-a1db-3bb15a83282b-inventory" (OuterVolumeSpecName: "inventory") pod "676e3b5b-34d1-47bc-a1db-3bb15a83282b" (UID: "676e3b5b-34d1-47bc-a1db-3bb15a83282b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:44:23 crc kubenswrapper[4904]: I1205 20:44:23.330001 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/676e3b5b-34d1-47bc-a1db-3bb15a83282b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "676e3b5b-34d1-47bc-a1db-3bb15a83282b" (UID: "676e3b5b-34d1-47bc-a1db-3bb15a83282b"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:44:23 crc kubenswrapper[4904]: I1205 20:44:23.401159 4904 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/676e3b5b-34d1-47bc-a1db-3bb15a83282b-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 20:44:23 crc kubenswrapper[4904]: I1205 20:44:23.401219 4904 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/676e3b5b-34d1-47bc-a1db-3bb15a83282b-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 20:44:23 crc kubenswrapper[4904]: I1205 20:44:23.401239 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndhz7\" (UniqueName: \"kubernetes.io/projected/676e3b5b-34d1-47bc-a1db-3bb15a83282b-kube-api-access-ndhz7\") on node \"crc\" DevicePath \"\"" Dec 05 20:44:23 crc kubenswrapper[4904]: I1205 20:44:23.661335 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jvqvx" event={"ID":"676e3b5b-34d1-47bc-a1db-3bb15a83282b","Type":"ContainerDied","Data":"961c32f7dccabd10b7c5aa033354cd7905ab6692bd01d5368852a12ad32d0d8b"} Dec 05 20:44:23 crc kubenswrapper[4904]: I1205 20:44:23.661394 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jvqvx" Dec 05 20:44:23 crc kubenswrapper[4904]: I1205 20:44:23.661411 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="961c32f7dccabd10b7c5aa033354cd7905ab6692bd01d5368852a12ad32d0d8b" Dec 05 20:44:23 crc kubenswrapper[4904]: I1205 20:44:23.707327 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="605b48e0-13b6-47bb-9486-7c29c947e915" path="/var/lib/kubelet/pods/605b48e0-13b6-47bb-9486-7c29c947e915/volumes" Dec 05 20:44:23 crc kubenswrapper[4904]: I1205 20:44:23.708489 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebfa100a-6ac4-441e-a2d8-78384458fd67" path="/var/lib/kubelet/pods/ebfa100a-6ac4-441e-a2d8-78384458fd67/volumes" Dec 05 20:44:23 crc kubenswrapper[4904]: I1205 20:44:23.776573 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qg8xq"] Dec 05 20:44:23 crc kubenswrapper[4904]: E1205 20:44:23.777185 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="384a54d7-4a0f-4af9-b250-07f6302c4d39" containerName="registry-server" Dec 05 20:44:23 crc kubenswrapper[4904]: I1205 20:44:23.777211 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="384a54d7-4a0f-4af9-b250-07f6302c4d39" containerName="registry-server" Dec 05 20:44:23 crc kubenswrapper[4904]: E1205 20:44:23.777232 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25e1e96e-be76-42b3-8a7a-e091cd3173e5" containerName="extract-utilities" Dec 05 20:44:23 crc kubenswrapper[4904]: I1205 20:44:23.777242 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="25e1e96e-be76-42b3-8a7a-e091cd3173e5" containerName="extract-utilities" Dec 05 20:44:23 crc kubenswrapper[4904]: E1205 20:44:23.777262 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25e1e96e-be76-42b3-8a7a-e091cd3173e5" containerName="extract-content" Dec 05 20:44:23 crc kubenswrapper[4904]: I1205 20:44:23.777273 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="25e1e96e-be76-42b3-8a7a-e091cd3173e5" containerName="extract-content" Dec 05 20:44:23 crc kubenswrapper[4904]: 
E1205 20:44:23.777306 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="676e3b5b-34d1-47bc-a1db-3bb15a83282b" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 05 20:44:23 crc kubenswrapper[4904]: I1205 20:44:23.777320 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="676e3b5b-34d1-47bc-a1db-3bb15a83282b" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 05 20:44:23 crc kubenswrapper[4904]: E1205 20:44:23.777347 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="384a54d7-4a0f-4af9-b250-07f6302c4d39" containerName="extract-utilities" Dec 05 20:44:23 crc kubenswrapper[4904]: I1205 20:44:23.777356 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="384a54d7-4a0f-4af9-b250-07f6302c4d39" containerName="extract-utilities" Dec 05 20:44:23 crc kubenswrapper[4904]: E1205 20:44:23.777375 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25e1e96e-be76-42b3-8a7a-e091cd3173e5" containerName="registry-server" Dec 05 20:44:23 crc kubenswrapper[4904]: I1205 20:44:23.777385 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="25e1e96e-be76-42b3-8a7a-e091cd3173e5" containerName="registry-server" Dec 05 20:44:23 crc kubenswrapper[4904]: E1205 20:44:23.777405 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="384a54d7-4a0f-4af9-b250-07f6302c4d39" containerName="extract-content" Dec 05 20:44:23 crc kubenswrapper[4904]: I1205 20:44:23.777413 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="384a54d7-4a0f-4af9-b250-07f6302c4d39" containerName="extract-content" Dec 05 20:44:23 crc kubenswrapper[4904]: I1205 20:44:23.777671 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="25e1e96e-be76-42b3-8a7a-e091cd3173e5" containerName="registry-server" Dec 05 20:44:23 crc kubenswrapper[4904]: I1205 20:44:23.777694 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="384a54d7-4a0f-4af9-b250-07f6302c4d39" containerName="registry-server" Dec 05 20:44:23 crc kubenswrapper[4904]: I1205 20:44:23.777721 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="676e3b5b-34d1-47bc-a1db-3bb15a83282b" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 05 20:44:23 crc kubenswrapper[4904]: I1205 20:44:23.778655 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qg8xq" Dec 05 20:44:23 crc kubenswrapper[4904]: I1205 20:44:23.783933 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 20:44:23 crc kubenswrapper[4904]: I1205 20:44:23.784321 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pf9pw" Dec 05 20:44:23 crc kubenswrapper[4904]: I1205 20:44:23.784533 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 20:44:23 crc kubenswrapper[4904]: I1205 20:44:23.784717 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 20:44:23 crc kubenswrapper[4904]: I1205 20:44:23.806264 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qg8xq"] Dec 05 20:44:23 crc kubenswrapper[4904]: I1205 20:44:23.915284 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/192437e7-c4fd-4142-94fb-e3f2a9c75841-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-qg8xq\" (UID: \"192437e7-c4fd-4142-94fb-e3f2a9c75841\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qg8xq" Dec 05 20:44:23 crc kubenswrapper[4904]: I1205 20:44:23.915373 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfqb9\" (UniqueName: \"kubernetes.io/projected/192437e7-c4fd-4142-94fb-e3f2a9c75841-kube-api-access-lfqb9\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-qg8xq\" (UID: \"192437e7-c4fd-4142-94fb-e3f2a9c75841\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qg8xq" Dec 05 20:44:23 crc kubenswrapper[4904]: I1205 20:44:23.915433 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/192437e7-c4fd-4142-94fb-e3f2a9c75841-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-qg8xq\" (UID: \"192437e7-c4fd-4142-94fb-e3f2a9c75841\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qg8xq" Dec 05 20:44:24 crc kubenswrapper[4904]: I1205 20:44:24.017802 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/192437e7-c4fd-4142-94fb-e3f2a9c75841-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-qg8xq\" (UID: \"192437e7-c4fd-4142-94fb-e3f2a9c75841\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qg8xq" Dec 05 20:44:24 crc kubenswrapper[4904]: I1205 20:44:24.017861 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfqb9\" (UniqueName: \"kubernetes.io/projected/192437e7-c4fd-4142-94fb-e3f2a9c75841-kube-api-access-lfqb9\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-qg8xq\" (UID: \"192437e7-c4fd-4142-94fb-e3f2a9c75841\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qg8xq" Dec 05 20:44:24 crc kubenswrapper[4904]: I1205 20:44:24.017920 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/192437e7-c4fd-4142-94fb-e3f2a9c75841-inventory\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-qg8xq\" (UID: \"192437e7-c4fd-4142-94fb-e3f2a9c75841\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qg8xq" Dec 05 20:44:24 crc kubenswrapper[4904]: I1205 20:44:24.043212 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfqb9\" (UniqueName: \"kubernetes.io/projected/192437e7-c4fd-4142-94fb-e3f2a9c75841-kube-api-access-lfqb9\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-qg8xq\" (UID: \"192437e7-c4fd-4142-94fb-e3f2a9c75841\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qg8xq" Dec 05 20:44:24 crc kubenswrapper[4904]: I1205 20:44:24.043213 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/192437e7-c4fd-4142-94fb-e3f2a9c75841-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-qg8xq\" (UID: \"192437e7-c4fd-4142-94fb-e3f2a9c75841\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qg8xq" Dec 05 20:44:24 crc kubenswrapper[4904]: I1205 20:44:24.043912 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/192437e7-c4fd-4142-94fb-e3f2a9c75841-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-qg8xq\" (UID: \"192437e7-c4fd-4142-94fb-e3f2a9c75841\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qg8xq" Dec 05 20:44:24 crc kubenswrapper[4904]: I1205 20:44:24.106755 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qg8xq" Dec 05 20:44:24 crc kubenswrapper[4904]: I1205 20:44:24.672103 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qg8xq"] Dec 05 20:44:25 crc kubenswrapper[4904]: I1205 20:44:25.681261 4904 scope.go:117] "RemoveContainer" containerID="dfcab6d899b1212bd971ed9361e63cd19363ddee7d6637335cdbac95f7926ba1" Dec 05 20:44:25 crc kubenswrapper[4904]: E1205 20:44:25.681955 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:44:25 crc kubenswrapper[4904]: I1205 20:44:25.701538 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qg8xq" podStartSLOduration=2.135200134 podStartE2EDuration="2.701513963s" podCreationTimestamp="2025-12-05 20:44:23 +0000 UTC" firstStartedPulling="2025-12-05 20:44:24.681162004 +0000 UTC m=+1963.492378113" lastFinishedPulling="2025-12-05 20:44:25.247475833 +0000 UTC m=+1964.058691942" observedRunningTime="2025-12-05 20:44:25.700171476 +0000 UTC m=+1964.511387605" watchObservedRunningTime="2025-12-05 20:44:25.701513963 +0000 UTC m=+1964.512730072" Dec 05 20:44:25 crc kubenswrapper[4904]: I1205 20:44:25.703211 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qg8xq" 
event={"ID":"192437e7-c4fd-4142-94fb-e3f2a9c75841","Type":"ContainerStarted","Data":"038fe80fa348830cb319804b1176fa36e87dee78d661d351be9d4c008e548dfc"} Dec 05 20:44:25 crc kubenswrapper[4904]: I1205 20:44:25.703259 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qg8xq" event={"ID":"192437e7-c4fd-4142-94fb-e3f2a9c75841","Type":"ContainerStarted","Data":"c86f5909780e8b86ee23599ef7479ed4dae40df6f9769412537537d725510a0a"} Dec 05 20:44:30 crc kubenswrapper[4904]: I1205 20:44:30.737429 4904 generic.go:334] "Generic (PLEG): container finished" podID="192437e7-c4fd-4142-94fb-e3f2a9c75841" containerID="038fe80fa348830cb319804b1176fa36e87dee78d661d351be9d4c008e548dfc" exitCode=0 Dec 05 20:44:30 crc kubenswrapper[4904]: I1205 20:44:30.737625 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qg8xq" event={"ID":"192437e7-c4fd-4142-94fb-e3f2a9c75841","Type":"ContainerDied","Data":"038fe80fa348830cb319804b1176fa36e87dee78d661d351be9d4c008e548dfc"} Dec 05 20:44:32 crc kubenswrapper[4904]: I1205 20:44:32.231481 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qg8xq" Dec 05 20:44:32 crc kubenswrapper[4904]: I1205 20:44:32.394195 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/192437e7-c4fd-4142-94fb-e3f2a9c75841-inventory\") pod \"192437e7-c4fd-4142-94fb-e3f2a9c75841\" (UID: \"192437e7-c4fd-4142-94fb-e3f2a9c75841\") " Dec 05 20:44:32 crc kubenswrapper[4904]: I1205 20:44:32.394367 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/192437e7-c4fd-4142-94fb-e3f2a9c75841-ssh-key\") pod \"192437e7-c4fd-4142-94fb-e3f2a9c75841\" (UID: \"192437e7-c4fd-4142-94fb-e3f2a9c75841\") " Dec 05 20:44:32 crc kubenswrapper[4904]: I1205 20:44:32.394790 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfqb9\" (UniqueName: \"kubernetes.io/projected/192437e7-c4fd-4142-94fb-e3f2a9c75841-kube-api-access-lfqb9\") pod \"192437e7-c4fd-4142-94fb-e3f2a9c75841\" (UID: \"192437e7-c4fd-4142-94fb-e3f2a9c75841\") " Dec 05 20:44:32 crc kubenswrapper[4904]: I1205 20:44:32.399572 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/192437e7-c4fd-4142-94fb-e3f2a9c75841-kube-api-access-lfqb9" (OuterVolumeSpecName: "kube-api-access-lfqb9") pod "192437e7-c4fd-4142-94fb-e3f2a9c75841" (UID: "192437e7-c4fd-4142-94fb-e3f2a9c75841"). InnerVolumeSpecName "kube-api-access-lfqb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:44:32 crc kubenswrapper[4904]: I1205 20:44:32.427087 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/192437e7-c4fd-4142-94fb-e3f2a9c75841-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "192437e7-c4fd-4142-94fb-e3f2a9c75841" (UID: "192437e7-c4fd-4142-94fb-e3f2a9c75841"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:44:32 crc kubenswrapper[4904]: I1205 20:44:32.427687 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/192437e7-c4fd-4142-94fb-e3f2a9c75841-inventory" (OuterVolumeSpecName: "inventory") pod "192437e7-c4fd-4142-94fb-e3f2a9c75841" (UID: "192437e7-c4fd-4142-94fb-e3f2a9c75841"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:44:32 crc kubenswrapper[4904]: I1205 20:44:32.497279 4904 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/192437e7-c4fd-4142-94fb-e3f2a9c75841-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 20:44:32 crc kubenswrapper[4904]: I1205 20:44:32.497306 4904 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/192437e7-c4fd-4142-94fb-e3f2a9c75841-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 20:44:32 crc kubenswrapper[4904]: I1205 20:44:32.497315 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfqb9\" (UniqueName: \"kubernetes.io/projected/192437e7-c4fd-4142-94fb-e3f2a9c75841-kube-api-access-lfqb9\") on node \"crc\" DevicePath \"\"" Dec 05 20:44:32 crc kubenswrapper[4904]: I1205 20:44:32.762540 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qg8xq" event={"ID":"192437e7-c4fd-4142-94fb-e3f2a9c75841","Type":"ContainerDied","Data":"c86f5909780e8b86ee23599ef7479ed4dae40df6f9769412537537d725510a0a"} Dec 05 20:44:32 crc kubenswrapper[4904]: I1205 20:44:32.762595 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c86f5909780e8b86ee23599ef7479ed4dae40df6f9769412537537d725510a0a" Dec 05 20:44:32 crc kubenswrapper[4904]: I1205 20:44:32.762613 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qg8xq" Dec 05 20:44:32 crc kubenswrapper[4904]: I1205 20:44:32.890229 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-ww8dd"] Dec 05 20:44:32 crc kubenswrapper[4904]: E1205 20:44:32.890748 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="192437e7-c4fd-4142-94fb-e3f2a9c75841" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 05 20:44:32 crc kubenswrapper[4904]: I1205 20:44:32.890770 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="192437e7-c4fd-4142-94fb-e3f2a9c75841" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 05 20:44:32 crc kubenswrapper[4904]: I1205 20:44:32.891077 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="192437e7-c4fd-4142-94fb-e3f2a9c75841" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 05 20:44:32 crc kubenswrapper[4904]: I1205 20:44:32.891839 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ww8dd" Dec 05 20:44:32 crc kubenswrapper[4904]: I1205 20:44:32.895944 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 20:44:32 crc kubenswrapper[4904]: I1205 20:44:32.896126 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 20:44:32 crc kubenswrapper[4904]: I1205 20:44:32.896358 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 20:44:32 crc kubenswrapper[4904]: I1205 20:44:32.896542 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pf9pw" Dec 05 20:44:32 crc kubenswrapper[4904]: I1205 20:44:32.905357 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-ww8dd"] Dec 05 20:44:33 crc kubenswrapper[4904]: I1205 20:44:33.008613 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ww8dd\" (UID: \"1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ww8dd" Dec 05 20:44:33 crc kubenswrapper[4904]: I1205 20:44:33.008967 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ww8dd\" (UID: \"1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ww8dd" Dec 05 20:44:33 crc kubenswrapper[4904]: I1205 20:44:33.009022 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrq7q\" (UniqueName: \"kubernetes.io/projected/1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9-kube-api-access-xrq7q\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ww8dd\" (UID: \"1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ww8dd" Dec 05 20:44:33 crc kubenswrapper[4904]: I1205 20:44:33.111479 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ww8dd\" (UID: \"1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ww8dd" Dec 05 20:44:33 crc kubenswrapper[4904]: I1205 20:44:33.111636 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ww8dd\" (UID: \"1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ww8dd" Dec 05 20:44:33 crc kubenswrapper[4904]: I1205 20:44:33.111765 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrq7q\" (UniqueName: \"kubernetes.io/projected/1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9-kube-api-access-xrq7q\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ww8dd\" (UID: 
\"1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ww8dd" Dec 05 20:44:33 crc kubenswrapper[4904]: I1205 20:44:33.117152 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ww8dd\" (UID: \"1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ww8dd" Dec 05 20:44:33 crc kubenswrapper[4904]: I1205 20:44:33.117846 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ww8dd\" (UID: \"1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ww8dd" Dec 05 20:44:33 crc kubenswrapper[4904]: I1205 20:44:33.130887 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrq7q\" (UniqueName: \"kubernetes.io/projected/1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9-kube-api-access-xrq7q\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ww8dd\" (UID: \"1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ww8dd" Dec 05 20:44:33 crc kubenswrapper[4904]: I1205 20:44:33.216871 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ww8dd" Dec 05 20:44:33 crc kubenswrapper[4904]: I1205 20:44:33.803305 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-ww8dd"] Dec 05 20:44:34 crc kubenswrapper[4904]: I1205 20:44:34.789435 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ww8dd" event={"ID":"1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9","Type":"ContainerStarted","Data":"c4821a608051942f6d95c4417fe1e5e5d04b8ff9646da2fae5103e2287cd6827"} Dec 05 20:44:34 crc kubenswrapper[4904]: I1205 20:44:34.789855 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ww8dd" event={"ID":"1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9","Type":"ContainerStarted","Data":"7837e11d94a38f2712699673111bb7503af2de4631fc2408eba7af7517e85cb6"} Dec 05 20:44:34 crc kubenswrapper[4904]: I1205 20:44:34.827925 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ww8dd" podStartSLOduration=2.220358771 podStartE2EDuration="2.827898283s" podCreationTimestamp="2025-12-05 20:44:32 +0000 UTC" firstStartedPulling="2025-12-05 20:44:33.811357767 +0000 UTC m=+1972.622573876" lastFinishedPulling="2025-12-05 20:44:34.418897289 +0000 UTC m=+1973.230113388" observedRunningTime="2025-12-05 20:44:34.816687647 +0000 UTC m=+1973.627903786" watchObservedRunningTime="2025-12-05 20:44:34.827898283 +0000 UTC m=+1973.639114412" Dec 05 20:44:37 crc kubenswrapper[4904]: I1205 20:44:37.682582 4904 scope.go:117] "RemoveContainer" containerID="dfcab6d899b1212bd971ed9361e63cd19363ddee7d6637335cdbac95f7926ba1" Dec 05 20:44:37 crc kubenswrapper[4904]: E1205 20:44:37.684047 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
Dec 05 20:44:37 crc kubenswrapper[4904]: E1205 20:44:37.684047 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:44:50 crc kubenswrapper[4904]: I1205 20:44:50.060867 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5wbq6"] Dec 05 20:44:50 crc kubenswrapper[4904]: I1205 20:44:50.070981 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5wbq6"] Dec 05 20:44:51 crc kubenswrapper[4904]: I1205 20:44:51.694659 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93" path="/var/lib/kubelet/pods/1418cfd9-04c9-4bd5-be8b-6f5b3ddf9d93/volumes" Dec 05 20:44:52 crc kubenswrapper[4904]: I1205 20:44:52.681184 4904 scope.go:117] "RemoveContainer" containerID="dfcab6d899b1212bd971ed9361e63cd19363ddee7d6637335cdbac95f7926ba1" Dec 05 20:44:52 crc kubenswrapper[4904]: E1205 20:44:52.682029 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:45:00 crc kubenswrapper[4904]: I1205 20:45:00.154630 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416125-59tk7"]
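
Each kubenswrapper record above pairs a journald prefix (date, host, unit[pid]) with a klog header: severity letter plus MMDD, wall-clock time, PID, source file:line, then a structured message. A rough Go sketch that splits out the klog fields from one of these entries; a production parser would also need to handle quoting and multi-line payloads:

package main

import (
	"fmt"
	"regexp"
)

// klogHeader matches the klog prefix used by kubenswrapper above:
// severity letter + MMDD, time, PID, source file:line, then the message.
var klogHeader = regexp.MustCompile(`^([IWEF])(\d{4}) (\d{2}:\d{2}:\d{2}\.\d+)\s+(\d+) ([\w./-]+:\d+)\] (.*)$`)

func main() {
	line := `I1205 20:45:00.154630 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416125-59tk7"]`
	m := klogHeader.FindStringSubmatch(line)
	if m == nil {
		fmt.Println("not a klog line")
		return
	}
	fmt.Printf("severity=%s date=%s time=%s pid=%s source=%s\nmessage=%s\n",
		m[1], m[2], m[3], m[4], m[5], m[6])
}

Filtering on the source field (for example pod_workers.go or reconciler_common.go) is a quick way to isolate one subsystem's records in a stream like this.
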
Dec 05 20:45:00 crc kubenswrapper[4904]: I1205 20:45:00.157336 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-59tk7" Dec 05 20:45:00 crc kubenswrapper[4904]: I1205 20:45:00.167872 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 20:45:00 crc kubenswrapper[4904]: I1205 20:45:00.168209 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 20:45:00 crc kubenswrapper[4904]: I1205 20:45:00.184586 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416125-59tk7"] Dec 05 20:45:00 crc kubenswrapper[4904]: I1205 20:45:00.286250 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9336dc5-137a-48b7-910c-bfeae84e73f8-config-volume\") pod \"collect-profiles-29416125-59tk7\" (UID: \"b9336dc5-137a-48b7-910c-bfeae84e73f8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-59tk7" Dec 05 20:45:00 crc kubenswrapper[4904]: I1205 20:45:00.286308 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58n7q\" (UniqueName: \"kubernetes.io/projected/b9336dc5-137a-48b7-910c-bfeae84e73f8-kube-api-access-58n7q\") pod \"collect-profiles-29416125-59tk7\" (UID: \"b9336dc5-137a-48b7-910c-bfeae84e73f8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-59tk7" Dec 05 20:45:00 crc kubenswrapper[4904]: I1205 20:45:00.286337 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b9336dc5-137a-48b7-910c-bfeae84e73f8-secret-volume\") pod \"collect-profiles-29416125-59tk7\" (UID: \"b9336dc5-137a-48b7-910c-bfeae84e73f8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-59tk7" Dec 05 20:45:00 crc kubenswrapper[4904]: I1205 20:45:00.388839 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9336dc5-137a-48b7-910c-bfeae84e73f8-config-volume\") pod \"collect-profiles-29416125-59tk7\" (UID: \"b9336dc5-137a-48b7-910c-bfeae84e73f8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-59tk7" Dec 05 20:45:00 crc kubenswrapper[4904]: I1205 20:45:00.388891 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58n7q\" (UniqueName: \"kubernetes.io/projected/b9336dc5-137a-48b7-910c-bfeae84e73f8-kube-api-access-58n7q\") pod \"collect-profiles-29416125-59tk7\" (UID: \"b9336dc5-137a-48b7-910c-bfeae84e73f8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-59tk7" Dec 05 20:45:00 crc kubenswrapper[4904]: I1205 20:45:00.388934 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b9336dc5-137a-48b7-910c-bfeae84e73f8-secret-volume\") pod \"collect-profiles-29416125-59tk7\" (UID: \"b9336dc5-137a-48b7-910c-bfeae84e73f8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-59tk7" Dec 05 20:45:00 crc kubenswrapper[4904]: I1205 20:45:00.389867 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9336dc5-137a-48b7-910c-bfeae84e73f8-config-volume\") pod
\"collect-profiles-29416125-59tk7\" (UID: \"b9336dc5-137a-48b7-910c-bfeae84e73f8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-59tk7" Dec 05 20:45:00 crc kubenswrapper[4904]: I1205 20:45:00.395176 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b9336dc5-137a-48b7-910c-bfeae84e73f8-secret-volume\") pod \"collect-profiles-29416125-59tk7\" (UID: \"b9336dc5-137a-48b7-910c-bfeae84e73f8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-59tk7" Dec 05 20:45:00 crc kubenswrapper[4904]: I1205 20:45:00.415148 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58n7q\" (UniqueName: \"kubernetes.io/projected/b9336dc5-137a-48b7-910c-bfeae84e73f8-kube-api-access-58n7q\") pod \"collect-profiles-29416125-59tk7\" (UID: \"b9336dc5-137a-48b7-910c-bfeae84e73f8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-59tk7" Dec 05 20:45:00 crc kubenswrapper[4904]: I1205 20:45:00.499657 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-59tk7" Dec 05 20:45:00 crc kubenswrapper[4904]: I1205 20:45:00.972149 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416125-59tk7"] Dec 05 20:45:01 crc kubenswrapper[4904]: I1205 20:45:01.064982 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-59tk7" event={"ID":"b9336dc5-137a-48b7-910c-bfeae84e73f8","Type":"ContainerStarted","Data":"7902901b3e40b1bb3932e4281c0d66228278a39c0ad77dd65d4a256df6b71d6c"} Dec 05 20:45:02 crc kubenswrapper[4904]: I1205 20:45:02.077186 4904 generic.go:334] "Generic (PLEG): container finished" podID="b9336dc5-137a-48b7-910c-bfeae84e73f8" containerID="228b32c7650604773aa3265df6ef4692a8cc6824d7a894885dc55d3c06b2b301" exitCode=0 Dec 05 20:45:02 crc kubenswrapper[4904]: I1205 20:45:02.077296 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-59tk7" event={"ID":"b9336dc5-137a-48b7-910c-bfeae84e73f8","Type":"ContainerDied","Data":"228b32c7650604773aa3265df6ef4692a8cc6824d7a894885dc55d3c06b2b301"} Dec 05 20:45:03 crc kubenswrapper[4904]: I1205 20:45:03.413863 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-59tk7" Dec 05 20:45:03 crc kubenswrapper[4904]: I1205 20:45:03.454832 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9336dc5-137a-48b7-910c-bfeae84e73f8-config-volume\") pod \"b9336dc5-137a-48b7-910c-bfeae84e73f8\" (UID: \"b9336dc5-137a-48b7-910c-bfeae84e73f8\") " Dec 05 20:45:03 crc kubenswrapper[4904]: I1205 20:45:03.454909 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b9336dc5-137a-48b7-910c-bfeae84e73f8-secret-volume\") pod \"b9336dc5-137a-48b7-910c-bfeae84e73f8\" (UID: \"b9336dc5-137a-48b7-910c-bfeae84e73f8\") " Dec 05 20:45:03 crc kubenswrapper[4904]: I1205 20:45:03.454959 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58n7q\" (UniqueName: \"kubernetes.io/projected/b9336dc5-137a-48b7-910c-bfeae84e73f8-kube-api-access-58n7q\") pod \"b9336dc5-137a-48b7-910c-bfeae84e73f8\" (UID: \"b9336dc5-137a-48b7-910c-bfeae84e73f8\") " Dec 05 20:45:03 crc kubenswrapper[4904]: I1205 20:45:03.459326 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9336dc5-137a-48b7-910c-bfeae84e73f8-config-volume" (OuterVolumeSpecName: "config-volume") pod "b9336dc5-137a-48b7-910c-bfeae84e73f8" (UID: "b9336dc5-137a-48b7-910c-bfeae84e73f8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:45:03 crc kubenswrapper[4904]: I1205 20:45:03.465553 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9336dc5-137a-48b7-910c-bfeae84e73f8-kube-api-access-58n7q" (OuterVolumeSpecName: "kube-api-access-58n7q") pod "b9336dc5-137a-48b7-910c-bfeae84e73f8" (UID: "b9336dc5-137a-48b7-910c-bfeae84e73f8"). InnerVolumeSpecName "kube-api-access-58n7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:45:03 crc kubenswrapper[4904]: I1205 20:45:03.474006 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9336dc5-137a-48b7-910c-bfeae84e73f8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b9336dc5-137a-48b7-910c-bfeae84e73f8" (UID: "b9336dc5-137a-48b7-910c-bfeae84e73f8"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:45:03 crc kubenswrapper[4904]: I1205 20:45:03.561734 4904 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b9336dc5-137a-48b7-910c-bfeae84e73f8-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 20:45:03 crc kubenswrapper[4904]: I1205 20:45:03.561767 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58n7q\" (UniqueName: \"kubernetes.io/projected/b9336dc5-137a-48b7-910c-bfeae84e73f8-kube-api-access-58n7q\") on node \"crc\" DevicePath \"\"" Dec 05 20:45:03 crc kubenswrapper[4904]: I1205 20:45:03.561778 4904 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9336dc5-137a-48b7-910c-bfeae84e73f8-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 20:45:04 crc kubenswrapper[4904]: I1205 20:45:04.100242 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-59tk7" event={"ID":"b9336dc5-137a-48b7-910c-bfeae84e73f8","Type":"ContainerDied","Data":"7902901b3e40b1bb3932e4281c0d66228278a39c0ad77dd65d4a256df6b71d6c"} Dec 05 20:45:04 crc kubenswrapper[4904]: I1205 20:45:04.100285 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7902901b3e40b1bb3932e4281c0d66228278a39c0ad77dd65d4a256df6b71d6c" Dec 05 20:45:04 crc kubenswrapper[4904]: I1205 20:45:04.100418 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-59tk7" Dec 05 20:45:04 crc kubenswrapper[4904]: I1205 20:45:04.522669 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416080-82bgg"] Dec 05 20:45:04 crc kubenswrapper[4904]: I1205 20:45:04.533465 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416080-82bgg"] Dec 05 20:45:05 crc kubenswrapper[4904]: I1205 20:45:05.693257 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8f5a146-90e5-47e2-a639-78a09eb00231" path="/var/lib/kubelet/pods/d8f5a146-90e5-47e2-a639-78a09eb00231/volumes" Dec 05 20:45:06 crc kubenswrapper[4904]: I1205 20:45:06.681999 4904 scope.go:117] "RemoveContainer" containerID="dfcab6d899b1212bd971ed9361e63cd19363ddee7d6637335cdbac95f7926ba1" Dec 05 20:45:07 crc kubenswrapper[4904]: I1205 20:45:07.133433 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" event={"ID":"1cc24b64-e25f-4b55-9123-295388685e7a","Type":"ContainerStarted","Data":"bb0e1451e9627bb70e29cee9dabc1d09d0d2166e98c624a8840a2d2703a502b5"} Dec 05 20:45:09 crc kubenswrapper[4904]: I1205 20:45:09.030448 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-gksp8"] Dec 05 20:45:09 crc kubenswrapper[4904]: I1205 20:45:09.042689 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-gksp8"] Dec 05 20:45:09 crc kubenswrapper[4904]: I1205 20:45:09.692781 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9179375b-3935-4856-ae04-76eda9c640e8" path="/var/lib/kubelet/pods/9179375b-3935-4856-ae04-76eda9c640e8/volumes" Dec 05 20:45:12 crc kubenswrapper[4904]: I1205 20:45:12.029810 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-gtcmj"] Dec 
05 20:45:12 crc kubenswrapper[4904]: I1205 20:45:12.041041 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-gtcmj"] Dec 05 20:45:13 crc kubenswrapper[4904]: I1205 20:45:13.695788 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9eb22604-2972-49be-8650-7d10a049f6a1" path="/var/lib/kubelet/pods/9eb22604-2972-49be-8650-7d10a049f6a1/volumes" Dec 05 20:45:17 crc kubenswrapper[4904]: I1205 20:45:17.259348 4904 generic.go:334] "Generic (PLEG): container finished" podID="1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9" containerID="c4821a608051942f6d95c4417fe1e5e5d04b8ff9646da2fae5103e2287cd6827" exitCode=0 Dec 05 20:45:17 crc kubenswrapper[4904]: I1205 20:45:17.259427 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ww8dd" event={"ID":"1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9","Type":"ContainerDied","Data":"c4821a608051942f6d95c4417fe1e5e5d04b8ff9646da2fae5103e2287cd6827"} Dec 05 20:45:18 crc kubenswrapper[4904]: I1205 20:45:18.733639 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ww8dd" Dec 05 20:45:18 crc kubenswrapper[4904]: I1205 20:45:18.896286 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9-inventory\") pod \"1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9\" (UID: \"1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9\") " Dec 05 20:45:18 crc kubenswrapper[4904]: I1205 20:45:18.896383 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrq7q\" (UniqueName: \"kubernetes.io/projected/1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9-kube-api-access-xrq7q\") pod \"1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9\" (UID: \"1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9\") " Dec 05 20:45:18 crc kubenswrapper[4904]: I1205 20:45:18.896413 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9-ssh-key\") pod \"1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9\" (UID: \"1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9\") " Dec 05 20:45:18 crc kubenswrapper[4904]: I1205 20:45:18.921928 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9-kube-api-access-xrq7q" (OuterVolumeSpecName: "kube-api-access-xrq7q") pod "1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9" (UID: "1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9"). InnerVolumeSpecName "kube-api-access-xrq7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:45:18 crc kubenswrapper[4904]: I1205 20:45:18.933219 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9-inventory" (OuterVolumeSpecName: "inventory") pod "1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9" (UID: "1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:45:18 crc kubenswrapper[4904]: I1205 20:45:18.957254 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9" (UID: "1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:45:19 crc kubenswrapper[4904]: I1205 20:45:18.999978 4904 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 20:45:19 crc kubenswrapper[4904]: I1205 20:45:19.000027 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrq7q\" (UniqueName: \"kubernetes.io/projected/1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9-kube-api-access-xrq7q\") on node \"crc\" DevicePath \"\"" Dec 05 20:45:19 crc kubenswrapper[4904]: I1205 20:45:19.000042 4904 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 20:45:19 crc kubenswrapper[4904]: I1205 20:45:19.003223 4904 scope.go:117] "RemoveContainer" containerID="6f34a1968f9774c914616262214de79af0e4edacb516f81c7ca100eddcf2cd25" Dec 05 20:45:19 crc kubenswrapper[4904]: I1205 20:45:19.042774 4904 scope.go:117] "RemoveContainer" containerID="15ec064905ae598d515b775f44fc5816bb7cc6bb5ad621cd03fdd1c25fce23d4" Dec 05 20:45:19 crc kubenswrapper[4904]: I1205 20:45:19.087492 4904 scope.go:117] "RemoveContainer" containerID="e340709f8fedd38a304878a78a010292189f0207c0d346ba8f16ce657c950188" Dec 05 20:45:19 crc kubenswrapper[4904]: I1205 20:45:19.126397 4904 scope.go:117] "RemoveContainer" containerID="7f8141994963d9b94345c55dbd045be2ef544d0928b51fb3dd7567ff4c83a4f6" Dec 05 20:45:19 crc kubenswrapper[4904]: I1205 20:45:19.147482 4904 scope.go:117] "RemoveContainer" containerID="d467899d124c42ed75c8a3219cd306e566a45a6a9aef6c12eb70dcda9fe65448" Dec 05 20:45:19 crc kubenswrapper[4904]: I1205 20:45:19.165877 4904 scope.go:117] "RemoveContainer" containerID="399c17b9eab778eecd4bbb77b6381cef2dc8225840f0f905e537be42830ad005" Dec 05 20:45:19 crc kubenswrapper[4904]: I1205 20:45:19.184968 4904 scope.go:117] "RemoveContainer" containerID="c0d4254eb1ceb2984a9dae5fd15cb8426a74d073044fbda84b9c899e0b41e03b" Dec 05 20:45:19 crc kubenswrapper[4904]: I1205 20:45:19.203225 4904 scope.go:117] "RemoveContainer" containerID="a0846ba4670e5481c10e3461c932d9f5e3d5a6f8df5d8bc05a0058c74f0e1cc1" Dec 05 20:45:19 crc kubenswrapper[4904]: I1205 20:45:19.235404 4904 scope.go:117] "RemoveContainer" containerID="5b641c4bef02bf61d3d4d86c900709a3bb8faede9fb0a5d3d324a200a7a104f2" Dec 05 20:45:19 crc kubenswrapper[4904]: I1205 20:45:19.269937 4904 scope.go:117] "RemoveContainer" containerID="272fb30c80cf98366309f6898c012cb7513721dcc7c48aeec0773ee1583a0a2a" Dec 05 20:45:19 crc kubenswrapper[4904]: I1205 20:45:19.301699 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ww8dd" event={"ID":"1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9","Type":"ContainerDied","Data":"7837e11d94a38f2712699673111bb7503af2de4631fc2408eba7af7517e85cb6"} Dec 05 20:45:19 crc kubenswrapper[4904]: I1205 20:45:19.301733 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7837e11d94a38f2712699673111bb7503af2de4631fc2408eba7af7517e85cb6" Dec 05 20:45:19 crc kubenswrapper[4904]: I1205 20:45:19.301768 4904 util.go:48] "No ready sandbox for pod can be found. 
Dec 05 20:45:19 crc kubenswrapper[4904]: I1205 20:45:19.400135 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ddmgb"]
Dec 05 20:45:19 crc kubenswrapper[4904]: E1205 20:45:19.401020 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9336dc5-137a-48b7-910c-bfeae84e73f8" containerName="collect-profiles"
Dec 05 20:45:19 crc kubenswrapper[4904]: I1205 20:45:19.401131 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9336dc5-137a-48b7-910c-bfeae84e73f8" containerName="collect-profiles"
Dec 05 20:45:19 crc kubenswrapper[4904]: E1205 20:45:19.401346 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Dec 05 20:45:19 crc kubenswrapper[4904]: I1205 20:45:19.401447 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Dec 05 20:45:19 crc kubenswrapper[4904]: I1205 20:45:19.401766 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Dec 05 20:45:19 crc kubenswrapper[4904]: I1205 20:45:19.401880 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9336dc5-137a-48b7-910c-bfeae84e73f8" containerName="collect-profiles"
Dec 05 20:45:19 crc kubenswrapper[4904]: I1205 20:45:19.402693 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ddmgb"
Dec 05 20:45:19 crc kubenswrapper[4904]: I1205 20:45:19.408384 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ddmgb"]
Dec 05 20:45:19 crc kubenswrapper[4904]: I1205 20:45:19.410965 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pf9pw"
Dec 05 20:45:19 crc kubenswrapper[4904]: I1205 20:45:19.411214 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 05 20:45:19 crc kubenswrapper[4904]: I1205 20:45:19.411283 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 05 20:45:19 crc kubenswrapper[4904]: I1205 20:45:19.411394 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 05 20:45:19 crc kubenswrapper[4904]: I1205 20:45:19.513218 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj8fs\" (UniqueName: \"kubernetes.io/projected/d1b27fa3-93b0-4c28-9158-8de58adc4799-kube-api-access-qj8fs\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ddmgb\" (UID: \"d1b27fa3-93b0-4c28-9158-8de58adc4799\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ddmgb"
Dec 05 20:45:19 crc kubenswrapper[4904]: I1205 20:45:19.513305 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d1b27fa3-93b0-4c28-9158-8de58adc4799-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ddmgb\" (UID: \"d1b27fa3-93b0-4c28-9158-8de58adc4799\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ddmgb"
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ddmgb" Dec 05 20:45:19 crc kubenswrapper[4904]: I1205 20:45:19.513334 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1b27fa3-93b0-4c28-9158-8de58adc4799-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ddmgb\" (UID: \"d1b27fa3-93b0-4c28-9158-8de58adc4799\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ddmgb" Dec 05 20:45:19 crc kubenswrapper[4904]: I1205 20:45:19.615433 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d1b27fa3-93b0-4c28-9158-8de58adc4799-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ddmgb\" (UID: \"d1b27fa3-93b0-4c28-9158-8de58adc4799\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ddmgb" Dec 05 20:45:19 crc kubenswrapper[4904]: I1205 20:45:19.615494 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1b27fa3-93b0-4c28-9158-8de58adc4799-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ddmgb\" (UID: \"d1b27fa3-93b0-4c28-9158-8de58adc4799\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ddmgb" Dec 05 20:45:19 crc kubenswrapper[4904]: I1205 20:45:19.615699 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj8fs\" (UniqueName: \"kubernetes.io/projected/d1b27fa3-93b0-4c28-9158-8de58adc4799-kube-api-access-qj8fs\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ddmgb\" (UID: \"d1b27fa3-93b0-4c28-9158-8de58adc4799\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ddmgb" Dec 05 20:45:19 crc kubenswrapper[4904]: I1205 20:45:19.619938 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1b27fa3-93b0-4c28-9158-8de58adc4799-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ddmgb\" (UID: \"d1b27fa3-93b0-4c28-9158-8de58adc4799\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ddmgb" Dec 05 20:45:19 crc kubenswrapper[4904]: I1205 20:45:19.623535 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d1b27fa3-93b0-4c28-9158-8de58adc4799-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ddmgb\" (UID: \"d1b27fa3-93b0-4c28-9158-8de58adc4799\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ddmgb" Dec 05 20:45:19 crc kubenswrapper[4904]: I1205 20:45:19.632602 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj8fs\" (UniqueName: \"kubernetes.io/projected/d1b27fa3-93b0-4c28-9158-8de58adc4799-kube-api-access-qj8fs\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ddmgb\" (UID: \"d1b27fa3-93b0-4c28-9158-8de58adc4799\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ddmgb" Dec 05 20:45:19 crc kubenswrapper[4904]: I1205 20:45:19.736370 4904 util.go:30] "No sandbox for pod can be found. 
Dec 05 20:45:20 crc kubenswrapper[4904]: I1205 20:45:20.083472 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ddmgb"]
Dec 05 20:45:20 crc kubenswrapper[4904]: I1205 20:45:20.331829 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ddmgb" event={"ID":"d1b27fa3-93b0-4c28-9158-8de58adc4799","Type":"ContainerStarted","Data":"305c3228555c732c66d83dfe7956ea3c064f584b36394c29a3c33425cc103bfe"}
Dec 05 20:45:21 crc kubenswrapper[4904]: I1205 20:45:21.343088 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ddmgb" event={"ID":"d1b27fa3-93b0-4c28-9158-8de58adc4799","Type":"ContainerStarted","Data":"26e2384b1ce8c8ae366fd3c204fa359464ad9226fbd62db110575d045689392d"}
Dec 05 20:45:21 crc kubenswrapper[4904]: I1205 20:45:21.374033 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ddmgb" podStartSLOduration=1.9522427850000001 podStartE2EDuration="2.373995646s" podCreationTimestamp="2025-12-05 20:45:19 +0000 UTC" firstStartedPulling="2025-12-05 20:45:20.08178527 +0000 UTC m=+2018.893001389" lastFinishedPulling="2025-12-05 20:45:20.503538111 +0000 UTC m=+2019.314754250" observedRunningTime="2025-12-05 20:45:21.367042938 +0000 UTC m=+2020.178259057" watchObservedRunningTime="2025-12-05 20:45:21.373995646 +0000 UTC m=+2020.185211765"
Dec 05 20:45:54 crc kubenswrapper[4904]: I1205 20:45:54.054329 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-hpprd"]
Dec 05 20:45:54 crc kubenswrapper[4904]: I1205 20:45:54.066726 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-hpprd"]
Dec 05 20:45:55 crc kubenswrapper[4904]: I1205 20:45:55.691920 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28682f8e-fc47-4366-bfdf-14c91d0d0aba" path="/var/lib/kubelet/pods/28682f8e-fc47-4366-bfdf-14c91d0d0aba/volumes"
Dec 05 20:46:17 crc kubenswrapper[4904]: I1205 20:46:17.891598 4904 generic.go:334] "Generic (PLEG): container finished" podID="d1b27fa3-93b0-4c28-9158-8de58adc4799" containerID="26e2384b1ce8c8ae366fd3c204fa359464ad9226fbd62db110575d045689392d" exitCode=0
Dec 05 20:46:17 crc kubenswrapper[4904]: I1205 20:46:17.891695 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ddmgb" event={"ID":"d1b27fa3-93b0-4c28-9158-8de58adc4799","Type":"ContainerDied","Data":"26e2384b1ce8c8ae366fd3c204fa359464ad9226fbd62db110575d045689392d"}
Dec 05 20:46:19 crc kubenswrapper[4904]: I1205 20:46:19.336625 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ddmgb"
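
The pod_startup_latency_tracker entry above carries enough timestamps to reproduce its own numbers: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp (20:45:21.373995646 - 20:45:19 = 2.373995646s), and podStartSLOduration additionally excludes the image-pull window, lastFinishedPulling - firstStartedPulling (0.421752841s), giving 1.952242805s, which matches the logged 1.9522427850000001 up to floating-point rounding. A short check in Go using the logged values:

    package main

    import (
        "fmt"
        "time"
    )

    func mustParse(s string) time.Time {
        const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
        t, err := time.Parse(layout, s)
        if err != nil {
            panic(err)
        }
        return t
    }

    func main() {
        created := mustParse("2025-12-05 20:45:19 +0000 UTC")
        firstPull := mustParse("2025-12-05 20:45:20.08178527 +0000 UTC")
        lastPull := mustParse("2025-12-05 20:45:20.503538111 +0000 UTC")
        watched := mustParse("2025-12-05 20:45:21.373995646 +0000 UTC")

        e2e := watched.Sub(created)          // podStartE2EDuration
        slo := e2e - lastPull.Sub(firstPull) // podStartSLOduration: pull window excluded
        fmt.Println(e2e, slo)                // 2.373995646s 1.952242805s
    }
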
Dec 05 20:46:19 crc kubenswrapper[4904]: I1205 20:46:19.419482 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1b27fa3-93b0-4c28-9158-8de58adc4799-inventory\") pod \"d1b27fa3-93b0-4c28-9158-8de58adc4799\" (UID: \"d1b27fa3-93b0-4c28-9158-8de58adc4799\") "
Dec 05 20:46:19 crc kubenswrapper[4904]: I1205 20:46:19.419679 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qj8fs\" (UniqueName: \"kubernetes.io/projected/d1b27fa3-93b0-4c28-9158-8de58adc4799-kube-api-access-qj8fs\") pod \"d1b27fa3-93b0-4c28-9158-8de58adc4799\" (UID: \"d1b27fa3-93b0-4c28-9158-8de58adc4799\") "
Dec 05 20:46:19 crc kubenswrapper[4904]: I1205 20:46:19.419756 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d1b27fa3-93b0-4c28-9158-8de58adc4799-ssh-key\") pod \"d1b27fa3-93b0-4c28-9158-8de58adc4799\" (UID: \"d1b27fa3-93b0-4c28-9158-8de58adc4799\") "
Dec 05 20:46:19 crc kubenswrapper[4904]: I1205 20:46:19.448720 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1b27fa3-93b0-4c28-9158-8de58adc4799-kube-api-access-qj8fs" (OuterVolumeSpecName: "kube-api-access-qj8fs") pod "d1b27fa3-93b0-4c28-9158-8de58adc4799" (UID: "d1b27fa3-93b0-4c28-9158-8de58adc4799"). InnerVolumeSpecName "kube-api-access-qj8fs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:46:19 crc kubenswrapper[4904]: I1205 20:46:19.476224 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1b27fa3-93b0-4c28-9158-8de58adc4799-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d1b27fa3-93b0-4c28-9158-8de58adc4799" (UID: "d1b27fa3-93b0-4c28-9158-8de58adc4799"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:46:19 crc kubenswrapper[4904]: I1205 20:46:19.479262 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1b27fa3-93b0-4c28-9158-8de58adc4799-inventory" (OuterVolumeSpecName: "inventory") pod "d1b27fa3-93b0-4c28-9158-8de58adc4799" (UID: "d1b27fa3-93b0-4c28-9158-8de58adc4799"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:46:19 crc kubenswrapper[4904]: I1205 20:46:19.523374 4904 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1b27fa3-93b0-4c28-9158-8de58adc4799-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 20:46:19 crc kubenswrapper[4904]: I1205 20:46:19.523415 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qj8fs\" (UniqueName: \"kubernetes.io/projected/d1b27fa3-93b0-4c28-9158-8de58adc4799-kube-api-access-qj8fs\") on node \"crc\" DevicePath \"\"" Dec 05 20:46:19 crc kubenswrapper[4904]: I1205 20:46:19.523429 4904 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d1b27fa3-93b0-4c28-9158-8de58adc4799-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 20:46:19 crc kubenswrapper[4904]: I1205 20:46:19.555304 4904 scope.go:117] "RemoveContainer" containerID="d84bae1da041062aef96a208ed4ed936e8c3cfbeeb01e5611c8be7d5c3c7dbe9" Dec 05 20:46:19 crc kubenswrapper[4904]: I1205 20:46:19.920130 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ddmgb" event={"ID":"d1b27fa3-93b0-4c28-9158-8de58adc4799","Type":"ContainerDied","Data":"305c3228555c732c66d83dfe7956ea3c064f584b36394c29a3c33425cc103bfe"} Dec 05 20:46:19 crc kubenswrapper[4904]: I1205 20:46:19.920169 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="305c3228555c732c66d83dfe7956ea3c064f584b36394c29a3c33425cc103bfe" Dec 05 20:46:19 crc kubenswrapper[4904]: I1205 20:46:19.920258 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ddmgb" Dec 05 20:46:20 crc kubenswrapper[4904]: I1205 20:46:20.059246 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-g44s6"] Dec 05 20:46:20 crc kubenswrapper[4904]: E1205 20:46:20.059733 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1b27fa3-93b0-4c28-9158-8de58adc4799" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 05 20:46:20 crc kubenswrapper[4904]: I1205 20:46:20.059757 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1b27fa3-93b0-4c28-9158-8de58adc4799" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 05 20:46:20 crc kubenswrapper[4904]: I1205 20:46:20.060027 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1b27fa3-93b0-4c28-9158-8de58adc4799" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 05 20:46:20 crc kubenswrapper[4904]: I1205 20:46:20.061040 4904 util.go:30] "No sandbox for pod can be found. 
Dec 05 20:46:20 crc kubenswrapper[4904]: I1205 20:46:20.063797 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 05 20:46:20 crc kubenswrapper[4904]: I1205 20:46:20.064158 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 05 20:46:20 crc kubenswrapper[4904]: I1205 20:46:20.064893 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pf9pw"
Dec 05 20:46:20 crc kubenswrapper[4904]: I1205 20:46:20.067955 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 05 20:46:20 crc kubenswrapper[4904]: I1205 20:46:20.071218 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-g44s6"]
Dec 05 20:46:20 crc kubenswrapper[4904]: I1205 20:46:20.236942 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/78aea92b-9deb-44c8-b5ac-f9224b038591-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-g44s6\" (UID: \"78aea92b-9deb-44c8-b5ac-f9224b038591\") " pod="openstack/ssh-known-hosts-edpm-deployment-g44s6"
Dec 05 20:46:20 crc kubenswrapper[4904]: I1205 20:46:20.237025 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4s5c\" (UniqueName: \"kubernetes.io/projected/78aea92b-9deb-44c8-b5ac-f9224b038591-kube-api-access-m4s5c\") pod \"ssh-known-hosts-edpm-deployment-g44s6\" (UID: \"78aea92b-9deb-44c8-b5ac-f9224b038591\") " pod="openstack/ssh-known-hosts-edpm-deployment-g44s6"
Dec 05 20:46:20 crc kubenswrapper[4904]: I1205 20:46:20.237119 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/78aea92b-9deb-44c8-b5ac-f9224b038591-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-g44s6\" (UID: \"78aea92b-9deb-44c8-b5ac-f9224b038591\") " pod="openstack/ssh-known-hosts-edpm-deployment-g44s6"
Dec 05 20:46:20 crc kubenswrapper[4904]: I1205 20:46:20.339460 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/78aea92b-9deb-44c8-b5ac-f9224b038591-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-g44s6\" (UID: \"78aea92b-9deb-44c8-b5ac-f9224b038591\") " pod="openstack/ssh-known-hosts-edpm-deployment-g44s6"
Dec 05 20:46:20 crc kubenswrapper[4904]: I1205 20:46:20.340296 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4s5c\" (UniqueName: \"kubernetes.io/projected/78aea92b-9deb-44c8-b5ac-f9224b038591-kube-api-access-m4s5c\") pod \"ssh-known-hosts-edpm-deployment-g44s6\" (UID: \"78aea92b-9deb-44c8-b5ac-f9224b038591\") " pod="openstack/ssh-known-hosts-edpm-deployment-g44s6"
Dec 05 20:46:20 crc kubenswrapper[4904]: I1205 20:46:20.340650 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/78aea92b-9deb-44c8-b5ac-f9224b038591-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-g44s6\" (UID: \"78aea92b-9deb-44c8-b5ac-f9224b038591\") " pod="openstack/ssh-known-hosts-edpm-deployment-g44s6"
Dec 05 20:46:20 crc kubenswrapper[4904]: I1205 20:46:20.343922 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/78aea92b-9deb-44c8-b5ac-f9224b038591-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-g44s6\" (UID: \"78aea92b-9deb-44c8-b5ac-f9224b038591\") " pod="openstack/ssh-known-hosts-edpm-deployment-g44s6"
Dec 05 20:46:20 crc kubenswrapper[4904]: I1205 20:46:20.352733 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/78aea92b-9deb-44c8-b5ac-f9224b038591-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-g44s6\" (UID: \"78aea92b-9deb-44c8-b5ac-f9224b038591\") " pod="openstack/ssh-known-hosts-edpm-deployment-g44s6"
Dec 05 20:46:20 crc kubenswrapper[4904]: I1205 20:46:20.360853 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4s5c\" (UniqueName: \"kubernetes.io/projected/78aea92b-9deb-44c8-b5ac-f9224b038591-kube-api-access-m4s5c\") pod \"ssh-known-hosts-edpm-deployment-g44s6\" (UID: \"78aea92b-9deb-44c8-b5ac-f9224b038591\") " pod="openstack/ssh-known-hosts-edpm-deployment-g44s6"
Dec 05 20:46:20 crc kubenswrapper[4904]: I1205 20:46:20.378990 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-g44s6"
Dec 05 20:46:20 crc kubenswrapper[4904]: I1205 20:46:20.400960 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lznph"]
Dec 05 20:46:20 crc kubenswrapper[4904]: I1205 20:46:20.403333 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lznph"
Dec 05 20:46:20 crc kubenswrapper[4904]: I1205 20:46:20.414479 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lznph"]
Dec 05 20:46:20 crc kubenswrapper[4904]: I1205 20:46:20.551300 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3-catalog-content\") pod \"redhat-operators-lznph\" (UID: \"4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3\") " pod="openshift-marketplace/redhat-operators-lznph"
Dec 05 20:46:20 crc kubenswrapper[4904]: I1205 20:46:20.551596 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjsqw\" (UniqueName: \"kubernetes.io/projected/4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3-kube-api-access-jjsqw\") pod \"redhat-operators-lznph\" (UID: \"4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3\") " pod="openshift-marketplace/redhat-operators-lznph"
Dec 05 20:46:20 crc kubenswrapper[4904]: I1205 20:46:20.551688 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3-utilities\") pod \"redhat-operators-lznph\" (UID: \"4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3\") " pod="openshift-marketplace/redhat-operators-lznph"
Dec 05 20:46:20 crc kubenswrapper[4904]: I1205 20:46:20.653628 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3-catalog-content\") pod \"redhat-operators-lznph\" (UID: \"4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3\") " pod="openshift-marketplace/redhat-operators-lznph"
Dec 05 20:46:20 crc kubenswrapper[4904]: I1205 20:46:20.653697 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjsqw\" (UniqueName: \"kubernetes.io/projected/4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3-kube-api-access-jjsqw\") pod \"redhat-operators-lznph\" (UID: \"4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3\") " pod="openshift-marketplace/redhat-operators-lznph"
Dec 05 20:46:20 crc kubenswrapper[4904]: I1205 20:46:20.653783 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3-utilities\") pod \"redhat-operators-lznph\" (UID: \"4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3\") " pod="openshift-marketplace/redhat-operators-lznph"
Dec 05 20:46:20 crc kubenswrapper[4904]: I1205 20:46:20.654353 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3-utilities\") pod \"redhat-operators-lznph\" (UID: \"4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3\") " pod="openshift-marketplace/redhat-operators-lznph"
Dec 05 20:46:20 crc kubenswrapper[4904]: I1205 20:46:20.654582 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3-catalog-content\") pod \"redhat-operators-lznph\" (UID: \"4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3\") " pod="openshift-marketplace/redhat-operators-lznph"
Dec 05 20:46:20 crc kubenswrapper[4904]: I1205 20:46:20.676815 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjsqw\" (UniqueName: \"kubernetes.io/projected/4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3-kube-api-access-jjsqw\") pod \"redhat-operators-lznph\" (UID: \"4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3\") " pod="openshift-marketplace/redhat-operators-lznph"
Dec 05 20:46:20 crc kubenswrapper[4904]: I1205 20:46:20.844306 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lznph"
Dec 05 20:46:21 crc kubenswrapper[4904]: I1205 20:46:21.028304 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-g44s6"]
Dec 05 20:46:21 crc kubenswrapper[4904]: I1205 20:46:21.037768 4904 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 05 20:46:21 crc kubenswrapper[4904]: W1205 20:46:21.407257 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b6b7b8c_6c64_4b60_b6a5_def3e7cbe4e3.slice/crio-154d3b1f3be02eb9f11b162d61d5638a4ba4af46d9760051b95c47a1c545d759 WatchSource:0}: Error finding container 154d3b1f3be02eb9f11b162d61d5638a4ba4af46d9760051b95c47a1c545d759: Status 404 returned error can't find the container with id 154d3b1f3be02eb9f11b162d61d5638a4ba4af46d9760051b95c47a1c545d759
Dec 05 20:46:21 crc kubenswrapper[4904]: I1205 20:46:21.407738 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lznph"]
Dec 05 20:46:21 crc kubenswrapper[4904]: I1205 20:46:21.940783 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-g44s6" event={"ID":"78aea92b-9deb-44c8-b5ac-f9224b038591","Type":"ContainerStarted","Data":"5bc9550d4aa5b8b91843f0eb82c439429cc618c055f0e3bb64083aab9907a93d"}
Dec 05 20:46:21 crc kubenswrapper[4904]: I1205 20:46:21.941162 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-g44s6" event={"ID":"78aea92b-9deb-44c8-b5ac-f9224b038591","Type":"ContainerStarted","Data":"f3aea1954706046d59c0c4245f1791e79b07964f5abbaf1bdb73953ac5e183a0"}
Dec 05 20:46:21 crc kubenswrapper[4904]: I1205 20:46:21.942851 4904 generic.go:334] "Generic (PLEG): container finished" podID="4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3" containerID="738f912d377b1a63ea64362bdf28f895231fa94b65da2bf4e4f71a397249eeef" exitCode=0
Dec 05 20:46:21 crc kubenswrapper[4904]: I1205 20:46:21.942900 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lznph" event={"ID":"4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3","Type":"ContainerDied","Data":"738f912d377b1a63ea64362bdf28f895231fa94b65da2bf4e4f71a397249eeef"}
Dec 05 20:46:21 crc kubenswrapper[4904]: I1205 20:46:21.942927 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lznph" event={"ID":"4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3","Type":"ContainerStarted","Data":"154d3b1f3be02eb9f11b162d61d5638a4ba4af46d9760051b95c47a1c545d759"}
Dec 05 20:46:21 crc kubenswrapper[4904]: I1205 20:46:21.960093 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-g44s6" podStartSLOduration=1.52577253 podStartE2EDuration="1.960074003s" podCreationTimestamp="2025-12-05 20:46:20 +0000 UTC" firstStartedPulling="2025-12-05 20:46:21.037543118 +0000 UTC m=+2079.848759217" lastFinishedPulling="2025-12-05 20:46:21.471844581 +0000 UTC m=+2080.283060690" observedRunningTime="2025-12-05 20:46:21.956046173 +0000 UTC m=+2080.767262302" watchObservedRunningTime="2025-12-05 20:46:21.960074003 +0000 UTC m=+2080.771290122"
Dec 05 20:46:22 crc kubenswrapper[4904]: I1205 20:46:22.951542 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lznph" event={"ID":"4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3","Type":"ContainerStarted","Data":"a061985c538a90eb2307ec8452884255ed9f183f4fb0478047653f639f3daa0b"}
event={"ID":"4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3","Type":"ContainerStarted","Data":"a061985c538a90eb2307ec8452884255ed9f183f4fb0478047653f639f3daa0b"} Dec 05 20:46:25 crc kubenswrapper[4904]: I1205 20:46:25.996730 4904 generic.go:334] "Generic (PLEG): container finished" podID="4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3" containerID="a061985c538a90eb2307ec8452884255ed9f183f4fb0478047653f639f3daa0b" exitCode=0 Dec 05 20:46:25 crc kubenswrapper[4904]: I1205 20:46:25.996805 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lznph" event={"ID":"4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3","Type":"ContainerDied","Data":"a061985c538a90eb2307ec8452884255ed9f183f4fb0478047653f639f3daa0b"} Dec 05 20:46:27 crc kubenswrapper[4904]: I1205 20:46:27.012208 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lznph" event={"ID":"4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3","Type":"ContainerStarted","Data":"6b1bdb2eba304054b9811185f0dd9978ac33bebb4a300bfd1ed1b181fbc75894"} Dec 05 20:46:27 crc kubenswrapper[4904]: I1205 20:46:27.079522 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lznph" podStartSLOduration=2.511774595 podStartE2EDuration="7.079500513s" podCreationTimestamp="2025-12-05 20:46:20 +0000 UTC" firstStartedPulling="2025-12-05 20:46:21.944804867 +0000 UTC m=+2080.756020996" lastFinishedPulling="2025-12-05 20:46:26.512530795 +0000 UTC m=+2085.323746914" observedRunningTime="2025-12-05 20:46:27.063199338 +0000 UTC m=+2085.874415497" watchObservedRunningTime="2025-12-05 20:46:27.079500513 +0000 UTC m=+2085.890716632" Dec 05 20:46:30 crc kubenswrapper[4904]: I1205 20:46:30.042359 4904 generic.go:334] "Generic (PLEG): container finished" podID="78aea92b-9deb-44c8-b5ac-f9224b038591" containerID="5bc9550d4aa5b8b91843f0eb82c439429cc618c055f0e3bb64083aab9907a93d" exitCode=0 Dec 05 20:46:30 crc kubenswrapper[4904]: I1205 20:46:30.042472 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-g44s6" event={"ID":"78aea92b-9deb-44c8-b5ac-f9224b038591","Type":"ContainerDied","Data":"5bc9550d4aa5b8b91843f0eb82c439429cc618c055f0e3bb64083aab9907a93d"} Dec 05 20:46:30 crc kubenswrapper[4904]: I1205 20:46:30.844632 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lznph" Dec 05 20:46:30 crc kubenswrapper[4904]: I1205 20:46:30.844693 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lznph" Dec 05 20:46:31 crc kubenswrapper[4904]: I1205 20:46:31.515098 4904 util.go:48] "No ready sandbox for pod can be found. 
Dec 05 20:46:31 crc kubenswrapper[4904]: I1205 20:46:31.677810 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/78aea92b-9deb-44c8-b5ac-f9224b038591-ssh-key-openstack-edpm-ipam\") pod \"78aea92b-9deb-44c8-b5ac-f9224b038591\" (UID: \"78aea92b-9deb-44c8-b5ac-f9224b038591\") "
Dec 05 20:46:31 crc kubenswrapper[4904]: I1205 20:46:31.678182 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/78aea92b-9deb-44c8-b5ac-f9224b038591-inventory-0\") pod \"78aea92b-9deb-44c8-b5ac-f9224b038591\" (UID: \"78aea92b-9deb-44c8-b5ac-f9224b038591\") "
Dec 05 20:46:31 crc kubenswrapper[4904]: I1205 20:46:31.678472 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4s5c\" (UniqueName: \"kubernetes.io/projected/78aea92b-9deb-44c8-b5ac-f9224b038591-kube-api-access-m4s5c\") pod \"78aea92b-9deb-44c8-b5ac-f9224b038591\" (UID: \"78aea92b-9deb-44c8-b5ac-f9224b038591\") "
Dec 05 20:46:31 crc kubenswrapper[4904]: I1205 20:46:31.698416 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78aea92b-9deb-44c8-b5ac-f9224b038591-kube-api-access-m4s5c" (OuterVolumeSpecName: "kube-api-access-m4s5c") pod "78aea92b-9deb-44c8-b5ac-f9224b038591" (UID: "78aea92b-9deb-44c8-b5ac-f9224b038591"). InnerVolumeSpecName "kube-api-access-m4s5c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:46:31 crc kubenswrapper[4904]: I1205 20:46:31.718730 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78aea92b-9deb-44c8-b5ac-f9224b038591-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "78aea92b-9deb-44c8-b5ac-f9224b038591" (UID: "78aea92b-9deb-44c8-b5ac-f9224b038591"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:46:31 crc kubenswrapper[4904]: I1205 20:46:31.739227 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78aea92b-9deb-44c8-b5ac-f9224b038591-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "78aea92b-9deb-44c8-b5ac-f9224b038591" (UID: "78aea92b-9deb-44c8-b5ac-f9224b038591"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:46:31 crc kubenswrapper[4904]: I1205 20:46:31.781550 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4s5c\" (UniqueName: \"kubernetes.io/projected/78aea92b-9deb-44c8-b5ac-f9224b038591-kube-api-access-m4s5c\") on node \"crc\" DevicePath \"\"" Dec 05 20:46:31 crc kubenswrapper[4904]: I1205 20:46:31.781590 4904 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/78aea92b-9deb-44c8-b5ac-f9224b038591-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 05 20:46:31 crc kubenswrapper[4904]: I1205 20:46:31.781603 4904 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/78aea92b-9deb-44c8-b5ac-f9224b038591-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 05 20:46:31 crc kubenswrapper[4904]: I1205 20:46:31.888406 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lznph" podUID="4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3" containerName="registry-server" probeResult="failure" output=< Dec 05 20:46:31 crc kubenswrapper[4904]: timeout: failed to connect service ":50051" within 1s Dec 05 20:46:31 crc kubenswrapper[4904]: > Dec 05 20:46:32 crc kubenswrapper[4904]: I1205 20:46:32.061520 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-g44s6" event={"ID":"78aea92b-9deb-44c8-b5ac-f9224b038591","Type":"ContainerDied","Data":"f3aea1954706046d59c0c4245f1791e79b07964f5abbaf1bdb73953ac5e183a0"} Dec 05 20:46:32 crc kubenswrapper[4904]: I1205 20:46:32.061565 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3aea1954706046d59c0c4245f1791e79b07964f5abbaf1bdb73953ac5e183a0" Dec 05 20:46:32 crc kubenswrapper[4904]: I1205 20:46:32.061687 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-g44s6" Dec 05 20:46:32 crc kubenswrapper[4904]: I1205 20:46:32.152807 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-7dbn4"] Dec 05 20:46:32 crc kubenswrapper[4904]: E1205 20:46:32.153553 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78aea92b-9deb-44c8-b5ac-f9224b038591" containerName="ssh-known-hosts-edpm-deployment" Dec 05 20:46:32 crc kubenswrapper[4904]: I1205 20:46:32.153573 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="78aea92b-9deb-44c8-b5ac-f9224b038591" containerName="ssh-known-hosts-edpm-deployment" Dec 05 20:46:32 crc kubenswrapper[4904]: I1205 20:46:32.153768 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="78aea92b-9deb-44c8-b5ac-f9224b038591" containerName="ssh-known-hosts-edpm-deployment" Dec 05 20:46:32 crc kubenswrapper[4904]: I1205 20:46:32.154869 4904 util.go:30] "No sandbox for pod can be found. 
Dec 05 20:46:32 crc kubenswrapper[4904]: I1205 20:46:32.160650 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 05 20:46:32 crc kubenswrapper[4904]: I1205 20:46:32.160659 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 05 20:46:32 crc kubenswrapper[4904]: I1205 20:46:32.160948 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 05 20:46:32 crc kubenswrapper[4904]: I1205 20:46:32.161174 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pf9pw"
Dec 05 20:46:32 crc kubenswrapper[4904]: I1205 20:46:32.170392 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-7dbn4"]
Dec 05 20:46:32 crc kubenswrapper[4904]: I1205 20:46:32.319720 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88dfb504-1f6c-41bc-860e-84eeb0a7fff9-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7dbn4\" (UID: \"88dfb504-1f6c-41bc-860e-84eeb0a7fff9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7dbn4"
Dec 05 20:46:32 crc kubenswrapper[4904]: I1205 20:46:32.319776 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/88dfb504-1f6c-41bc-860e-84eeb0a7fff9-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7dbn4\" (UID: \"88dfb504-1f6c-41bc-860e-84eeb0a7fff9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7dbn4"
Dec 05 20:46:32 crc kubenswrapper[4904]: I1205 20:46:32.319810 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntlhp\" (UniqueName: \"kubernetes.io/projected/88dfb504-1f6c-41bc-860e-84eeb0a7fff9-kube-api-access-ntlhp\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7dbn4\" (UID: \"88dfb504-1f6c-41bc-860e-84eeb0a7fff9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7dbn4"
Dec 05 20:46:32 crc kubenswrapper[4904]: I1205 20:46:32.421909 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88dfb504-1f6c-41bc-860e-84eeb0a7fff9-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7dbn4\" (UID: \"88dfb504-1f6c-41bc-860e-84eeb0a7fff9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7dbn4"
Dec 05 20:46:32 crc kubenswrapper[4904]: I1205 20:46:32.421966 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/88dfb504-1f6c-41bc-860e-84eeb0a7fff9-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7dbn4\" (UID: \"88dfb504-1f6c-41bc-860e-84eeb0a7fff9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7dbn4"
Dec 05 20:46:32 crc kubenswrapper[4904]: I1205 20:46:32.421998 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntlhp\" (UniqueName: \"kubernetes.io/projected/88dfb504-1f6c-41bc-860e-84eeb0a7fff9-kube-api-access-ntlhp\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7dbn4\" (UID: \"88dfb504-1f6c-41bc-860e-84eeb0a7fff9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7dbn4"
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7dbn4" Dec 05 20:46:32 crc kubenswrapper[4904]: I1205 20:46:32.426692 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88dfb504-1f6c-41bc-860e-84eeb0a7fff9-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7dbn4\" (UID: \"88dfb504-1f6c-41bc-860e-84eeb0a7fff9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7dbn4" Dec 05 20:46:32 crc kubenswrapper[4904]: I1205 20:46:32.426916 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/88dfb504-1f6c-41bc-860e-84eeb0a7fff9-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7dbn4\" (UID: \"88dfb504-1f6c-41bc-860e-84eeb0a7fff9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7dbn4" Dec 05 20:46:32 crc kubenswrapper[4904]: I1205 20:46:32.445594 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntlhp\" (UniqueName: \"kubernetes.io/projected/88dfb504-1f6c-41bc-860e-84eeb0a7fff9-kube-api-access-ntlhp\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7dbn4\" (UID: \"88dfb504-1f6c-41bc-860e-84eeb0a7fff9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7dbn4" Dec 05 20:46:32 crc kubenswrapper[4904]: I1205 20:46:32.470753 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7dbn4" Dec 05 20:46:33 crc kubenswrapper[4904]: I1205 20:46:33.015669 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-7dbn4"] Dec 05 20:46:33 crc kubenswrapper[4904]: I1205 20:46:33.072439 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7dbn4" event={"ID":"88dfb504-1f6c-41bc-860e-84eeb0a7fff9","Type":"ContainerStarted","Data":"6a33b801b7242ff2b01a1d57fe76c299a64b8bdbea667ca1f6f4562a4fcf36d8"} Dec 05 20:46:34 crc kubenswrapper[4904]: I1205 20:46:34.082536 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7dbn4" event={"ID":"88dfb504-1f6c-41bc-860e-84eeb0a7fff9","Type":"ContainerStarted","Data":"96b5f27145b6ff58f86734d5f5e8e84d4fcbe3f1175ca2c2154d452008dae56e"} Dec 05 20:46:34 crc kubenswrapper[4904]: I1205 20:46:34.106568 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7dbn4" podStartSLOduration=1.69968459 podStartE2EDuration="2.106537324s" podCreationTimestamp="2025-12-05 20:46:32 +0000 UTC" firstStartedPulling="2025-12-05 20:46:33.030716483 +0000 UTC m=+2091.841932602" lastFinishedPulling="2025-12-05 20:46:33.437569217 +0000 UTC m=+2092.248785336" observedRunningTime="2025-12-05 20:46:34.101345652 +0000 UTC m=+2092.912561781" watchObservedRunningTime="2025-12-05 20:46:34.106537324 +0000 UTC m=+2092.917753433" Dec 05 20:46:40 crc kubenswrapper[4904]: I1205 20:46:40.890878 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lznph" Dec 05 20:46:40 crc kubenswrapper[4904]: I1205 20:46:40.951307 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lznph" Dec 05 20:46:41 crc kubenswrapper[4904]: I1205 20:46:41.134773 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-lznph"] Dec 05 20:46:42 crc kubenswrapper[4904]: I1205 20:46:42.160125 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lznph" podUID="4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3" containerName="registry-server" containerID="cri-o://6b1bdb2eba304054b9811185f0dd9978ac33bebb4a300bfd1ed1b181fbc75894" gracePeriod=2 Dec 05 20:46:42 crc kubenswrapper[4904]: I1205 20:46:42.608390 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lznph" Dec 05 20:46:42 crc kubenswrapper[4904]: I1205 20:46:42.754204 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjsqw\" (UniqueName: \"kubernetes.io/projected/4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3-kube-api-access-jjsqw\") pod \"4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3\" (UID: \"4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3\") " Dec 05 20:46:42 crc kubenswrapper[4904]: I1205 20:46:42.754298 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3-catalog-content\") pod \"4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3\" (UID: \"4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3\") " Dec 05 20:46:42 crc kubenswrapper[4904]: I1205 20:46:42.754355 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3-utilities\") pod \"4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3\" (UID: \"4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3\") " Dec 05 20:46:42 crc kubenswrapper[4904]: I1205 20:46:42.755569 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3-utilities" (OuterVolumeSpecName: "utilities") pod "4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3" (UID: "4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:46:42 crc kubenswrapper[4904]: I1205 20:46:42.760259 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3-kube-api-access-jjsqw" (OuterVolumeSpecName: "kube-api-access-jjsqw") pod "4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3" (UID: "4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3"). InnerVolumeSpecName "kube-api-access-jjsqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:46:42 crc kubenswrapper[4904]: I1205 20:46:42.858012 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjsqw\" (UniqueName: \"kubernetes.io/projected/4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3-kube-api-access-jjsqw\") on node \"crc\" DevicePath \"\"" Dec 05 20:46:42 crc kubenswrapper[4904]: I1205 20:46:42.858161 4904 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:46:42 crc kubenswrapper[4904]: I1205 20:46:42.872469 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3" (UID: "4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:46:42 crc kubenswrapper[4904]: I1205 20:46:42.962250 4904 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:46:43 crc kubenswrapper[4904]: I1205 20:46:43.171710 4904 generic.go:334] "Generic (PLEG): container finished" podID="88dfb504-1f6c-41bc-860e-84eeb0a7fff9" containerID="96b5f27145b6ff58f86734d5f5e8e84d4fcbe3f1175ca2c2154d452008dae56e" exitCode=0 Dec 05 20:46:43 crc kubenswrapper[4904]: I1205 20:46:43.171785 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7dbn4" event={"ID":"88dfb504-1f6c-41bc-860e-84eeb0a7fff9","Type":"ContainerDied","Data":"96b5f27145b6ff58f86734d5f5e8e84d4fcbe3f1175ca2c2154d452008dae56e"} Dec 05 20:46:43 crc kubenswrapper[4904]: I1205 20:46:43.174881 4904 generic.go:334] "Generic (PLEG): container finished" podID="4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3" containerID="6b1bdb2eba304054b9811185f0dd9978ac33bebb4a300bfd1ed1b181fbc75894" exitCode=0 Dec 05 20:46:43 crc kubenswrapper[4904]: I1205 20:46:43.174930 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lznph" event={"ID":"4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3","Type":"ContainerDied","Data":"6b1bdb2eba304054b9811185f0dd9978ac33bebb4a300bfd1ed1b181fbc75894"} Dec 05 20:46:43 crc kubenswrapper[4904]: I1205 20:46:43.174959 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lznph" event={"ID":"4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3","Type":"ContainerDied","Data":"154d3b1f3be02eb9f11b162d61d5638a4ba4af46d9760051b95c47a1c545d759"} Dec 05 20:46:43 crc kubenswrapper[4904]: I1205 20:46:43.174995 4904 util.go:48] "No ready sandbox for pod can be found. 
Dec 05 20:46:43 crc kubenswrapper[4904]: I1205 20:46:43.175006 4904 scope.go:117] "RemoveContainer" containerID="6b1bdb2eba304054b9811185f0dd9978ac33bebb4a300bfd1ed1b181fbc75894"
Dec 05 20:46:43 crc kubenswrapper[4904]: I1205 20:46:43.204068 4904 scope.go:117] "RemoveContainer" containerID="a061985c538a90eb2307ec8452884255ed9f183f4fb0478047653f639f3daa0b"
Dec 05 20:46:43 crc kubenswrapper[4904]: I1205 20:46:43.213112 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lznph"]
Dec 05 20:46:43 crc kubenswrapper[4904]: I1205 20:46:43.223341 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lznph"]
Dec 05 20:46:43 crc kubenswrapper[4904]: I1205 20:46:43.231316 4904 scope.go:117] "RemoveContainer" containerID="738f912d377b1a63ea64362bdf28f895231fa94b65da2bf4e4f71a397249eeef"
Dec 05 20:46:43 crc kubenswrapper[4904]: I1205 20:46:43.295423 4904 scope.go:117] "RemoveContainer" containerID="6b1bdb2eba304054b9811185f0dd9978ac33bebb4a300bfd1ed1b181fbc75894"
Dec 05 20:46:43 crc kubenswrapper[4904]: E1205 20:46:43.295837 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b1bdb2eba304054b9811185f0dd9978ac33bebb4a300bfd1ed1b181fbc75894\": container with ID starting with 6b1bdb2eba304054b9811185f0dd9978ac33bebb4a300bfd1ed1b181fbc75894 not found: ID does not exist" containerID="6b1bdb2eba304054b9811185f0dd9978ac33bebb4a300bfd1ed1b181fbc75894"
Dec 05 20:46:43 crc kubenswrapper[4904]: I1205 20:46:43.295868 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b1bdb2eba304054b9811185f0dd9978ac33bebb4a300bfd1ed1b181fbc75894"} err="failed to get container status \"6b1bdb2eba304054b9811185f0dd9978ac33bebb4a300bfd1ed1b181fbc75894\": rpc error: code = NotFound desc = could not find container \"6b1bdb2eba304054b9811185f0dd9978ac33bebb4a300bfd1ed1b181fbc75894\": container with ID starting with 6b1bdb2eba304054b9811185f0dd9978ac33bebb4a300bfd1ed1b181fbc75894 not found: ID does not exist"
Dec 05 20:46:43 crc kubenswrapper[4904]: I1205 20:46:43.295887 4904 scope.go:117] "RemoveContainer" containerID="a061985c538a90eb2307ec8452884255ed9f183f4fb0478047653f639f3daa0b"
Dec 05 20:46:43 crc kubenswrapper[4904]: E1205 20:46:43.296238 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a061985c538a90eb2307ec8452884255ed9f183f4fb0478047653f639f3daa0b\": container with ID starting with a061985c538a90eb2307ec8452884255ed9f183f4fb0478047653f639f3daa0b not found: ID does not exist" containerID="a061985c538a90eb2307ec8452884255ed9f183f4fb0478047653f639f3daa0b"
Dec 05 20:46:43 crc kubenswrapper[4904]: I1205 20:46:43.296275 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a061985c538a90eb2307ec8452884255ed9f183f4fb0478047653f639f3daa0b"} err="failed to get container status \"a061985c538a90eb2307ec8452884255ed9f183f4fb0478047653f639f3daa0b\": rpc error: code = NotFound desc = could not find container \"a061985c538a90eb2307ec8452884255ed9f183f4fb0478047653f639f3daa0b\": container with ID starting with a061985c538a90eb2307ec8452884255ed9f183f4fb0478047653f639f3daa0b not found: ID does not exist"
Dec 05 20:46:43 crc kubenswrapper[4904]: I1205 20:46:43.296303 4904 scope.go:117] "RemoveContainer" containerID="738f912d377b1a63ea64362bdf28f895231fa94b65da2bf4e4f71a397249eeef"
containerID="738f912d377b1a63ea64362bdf28f895231fa94b65da2bf4e4f71a397249eeef" Dec 05 20:46:43 crc kubenswrapper[4904]: E1205 20:46:43.296548 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"738f912d377b1a63ea64362bdf28f895231fa94b65da2bf4e4f71a397249eeef\": container with ID starting with 738f912d377b1a63ea64362bdf28f895231fa94b65da2bf4e4f71a397249eeef not found: ID does not exist" containerID="738f912d377b1a63ea64362bdf28f895231fa94b65da2bf4e4f71a397249eeef" Dec 05 20:46:43 crc kubenswrapper[4904]: I1205 20:46:43.296568 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"738f912d377b1a63ea64362bdf28f895231fa94b65da2bf4e4f71a397249eeef"} err="failed to get container status \"738f912d377b1a63ea64362bdf28f895231fa94b65da2bf4e4f71a397249eeef\": rpc error: code = NotFound desc = could not find container \"738f912d377b1a63ea64362bdf28f895231fa94b65da2bf4e4f71a397249eeef\": container with ID starting with 738f912d377b1a63ea64362bdf28f895231fa94b65da2bf4e4f71a397249eeef not found: ID does not exist" Dec 05 20:46:43 crc kubenswrapper[4904]: I1205 20:46:43.695050 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3" path="/var/lib/kubelet/pods/4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3/volumes" Dec 05 20:46:44 crc kubenswrapper[4904]: I1205 20:46:44.603780 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7dbn4" Dec 05 20:46:44 crc kubenswrapper[4904]: I1205 20:46:44.802349 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntlhp\" (UniqueName: \"kubernetes.io/projected/88dfb504-1f6c-41bc-860e-84eeb0a7fff9-kube-api-access-ntlhp\") pod \"88dfb504-1f6c-41bc-860e-84eeb0a7fff9\" (UID: \"88dfb504-1f6c-41bc-860e-84eeb0a7fff9\") " Dec 05 20:46:44 crc kubenswrapper[4904]: I1205 20:46:44.802488 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/88dfb504-1f6c-41bc-860e-84eeb0a7fff9-ssh-key\") pod \"88dfb504-1f6c-41bc-860e-84eeb0a7fff9\" (UID: \"88dfb504-1f6c-41bc-860e-84eeb0a7fff9\") " Dec 05 20:46:44 crc kubenswrapper[4904]: I1205 20:46:44.802554 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88dfb504-1f6c-41bc-860e-84eeb0a7fff9-inventory\") pod \"88dfb504-1f6c-41bc-860e-84eeb0a7fff9\" (UID: \"88dfb504-1f6c-41bc-860e-84eeb0a7fff9\") " Dec 05 20:46:44 crc kubenswrapper[4904]: I1205 20:46:44.807144 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88dfb504-1f6c-41bc-860e-84eeb0a7fff9-kube-api-access-ntlhp" (OuterVolumeSpecName: "kube-api-access-ntlhp") pod "88dfb504-1f6c-41bc-860e-84eeb0a7fff9" (UID: "88dfb504-1f6c-41bc-860e-84eeb0a7fff9"). InnerVolumeSpecName "kube-api-access-ntlhp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:46:44 crc kubenswrapper[4904]: I1205 20:46:44.835186 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88dfb504-1f6c-41bc-860e-84eeb0a7fff9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "88dfb504-1f6c-41bc-860e-84eeb0a7fff9" (UID: "88dfb504-1f6c-41bc-860e-84eeb0a7fff9"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:46:44 crc kubenswrapper[4904]: I1205 20:46:44.837572 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88dfb504-1f6c-41bc-860e-84eeb0a7fff9-inventory" (OuterVolumeSpecName: "inventory") pod "88dfb504-1f6c-41bc-860e-84eeb0a7fff9" (UID: "88dfb504-1f6c-41bc-860e-84eeb0a7fff9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:46:44 crc kubenswrapper[4904]: I1205 20:46:44.905026 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntlhp\" (UniqueName: \"kubernetes.io/projected/88dfb504-1f6c-41bc-860e-84eeb0a7fff9-kube-api-access-ntlhp\") on node \"crc\" DevicePath \"\"" Dec 05 20:46:44 crc kubenswrapper[4904]: I1205 20:46:44.905107 4904 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/88dfb504-1f6c-41bc-860e-84eeb0a7fff9-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 20:46:44 crc kubenswrapper[4904]: I1205 20:46:44.905123 4904 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88dfb504-1f6c-41bc-860e-84eeb0a7fff9-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 20:46:45 crc kubenswrapper[4904]: I1205 20:46:45.196689 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7dbn4" event={"ID":"88dfb504-1f6c-41bc-860e-84eeb0a7fff9","Type":"ContainerDied","Data":"6a33b801b7242ff2b01a1d57fe76c299a64b8bdbea667ca1f6f4562a4fcf36d8"} Dec 05 20:46:45 crc kubenswrapper[4904]: I1205 20:46:45.196980 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a33b801b7242ff2b01a1d57fe76c299a64b8bdbea667ca1f6f4562a4fcf36d8" Dec 05 20:46:45 crc kubenswrapper[4904]: I1205 20:46:45.196730 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7dbn4" Dec 05 20:46:45 crc kubenswrapper[4904]: I1205 20:46:45.259581 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6tp6w"] Dec 05 20:46:45 crc kubenswrapper[4904]: E1205 20:46:45.259966 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3" containerName="registry-server" Dec 05 20:46:45 crc kubenswrapper[4904]: I1205 20:46:45.259982 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3" containerName="registry-server" Dec 05 20:46:45 crc kubenswrapper[4904]: E1205 20:46:45.259995 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3" containerName="extract-utilities" Dec 05 20:46:45 crc kubenswrapper[4904]: I1205 20:46:45.260001 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3" containerName="extract-utilities" Dec 05 20:46:45 crc kubenswrapper[4904]: E1205 20:46:45.260010 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3" containerName="extract-content" Dec 05 20:46:45 crc kubenswrapper[4904]: I1205 20:46:45.260016 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3" containerName="extract-content" Dec 05 20:46:45 crc kubenswrapper[4904]: E1205 20:46:45.260038 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88dfb504-1f6c-41bc-860e-84eeb0a7fff9" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 05 20:46:45 crc kubenswrapper[4904]: I1205 20:46:45.260044 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="88dfb504-1f6c-41bc-860e-84eeb0a7fff9" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 05 20:46:45 crc kubenswrapper[4904]: I1205 20:46:45.260234 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3" containerName="registry-server" Dec 05 20:46:45 crc kubenswrapper[4904]: I1205 20:46:45.260250 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="88dfb504-1f6c-41bc-860e-84eeb0a7fff9" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 05 20:46:45 crc kubenswrapper[4904]: I1205 20:46:45.260848 4904 util.go:30] "No sandbox for pod can be found. 
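The cpu_manager/memory_manager lines above show stale-state cleanup running at pod admission: resource assignments recorded for containers of pods the kubelet no longer tracks (here the deleted redhat-operators-lznph catalog pod and the finished run-os job) are dropped before the new reboot-os pod is admitted. A map-based sketch of the pattern, illustrative only and not kubelet source:

```go
// RemoveStaleState-style cleanup: walk the in-memory per-container
// assignments and delete every entry whose pod is no longer active.
package main

import "fmt"

type key struct{ podUID, container string }

func removeStaleState(assignments map[key][]int, activePods map[string]bool) {
	for k := range assignments {
		if !activePods[k.podUID] {
			fmt.Printf("RemoveStaleState: removing container %q of pod %s\n", k.container, k.podUID)
			delete(assignments, k) // corresponds to "Deleted CPUSet assignment"
		}
	}
}

func main() {
	state := map[key][]int{
		{"4b6b7b8c-6c64-4b60-b6a5-def3e7cbe4e3", "registry-server"}: {2, 3},
	}
	// The terminated pod is absent from the active set, so its entry goes.
	removeStaleState(state, map[string]bool{})
}
```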
Dec 05 20:46:45 crc kubenswrapper[4904]: I1205 20:46:45.263133 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 05 20:46:45 crc kubenswrapper[4904]: I1205 20:46:45.263323 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 05 20:46:45 crc kubenswrapper[4904]: I1205 20:46:45.263572 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 05 20:46:45 crc kubenswrapper[4904]: I1205 20:46:45.264036 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pf9pw"
Dec 05 20:46:45 crc kubenswrapper[4904]: I1205 20:46:45.269639 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6tp6w"]
Dec 05 20:46:45 crc kubenswrapper[4904]: I1205 20:46:45.415529 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/222edc46-4ddd-4236-8635-45b365513214-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6tp6w\" (UID: \"222edc46-4ddd-4236-8635-45b365513214\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6tp6w"
Dec 05 20:46:45 crc kubenswrapper[4904]: I1205 20:46:45.415619 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5gqx\" (UniqueName: \"kubernetes.io/projected/222edc46-4ddd-4236-8635-45b365513214-kube-api-access-p5gqx\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6tp6w\" (UID: \"222edc46-4ddd-4236-8635-45b365513214\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6tp6w"
Dec 05 20:46:45 crc kubenswrapper[4904]: I1205 20:46:45.415841 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/222edc46-4ddd-4236-8635-45b365513214-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6tp6w\" (UID: \"222edc46-4ddd-4236-8635-45b365513214\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6tp6w"
Dec 05 20:46:45 crc kubenswrapper[4904]: I1205 20:46:45.518355 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/222edc46-4ddd-4236-8635-45b365513214-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6tp6w\" (UID: \"222edc46-4ddd-4236-8635-45b365513214\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6tp6w"
Dec 05 20:46:45 crc kubenswrapper[4904]: I1205 20:46:45.518497 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/222edc46-4ddd-4236-8635-45b365513214-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6tp6w\" (UID: \"222edc46-4ddd-4236-8635-45b365513214\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6tp6w"
Dec 05 20:46:45 crc kubenswrapper[4904]: I1205 20:46:45.518552 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5gqx\" (UniqueName: \"kubernetes.io/projected/222edc46-4ddd-4236-8635-45b365513214-kube-api-access-p5gqx\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6tp6w\" (UID: \"222edc46-4ddd-4236-8635-45b365513214\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6tp6w"
Dec 05 20:46:45 crc kubenswrapper[4904]: I1205 20:46:45.524891 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/222edc46-4ddd-4236-8635-45b365513214-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6tp6w\" (UID: \"222edc46-4ddd-4236-8635-45b365513214\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6tp6w"
Dec 05 20:46:45 crc kubenswrapper[4904]: I1205 20:46:45.527167 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/222edc46-4ddd-4236-8635-45b365513214-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6tp6w\" (UID: \"222edc46-4ddd-4236-8635-45b365513214\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6tp6w"
Dec 05 20:46:45 crc kubenswrapper[4904]: I1205 20:46:45.537001 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5gqx\" (UniqueName: \"kubernetes.io/projected/222edc46-4ddd-4236-8635-45b365513214-kube-api-access-p5gqx\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6tp6w\" (UID: \"222edc46-4ddd-4236-8635-45b365513214\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6tp6w"
Dec 05 20:46:45 crc kubenswrapper[4904]: I1205 20:46:45.634236 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6tp6w"
Dec 05 20:46:46 crc kubenswrapper[4904]: I1205 20:46:46.155732 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6tp6w"]
Dec 05 20:46:46 crc kubenswrapper[4904]: W1205 20:46:46.166493 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod222edc46_4ddd_4236_8635_45b365513214.slice/crio-ce370794f2fe879ab0b63f103ef15961b3f66306af5daf215e816b274529e2ba WatchSource:0}: Error finding container ce370794f2fe879ab0b63f103ef15961b3f66306af5daf215e816b274529e2ba: Status 404 returned error can't find the container with id ce370794f2fe879ab0b63f103ef15961b3f66306af5daf215e816b274529e2ba
Dec 05 20:46:46 crc kubenswrapper[4904]: I1205 20:46:46.210685 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6tp6w" event={"ID":"222edc46-4ddd-4236-8635-45b365513214","Type":"ContainerStarted","Data":"ce370794f2fe879ab0b63f103ef15961b3f66306af5daf215e816b274529e2ba"}
Dec 05 20:46:47 crc kubenswrapper[4904]: I1205 20:46:47.222680 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6tp6w" event={"ID":"222edc46-4ddd-4236-8635-45b365513214","Type":"ContainerStarted","Data":"b1b0b865cbe151815d04a5a9d2768d0480dd8484182bc9ea1bb1dfc9ff6b565b"}
Dec 05 20:46:47 crc kubenswrapper[4904]: I1205 20:46:47.249268 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6tp6w" podStartSLOduration=1.83451148 podStartE2EDuration="2.249246939s" podCreationTimestamp="2025-12-05 20:46:45 +0000 UTC" firstStartedPulling="2025-12-05 20:46:46.170964751 +0000 UTC m=+2104.982180860" lastFinishedPulling="2025-12-05 20:46:46.58570021 +0000 UTC m=+2105.396916319" observedRunningTime="2025-12-05 20:46:47.238387193 +0000 UTC m=+2106.049603342" watchObservedRunningTime="2025-12-05 20:46:47.249246939 +0000 UTC m=+2106.060463078"
Dec 05 20:46:57 crc kubenswrapper[4904]: I1205 20:46:57.318846 4904 generic.go:334] "Generic (PLEG): container finished" podID="222edc46-4ddd-4236-8635-45b365513214" containerID="b1b0b865cbe151815d04a5a9d2768d0480dd8484182bc9ea1bb1dfc9ff6b565b" exitCode=0
Dec 05 20:46:57 crc kubenswrapper[4904]: I1205 20:46:57.318908 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6tp6w" event={"ID":"222edc46-4ddd-4236-8635-45b365513214","Type":"ContainerDied","Data":"b1b0b865cbe151815d04a5a9d2768d0480dd8484182bc9ea1bb1dfc9ff6b565b"}
Dec 05 20:46:58 crc kubenswrapper[4904]: I1205 20:46:58.756416 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6tp6w"
Dec 05 20:46:58 crc kubenswrapper[4904]: I1205 20:46:58.935450 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5gqx\" (UniqueName: \"kubernetes.io/projected/222edc46-4ddd-4236-8635-45b365513214-kube-api-access-p5gqx\") pod \"222edc46-4ddd-4236-8635-45b365513214\" (UID: \"222edc46-4ddd-4236-8635-45b365513214\") "
Dec 05 20:46:58 crc kubenswrapper[4904]: I1205 20:46:58.935609 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/222edc46-4ddd-4236-8635-45b365513214-inventory\") pod \"222edc46-4ddd-4236-8635-45b365513214\" (UID: \"222edc46-4ddd-4236-8635-45b365513214\") "
Dec 05 20:46:58 crc kubenswrapper[4904]: I1205 20:46:58.935642 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/222edc46-4ddd-4236-8635-45b365513214-ssh-key\") pod \"222edc46-4ddd-4236-8635-45b365513214\" (UID: \"222edc46-4ddd-4236-8635-45b365513214\") "
Dec 05 20:46:58 crc kubenswrapper[4904]: I1205 20:46:58.941939 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/222edc46-4ddd-4236-8635-45b365513214-kube-api-access-p5gqx" (OuterVolumeSpecName: "kube-api-access-p5gqx") pod "222edc46-4ddd-4236-8635-45b365513214" (UID: "222edc46-4ddd-4236-8635-45b365513214"). InnerVolumeSpecName "kube-api-access-p5gqx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:46:58 crc kubenswrapper[4904]: I1205 20:46:58.965385 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/222edc46-4ddd-4236-8635-45b365513214-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "222edc46-4ddd-4236-8635-45b365513214" (UID: "222edc46-4ddd-4236-8635-45b365513214"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:46:58 crc kubenswrapper[4904]: I1205 20:46:58.975988 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/222edc46-4ddd-4236-8635-45b365513214-inventory" (OuterVolumeSpecName: "inventory") pod "222edc46-4ddd-4236-8635-45b365513214" (UID: "222edc46-4ddd-4236-8635-45b365513214"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.037535 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5gqx\" (UniqueName: \"kubernetes.io/projected/222edc46-4ddd-4236-8635-45b365513214-kube-api-access-p5gqx\") on node \"crc\" DevicePath \"\""
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.037568 4904 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/222edc46-4ddd-4236-8635-45b365513214-inventory\") on node \"crc\" DevicePath \"\""
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.037577 4904 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/222edc46-4ddd-4236-8635-45b365513214-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.338222 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6tp6w" event={"ID":"222edc46-4ddd-4236-8635-45b365513214","Type":"ContainerDied","Data":"ce370794f2fe879ab0b63f103ef15961b3f66306af5daf215e816b274529e2ba"}
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.338553 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce370794f2fe879ab0b63f103ef15961b3f66306af5daf215e816b274529e2ba"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.338301 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6tp6w"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.450796 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8"]
Dec 05 20:46:59 crc kubenswrapper[4904]: E1205 20:46:59.451239 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="222edc46-4ddd-4236-8635-45b365513214" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.451256 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="222edc46-4ddd-4236-8635-45b365513214" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.451459 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="222edc46-4ddd-4236-8635-45b365513214" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.452128 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.458049 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.458248 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.458354 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.458361 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.458475 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.458510 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pf9pw"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.458577 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.458590 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.463700 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8"]
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.649700 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-28gf8\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.649843 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-28gf8\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.649915 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-28gf8\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.649958 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-28gf8\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.650047 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-28gf8\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.650120 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrrzx\" (UniqueName: \"kubernetes.io/projected/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-kube-api-access-xrrzx\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-28gf8\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.650281 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-28gf8\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.650358 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-28gf8\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.650964 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-28gf8\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.651042 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-28gf8\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.651101 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-28gf8\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.651122 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-28gf8\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.651152 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-28gf8\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.651175 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-28gf8\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.753478 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-28gf8\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.753571 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-28gf8\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.753620 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-28gf8\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.753714 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-28gf8\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.753772 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-28gf8\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.753822 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-28gf8\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.753875 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-28gf8\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.753905 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-28gf8\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.753976 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-28gf8\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.754023 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-28gf8\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.754090 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-28gf8\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.754131 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-28gf8\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.754170 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-28gf8\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.754216 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrrzx\" (UniqueName: \"kubernetes.io/projected/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-kube-api-access-xrrzx\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-28gf8\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.759936 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-28gf8\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.759944 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-28gf8\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.761917 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-28gf8\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.762964 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-28gf8\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.763109 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-28gf8\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.763785 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-28gf8\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.764334 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-28gf8\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.764346 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-28gf8\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.764569 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-28gf8\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.764834 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-28gf8\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.764943 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-28gf8\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.765899 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-28gf8\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8"
Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.769123 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-28gf8\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8"
\"kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-28gf8\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8" Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.776098 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrrzx\" (UniqueName: \"kubernetes.io/projected/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-kube-api-access-xrrzx\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-28gf8\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8" Dec 05 20:46:59 crc kubenswrapper[4904]: I1205 20:46:59.817176 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8" Dec 05 20:47:00 crc kubenswrapper[4904]: I1205 20:47:00.351300 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8"] Dec 05 20:47:01 crc kubenswrapper[4904]: I1205 20:47:01.360792 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8" event={"ID":"9edc6269-cd10-4724-9cf3-9b65c80ab8d9","Type":"ContainerStarted","Data":"12d3ea2be0187f8253eb08823f9aa8aca0e1a6df9a303c5d9638cb940b950602"} Dec 05 20:47:01 crc kubenswrapper[4904]: I1205 20:47:01.361433 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8" event={"ID":"9edc6269-cd10-4724-9cf3-9b65c80ab8d9","Type":"ContainerStarted","Data":"5f62f42b8907c0b51b185dc40d34d48b26d40be227bb8897e28bc7576ec5a326"} Dec 05 20:47:01 crc kubenswrapper[4904]: I1205 20:47:01.397959 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8" podStartSLOduration=2.032866226 podStartE2EDuration="2.397932772s" podCreationTimestamp="2025-12-05 20:46:59 +0000 UTC" firstStartedPulling="2025-12-05 20:47:00.365240956 +0000 UTC m=+2119.176457075" lastFinishedPulling="2025-12-05 20:47:00.730307512 +0000 UTC m=+2119.541523621" observedRunningTime="2025-12-05 20:47:01.383616362 +0000 UTC m=+2120.194832471" watchObservedRunningTime="2025-12-05 20:47:01.397932772 +0000 UTC m=+2120.209148911" Dec 05 20:47:29 crc kubenswrapper[4904]: I1205 20:47:29.956497 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:47:29 crc kubenswrapper[4904]: I1205 20:47:29.957329 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:47:42 crc kubenswrapper[4904]: I1205 20:47:42.774106 4904 generic.go:334] "Generic (PLEG): container finished" podID="9edc6269-cd10-4724-9cf3-9b65c80ab8d9" containerID="12d3ea2be0187f8253eb08823f9aa8aca0e1a6df9a303c5d9638cb940b950602" exitCode=0 Dec 05 20:47:42 crc kubenswrapper[4904]: I1205 20:47:42.774200 4904 kubelet.go:2453] "SyncLoop (PLEG): 
Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.268992 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8"
Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.441459 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-ssh-key\") pod \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") "
Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.441577 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-telemetry-combined-ca-bundle\") pod \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") "
Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.441652 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-bootstrap-combined-ca-bundle\") pod \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") "
Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.441739 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-nova-combined-ca-bundle\") pod \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") "
Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.441831 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-neutron-metadata-combined-ca-bundle\") pod \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") "
Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.441914 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-inventory\") pod \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") "
Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.441970 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrrzx\" (UniqueName: \"kubernetes.io/projected/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-kube-api-access-xrrzx\") pod \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") "
Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.442044 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") "
Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.442104 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") "
Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.442136 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-libvirt-combined-ca-bundle\") pod \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") "
Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.442183 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") "
Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.442265 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-repo-setup-combined-ca-bundle\") pod \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") "
Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.442348 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-openstack-edpm-ipam-ovn-default-certs-0\") pod \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") "
Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.442422 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-ovn-combined-ca-bundle\") pod \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\" (UID: \"9edc6269-cd10-4724-9cf3-9b65c80ab8d9\") "
Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.449582 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "9edc6269-cd10-4724-9cf3-9b65c80ab8d9" (UID: "9edc6269-cd10-4724-9cf3-9b65c80ab8d9"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.451776 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "9edc6269-cd10-4724-9cf3-9b65c80ab8d9" (UID: "9edc6269-cd10-4724-9cf3-9b65c80ab8d9"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.452967 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "9edc6269-cd10-4724-9cf3-9b65c80ab8d9" (UID: "9edc6269-cd10-4724-9cf3-9b65c80ab8d9"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.453099 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "9edc6269-cd10-4724-9cf3-9b65c80ab8d9" (UID: "9edc6269-cd10-4724-9cf3-9b65c80ab8d9"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.453192 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "9edc6269-cd10-4724-9cf3-9b65c80ab8d9" (UID: "9edc6269-cd10-4724-9cf3-9b65c80ab8d9"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.454177 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-kube-api-access-xrrzx" (OuterVolumeSpecName: "kube-api-access-xrrzx") pod "9edc6269-cd10-4724-9cf3-9b65c80ab8d9" (UID: "9edc6269-cd10-4724-9cf3-9b65c80ab8d9"). InnerVolumeSpecName "kube-api-access-xrrzx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.455037 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "9edc6269-cd10-4724-9cf3-9b65c80ab8d9" (UID: "9edc6269-cd10-4724-9cf3-9b65c80ab8d9"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.455667 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "9edc6269-cd10-4724-9cf3-9b65c80ab8d9" (UID: "9edc6269-cd10-4724-9cf3-9b65c80ab8d9"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.456295 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "9edc6269-cd10-4724-9cf3-9b65c80ab8d9" (UID: "9edc6269-cd10-4724-9cf3-9b65c80ab8d9"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.458457 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "9edc6269-cd10-4724-9cf3-9b65c80ab8d9" (UID: "9edc6269-cd10-4724-9cf3-9b65c80ab8d9"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.459033 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "9edc6269-cd10-4724-9cf3-9b65c80ab8d9" (UID: "9edc6269-cd10-4724-9cf3-9b65c80ab8d9"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.464293 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "9edc6269-cd10-4724-9cf3-9b65c80ab8d9" (UID: "9edc6269-cd10-4724-9cf3-9b65c80ab8d9"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.485023 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9edc6269-cd10-4724-9cf3-9b65c80ab8d9" (UID: "9edc6269-cd10-4724-9cf3-9b65c80ab8d9"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.486417 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-inventory" (OuterVolumeSpecName: "inventory") pod "9edc6269-cd10-4724-9cf3-9b65c80ab8d9" (UID: "9edc6269-cd10-4724-9cf3-9b65c80ab8d9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.562239 4904 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.562308 4904 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.562328 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrrzx\" (UniqueName: \"kubernetes.io/projected/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-kube-api-access-xrrzx\") on node \"crc\" DevicePath \"\"" Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.562344 4904 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.562356 4904 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.562366 4904 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.562380 4904 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.562391 4904 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.562401 4904 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.562414 4904 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.562424 4904 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.562433 4904 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.562442 4904 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.562453 4904 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edc6269-cd10-4724-9cf3-9b65c80ab8d9-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.795765 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8" event={"ID":"9edc6269-cd10-4724-9cf3-9b65c80ab8d9","Type":"ContainerDied","Data":"5f62f42b8907c0b51b185dc40d34d48b26d40be227bb8897e28bc7576ec5a326"} Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.795805 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f62f42b8907c0b51b185dc40d34d48b26d40be227bb8897e28bc7576ec5a326" Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.795841 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-28gf8" Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.900258 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-4qvrt"] Dec 05 20:47:44 crc kubenswrapper[4904]: E1205 20:47:44.900636 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9edc6269-cd10-4724-9cf3-9b65c80ab8d9" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.900654 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="9edc6269-cd10-4724-9cf3-9b65c80ab8d9" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.900860 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="9edc6269-cd10-4724-9cf3-9b65c80ab8d9" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.902358 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4qvrt" Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.907458 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.907481 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.907817 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.909781 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pf9pw" Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.910815 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 20:47:44 crc kubenswrapper[4904]: I1205 20:47:44.924920 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-4qvrt"] Dec 05 20:47:45 crc kubenswrapper[4904]: I1205 20:47:45.071961 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e05ac5cc-a4c0-46c3-9beb-3f607156b962-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4qvrt\" (UID: \"e05ac5cc-a4c0-46c3-9beb-3f607156b962\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4qvrt" Dec 05 20:47:45 crc kubenswrapper[4904]: I1205 20:47:45.072284 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e05ac5cc-a4c0-46c3-9beb-3f607156b962-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4qvrt\" (UID: \"e05ac5cc-a4c0-46c3-9beb-3f607156b962\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4qvrt" Dec 05 20:47:45 crc kubenswrapper[4904]: I1205 20:47:45.072485 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e05ac5cc-a4c0-46c3-9beb-3f607156b962-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4qvrt\" (UID: \"e05ac5cc-a4c0-46c3-9beb-3f607156b962\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4qvrt" Dec 05 20:47:45 crc kubenswrapper[4904]: I1205 20:47:45.072655 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e05ac5cc-a4c0-46c3-9beb-3f607156b962-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4qvrt\" (UID: \"e05ac5cc-a4c0-46c3-9beb-3f607156b962\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4qvrt" Dec 05 20:47:45 crc kubenswrapper[4904]: I1205 20:47:45.072714 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4l65\" (UniqueName: \"kubernetes.io/projected/e05ac5cc-a4c0-46c3-9beb-3f607156b962-kube-api-access-x4l65\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4qvrt\" (UID: \"e05ac5cc-a4c0-46c3-9beb-3f607156b962\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4qvrt" Dec 05 20:47:45 crc kubenswrapper[4904]: I1205 20:47:45.175165 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/e05ac5cc-a4c0-46c3-9beb-3f607156b962-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4qvrt\" (UID: \"e05ac5cc-a4c0-46c3-9beb-3f607156b962\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4qvrt" Dec 05 20:47:45 crc kubenswrapper[4904]: I1205 20:47:45.175240 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e05ac5cc-a4c0-46c3-9beb-3f607156b962-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4qvrt\" (UID: \"e05ac5cc-a4c0-46c3-9beb-3f607156b962\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4qvrt" Dec 05 20:47:45 crc kubenswrapper[4904]: I1205 20:47:45.175281 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e05ac5cc-a4c0-46c3-9beb-3f607156b962-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4qvrt\" (UID: \"e05ac5cc-a4c0-46c3-9beb-3f607156b962\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4qvrt" Dec 05 20:47:45 crc kubenswrapper[4904]: I1205 20:47:45.175324 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e05ac5cc-a4c0-46c3-9beb-3f607156b962-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4qvrt\" (UID: \"e05ac5cc-a4c0-46c3-9beb-3f607156b962\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4qvrt" Dec 05 20:47:45 crc kubenswrapper[4904]: I1205 20:47:45.175347 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4l65\" (UniqueName: \"kubernetes.io/projected/e05ac5cc-a4c0-46c3-9beb-3f607156b962-kube-api-access-x4l65\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4qvrt\" (UID: \"e05ac5cc-a4c0-46c3-9beb-3f607156b962\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4qvrt" Dec 05 20:47:45 crc kubenswrapper[4904]: I1205 20:47:45.176768 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e05ac5cc-a4c0-46c3-9beb-3f607156b962-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4qvrt\" (UID: \"e05ac5cc-a4c0-46c3-9beb-3f607156b962\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4qvrt" Dec 05 20:47:45 crc kubenswrapper[4904]: I1205 20:47:45.179087 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e05ac5cc-a4c0-46c3-9beb-3f607156b962-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4qvrt\" (UID: \"e05ac5cc-a4c0-46c3-9beb-3f607156b962\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4qvrt" Dec 05 20:47:45 crc kubenswrapper[4904]: I1205 20:47:45.179360 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e05ac5cc-a4c0-46c3-9beb-3f607156b962-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4qvrt\" (UID: \"e05ac5cc-a4c0-46c3-9beb-3f607156b962\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4qvrt" Dec 05 20:47:45 crc kubenswrapper[4904]: I1205 20:47:45.189321 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4l65\" (UniqueName: \"kubernetes.io/projected/e05ac5cc-a4c0-46c3-9beb-3f607156b962-kube-api-access-x4l65\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4qvrt\" (UID: 
\"e05ac5cc-a4c0-46c3-9beb-3f607156b962\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4qvrt" Dec 05 20:47:45 crc kubenswrapper[4904]: I1205 20:47:45.191734 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e05ac5cc-a4c0-46c3-9beb-3f607156b962-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4qvrt\" (UID: \"e05ac5cc-a4c0-46c3-9beb-3f607156b962\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4qvrt" Dec 05 20:47:45 crc kubenswrapper[4904]: I1205 20:47:45.224603 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4qvrt" Dec 05 20:47:45 crc kubenswrapper[4904]: I1205 20:47:45.803673 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-4qvrt"] Dec 05 20:47:46 crc kubenswrapper[4904]: I1205 20:47:46.815191 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4qvrt" event={"ID":"e05ac5cc-a4c0-46c3-9beb-3f607156b962","Type":"ContainerStarted","Data":"bfa388e1568b4af8bdddd672b6839add871ef10542774ee61c6ec0f4a592cb7c"} Dec 05 20:47:46 crc kubenswrapper[4904]: I1205 20:47:46.815567 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4qvrt" event={"ID":"e05ac5cc-a4c0-46c3-9beb-3f607156b962","Type":"ContainerStarted","Data":"8a7c96c29a0d67322dcc1b23582ca613ff46b5c78b3579c46f09f7701643cfae"} Dec 05 20:47:46 crc kubenswrapper[4904]: I1205 20:47:46.836655 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4qvrt" podStartSLOduration=2.401215164 podStartE2EDuration="2.836635514s" podCreationTimestamp="2025-12-05 20:47:44 +0000 UTC" firstStartedPulling="2025-12-05 20:47:45.807888505 +0000 UTC m=+2164.619104614" lastFinishedPulling="2025-12-05 20:47:46.243308845 +0000 UTC m=+2165.054524964" observedRunningTime="2025-12-05 20:47:46.833577701 +0000 UTC m=+2165.644793830" watchObservedRunningTime="2025-12-05 20:47:46.836635514 +0000 UTC m=+2165.647851633" Dec 05 20:47:59 crc kubenswrapper[4904]: I1205 20:47:59.955992 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:47:59 crc kubenswrapper[4904]: I1205 20:47:59.956609 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:48:29 crc kubenswrapper[4904]: I1205 20:48:29.955714 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:48:29 crc kubenswrapper[4904]: I1205 20:48:29.956349 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:48:29 crc kubenswrapper[4904]: I1205 20:48:29.956410 4904 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" Dec 05 20:48:29 crc kubenswrapper[4904]: I1205 20:48:29.957375 4904 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bb0e1451e9627bb70e29cee9dabc1d09d0d2166e98c624a8840a2d2703a502b5"} pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 20:48:29 crc kubenswrapper[4904]: I1205 20:48:29.957456 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" containerID="cri-o://bb0e1451e9627bb70e29cee9dabc1d09d0d2166e98c624a8840a2d2703a502b5" gracePeriod=600 Dec 05 20:48:30 crc kubenswrapper[4904]: I1205 20:48:30.234822 4904 generic.go:334] "Generic (PLEG): container finished" podID="1cc24b64-e25f-4b55-9123-295388685e7a" containerID="bb0e1451e9627bb70e29cee9dabc1d09d0d2166e98c624a8840a2d2703a502b5" exitCode=0 Dec 05 20:48:30 crc kubenswrapper[4904]: I1205 20:48:30.234866 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" event={"ID":"1cc24b64-e25f-4b55-9123-295388685e7a","Type":"ContainerDied","Data":"bb0e1451e9627bb70e29cee9dabc1d09d0d2166e98c624a8840a2d2703a502b5"} Dec 05 20:48:30 crc kubenswrapper[4904]: I1205 20:48:30.234912 4904 scope.go:117] "RemoveContainer" containerID="dfcab6d899b1212bd971ed9361e63cd19363ddee7d6637335cdbac95f7926ba1" Dec 05 20:48:31 crc kubenswrapper[4904]: I1205 20:48:31.246148 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" event={"ID":"1cc24b64-e25f-4b55-9123-295388685e7a","Type":"ContainerStarted","Data":"3ff9529892815f6974fa95e3944e54016af616270f5e88c8ba81e4e82b52d881"} Dec 05 20:48:57 crc kubenswrapper[4904]: I1205 20:48:57.482083 4904 generic.go:334] "Generic (PLEG): container finished" podID="e05ac5cc-a4c0-46c3-9beb-3f607156b962" containerID="bfa388e1568b4af8bdddd672b6839add871ef10542774ee61c6ec0f4a592cb7c" exitCode=0 Dec 05 20:48:57 crc kubenswrapper[4904]: I1205 20:48:57.482206 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4qvrt" event={"ID":"e05ac5cc-a4c0-46c3-9beb-3f607156b962","Type":"ContainerDied","Data":"bfa388e1568b4af8bdddd672b6839add871ef10542774ee61c6ec0f4a592cb7c"} Dec 05 20:48:58 crc kubenswrapper[4904]: I1205 20:48:58.943415 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4qvrt" Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.037172 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e05ac5cc-a4c0-46c3-9beb-3f607156b962-ovncontroller-config-0\") pod \"e05ac5cc-a4c0-46c3-9beb-3f607156b962\" (UID: \"e05ac5cc-a4c0-46c3-9beb-3f607156b962\") " Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.037209 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e05ac5cc-a4c0-46c3-9beb-3f607156b962-ovn-combined-ca-bundle\") pod \"e05ac5cc-a4c0-46c3-9beb-3f607156b962\" (UID: \"e05ac5cc-a4c0-46c3-9beb-3f607156b962\") " Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.037233 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e05ac5cc-a4c0-46c3-9beb-3f607156b962-inventory\") pod \"e05ac5cc-a4c0-46c3-9beb-3f607156b962\" (UID: \"e05ac5cc-a4c0-46c3-9beb-3f607156b962\") " Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.037250 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4l65\" (UniqueName: \"kubernetes.io/projected/e05ac5cc-a4c0-46c3-9beb-3f607156b962-kube-api-access-x4l65\") pod \"e05ac5cc-a4c0-46c3-9beb-3f607156b962\" (UID: \"e05ac5cc-a4c0-46c3-9beb-3f607156b962\") " Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.037524 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e05ac5cc-a4c0-46c3-9beb-3f607156b962-ssh-key\") pod \"e05ac5cc-a4c0-46c3-9beb-3f607156b962\" (UID: \"e05ac5cc-a4c0-46c3-9beb-3f607156b962\") " Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.044005 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e05ac5cc-a4c0-46c3-9beb-3f607156b962-kube-api-access-x4l65" (OuterVolumeSpecName: "kube-api-access-x4l65") pod "e05ac5cc-a4c0-46c3-9beb-3f607156b962" (UID: "e05ac5cc-a4c0-46c3-9beb-3f607156b962"). InnerVolumeSpecName "kube-api-access-x4l65". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.044623 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e05ac5cc-a4c0-46c3-9beb-3f607156b962-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "e05ac5cc-a4c0-46c3-9beb-3f607156b962" (UID: "e05ac5cc-a4c0-46c3-9beb-3f607156b962"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.064341 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e05ac5cc-a4c0-46c3-9beb-3f607156b962-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "e05ac5cc-a4c0-46c3-9beb-3f607156b962" (UID: "e05ac5cc-a4c0-46c3-9beb-3f607156b962"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.067827 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e05ac5cc-a4c0-46c3-9beb-3f607156b962-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e05ac5cc-a4c0-46c3-9beb-3f607156b962" (UID: "e05ac5cc-a4c0-46c3-9beb-3f607156b962"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.073050 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e05ac5cc-a4c0-46c3-9beb-3f607156b962-inventory" (OuterVolumeSpecName: "inventory") pod "e05ac5cc-a4c0-46c3-9beb-3f607156b962" (UID: "e05ac5cc-a4c0-46c3-9beb-3f607156b962"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.140436 4904 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e05ac5cc-a4c0-46c3-9beb-3f607156b962-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.140476 4904 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e05ac5cc-a4c0-46c3-9beb-3f607156b962-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.140494 4904 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e05ac5cc-a4c0-46c3-9beb-3f607156b962-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.140507 4904 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e05ac5cc-a4c0-46c3-9beb-3f607156b962-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.140519 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4l65\" (UniqueName: \"kubernetes.io/projected/e05ac5cc-a4c0-46c3-9beb-3f607156b962-kube-api-access-x4l65\") on node \"crc\" DevicePath \"\"" Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.500771 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4qvrt" event={"ID":"e05ac5cc-a4c0-46c3-9beb-3f607156b962","Type":"ContainerDied","Data":"8a7c96c29a0d67322dcc1b23582ca613ff46b5c78b3579c46f09f7701643cfae"} Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.500822 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a7c96c29a0d67322dcc1b23582ca613ff46b5c78b3579c46f09f7701643cfae" Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.500868 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4qvrt" Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.660421 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr"] Dec 05 20:48:59 crc kubenswrapper[4904]: E1205 20:48:59.660921 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e05ac5cc-a4c0-46c3-9beb-3f607156b962" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.660946 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="e05ac5cc-a4c0-46c3-9beb-3f607156b962" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.661198 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="e05ac5cc-a4c0-46c3-9beb-3f607156b962" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.662229 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr" Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.667883 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.668030 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.668287 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pf9pw" Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.668333 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.668401 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.668545 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.697746 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr"] Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.752161 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/aee65db4-60e1-4a83-80d0-81e90f6f4f07-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr\" (UID: \"aee65db4-60e1-4a83-80d0-81e90f6f4f07\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr" Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.752900 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tr8z\" (UniqueName: \"kubernetes.io/projected/aee65db4-60e1-4a83-80d0-81e90f6f4f07-kube-api-access-6tr8z\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr\" (UID: \"aee65db4-60e1-4a83-80d0-81e90f6f4f07\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr" Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.753601 4904 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aee65db4-60e1-4a83-80d0-81e90f6f4f07-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr\" (UID: \"aee65db4-60e1-4a83-80d0-81e90f6f4f07\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr" Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.753786 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/aee65db4-60e1-4a83-80d0-81e90f6f4f07-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr\" (UID: \"aee65db4-60e1-4a83-80d0-81e90f6f4f07\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr" Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.753939 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aee65db4-60e1-4a83-80d0-81e90f6f4f07-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr\" (UID: \"aee65db4-60e1-4a83-80d0-81e90f6f4f07\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr" Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.754157 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aee65db4-60e1-4a83-80d0-81e90f6f4f07-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr\" (UID: \"aee65db4-60e1-4a83-80d0-81e90f6f4f07\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr" Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.857266 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/aee65db4-60e1-4a83-80d0-81e90f6f4f07-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr\" (UID: \"aee65db4-60e1-4a83-80d0-81e90f6f4f07\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr" Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.857708 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aee65db4-60e1-4a83-80d0-81e90f6f4f07-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr\" (UID: \"aee65db4-60e1-4a83-80d0-81e90f6f4f07\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr" Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.857932 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aee65db4-60e1-4a83-80d0-81e90f6f4f07-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr\" (UID: \"aee65db4-60e1-4a83-80d0-81e90f6f4f07\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr" Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.858796 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/aee65db4-60e1-4a83-80d0-81e90f6f4f07-nova-metadata-neutron-config-0\") 
pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr\" (UID: \"aee65db4-60e1-4a83-80d0-81e90f6f4f07\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr" Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.859119 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tr8z\" (UniqueName: \"kubernetes.io/projected/aee65db4-60e1-4a83-80d0-81e90f6f4f07-kube-api-access-6tr8z\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr\" (UID: \"aee65db4-60e1-4a83-80d0-81e90f6f4f07\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr" Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.859296 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aee65db4-60e1-4a83-80d0-81e90f6f4f07-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr\" (UID: \"aee65db4-60e1-4a83-80d0-81e90f6f4f07\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr" Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.862478 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aee65db4-60e1-4a83-80d0-81e90f6f4f07-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr\" (UID: \"aee65db4-60e1-4a83-80d0-81e90f6f4f07\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr" Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.863550 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/aee65db4-60e1-4a83-80d0-81e90f6f4f07-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr\" (UID: \"aee65db4-60e1-4a83-80d0-81e90f6f4f07\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr" Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.866719 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/aee65db4-60e1-4a83-80d0-81e90f6f4f07-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr\" (UID: \"aee65db4-60e1-4a83-80d0-81e90f6f4f07\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr" Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.868087 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aee65db4-60e1-4a83-80d0-81e90f6f4f07-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr\" (UID: \"aee65db4-60e1-4a83-80d0-81e90f6f4f07\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr" Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.869249 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aee65db4-60e1-4a83-80d0-81e90f6f4f07-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr\" (UID: \"aee65db4-60e1-4a83-80d0-81e90f6f4f07\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr" Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.883710 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tr8z\" 
(UniqueName: \"kubernetes.io/projected/aee65db4-60e1-4a83-80d0-81e90f6f4f07-kube-api-access-6tr8z\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr\" (UID: \"aee65db4-60e1-4a83-80d0-81e90f6f4f07\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr" Dec 05 20:48:59 crc kubenswrapper[4904]: I1205 20:48:59.985126 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr" Dec 05 20:49:00 crc kubenswrapper[4904]: I1205 20:49:00.551812 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr"] Dec 05 20:49:01 crc kubenswrapper[4904]: I1205 20:49:01.519764 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr" event={"ID":"aee65db4-60e1-4a83-80d0-81e90f6f4f07","Type":"ContainerStarted","Data":"ea1a69c52983c7897db3f7b9d1bd47db030ff0b0bc0f219c69623cd801fe0795"} Dec 05 20:49:01 crc kubenswrapper[4904]: I1205 20:49:01.520019 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr" event={"ID":"aee65db4-60e1-4a83-80d0-81e90f6f4f07","Type":"ContainerStarted","Data":"1b0c0d49c03dee422e37905e30b447d0f4f2d5d350ed7c8491a16e5b96b45ff7"} Dec 05 20:49:01 crc kubenswrapper[4904]: I1205 20:49:01.552962 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr" podStartSLOduration=2.121843128 podStartE2EDuration="2.552903558s" podCreationTimestamp="2025-12-05 20:48:59 +0000 UTC" firstStartedPulling="2025-12-05 20:49:00.563516901 +0000 UTC m=+2239.374733010" lastFinishedPulling="2025-12-05 20:49:00.994577291 +0000 UTC m=+2239.805793440" observedRunningTime="2025-12-05 20:49:01.543610714 +0000 UTC m=+2240.354826833" watchObservedRunningTime="2025-12-05 20:49:01.552903558 +0000 UTC m=+2240.364119717" Dec 05 20:49:05 crc kubenswrapper[4904]: I1205 20:49:05.025608 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lg4xz"] Dec 05 20:49:05 crc kubenswrapper[4904]: I1205 20:49:05.028613 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lg4xz" Dec 05 20:49:05 crc kubenswrapper[4904]: I1205 20:49:05.039484 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lg4xz"] Dec 05 20:49:05 crc kubenswrapper[4904]: I1205 20:49:05.178563 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6nks\" (UniqueName: \"kubernetes.io/projected/4c4ba21b-37da-4e26-a5a5-13b704664331-kube-api-access-b6nks\") pod \"community-operators-lg4xz\" (UID: \"4c4ba21b-37da-4e26-a5a5-13b704664331\") " pod="openshift-marketplace/community-operators-lg4xz" Dec 05 20:49:05 crc kubenswrapper[4904]: I1205 20:49:05.178929 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c4ba21b-37da-4e26-a5a5-13b704664331-catalog-content\") pod \"community-operators-lg4xz\" (UID: \"4c4ba21b-37da-4e26-a5a5-13b704664331\") " pod="openshift-marketplace/community-operators-lg4xz" Dec 05 20:49:05 crc kubenswrapper[4904]: I1205 20:49:05.178953 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c4ba21b-37da-4e26-a5a5-13b704664331-utilities\") pod \"community-operators-lg4xz\" (UID: \"4c4ba21b-37da-4e26-a5a5-13b704664331\") " pod="openshift-marketplace/community-operators-lg4xz" Dec 05 20:49:05 crc kubenswrapper[4904]: I1205 20:49:05.280982 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c4ba21b-37da-4e26-a5a5-13b704664331-catalog-content\") pod \"community-operators-lg4xz\" (UID: \"4c4ba21b-37da-4e26-a5a5-13b704664331\") " pod="openshift-marketplace/community-operators-lg4xz" Dec 05 20:49:05 crc kubenswrapper[4904]: I1205 20:49:05.281019 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c4ba21b-37da-4e26-a5a5-13b704664331-utilities\") pod \"community-operators-lg4xz\" (UID: \"4c4ba21b-37da-4e26-a5a5-13b704664331\") " pod="openshift-marketplace/community-operators-lg4xz" Dec 05 20:49:05 crc kubenswrapper[4904]: I1205 20:49:05.281130 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6nks\" (UniqueName: \"kubernetes.io/projected/4c4ba21b-37da-4e26-a5a5-13b704664331-kube-api-access-b6nks\") pod \"community-operators-lg4xz\" (UID: \"4c4ba21b-37da-4e26-a5a5-13b704664331\") " pod="openshift-marketplace/community-operators-lg4xz" Dec 05 20:49:05 crc kubenswrapper[4904]: I1205 20:49:05.281514 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c4ba21b-37da-4e26-a5a5-13b704664331-catalog-content\") pod \"community-operators-lg4xz\" (UID: \"4c4ba21b-37da-4e26-a5a5-13b704664331\") " pod="openshift-marketplace/community-operators-lg4xz" Dec 05 20:49:05 crc kubenswrapper[4904]: I1205 20:49:05.281584 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c4ba21b-37da-4e26-a5a5-13b704664331-utilities\") pod \"community-operators-lg4xz\" (UID: \"4c4ba21b-37da-4e26-a5a5-13b704664331\") " pod="openshift-marketplace/community-operators-lg4xz" Dec 05 20:49:05 crc kubenswrapper[4904]: I1205 20:49:05.300573 4904 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-b6nks\" (UniqueName: \"kubernetes.io/projected/4c4ba21b-37da-4e26-a5a5-13b704664331-kube-api-access-b6nks\") pod \"community-operators-lg4xz\" (UID: \"4c4ba21b-37da-4e26-a5a5-13b704664331\") " pod="openshift-marketplace/community-operators-lg4xz" Dec 05 20:49:05 crc kubenswrapper[4904]: I1205 20:49:05.346930 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lg4xz" Dec 05 20:49:06 crc kubenswrapper[4904]: W1205 20:49:06.007557 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c4ba21b_37da_4e26_a5a5_13b704664331.slice/crio-dd1e618cc0dbe874b0cbd41ed916fe73cbb3a4f6bdff30b0a8eef971c7a6f48a WatchSource:0}: Error finding container dd1e618cc0dbe874b0cbd41ed916fe73cbb3a4f6bdff30b0a8eef971c7a6f48a: Status 404 returned error can't find the container with id dd1e618cc0dbe874b0cbd41ed916fe73cbb3a4f6bdff30b0a8eef971c7a6f48a Dec 05 20:49:06 crc kubenswrapper[4904]: I1205 20:49:06.010325 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lg4xz"] Dec 05 20:49:06 crc kubenswrapper[4904]: I1205 20:49:06.588869 4904 generic.go:334] "Generic (PLEG): container finished" podID="4c4ba21b-37da-4e26-a5a5-13b704664331" containerID="46627ceb535cd7523b84ad827915b0feff2570d96a7fe4c633b4669a061c42d2" exitCode=0 Dec 05 20:49:06 crc kubenswrapper[4904]: I1205 20:49:06.588965 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lg4xz" event={"ID":"4c4ba21b-37da-4e26-a5a5-13b704664331","Type":"ContainerDied","Data":"46627ceb535cd7523b84ad827915b0feff2570d96a7fe4c633b4669a061c42d2"} Dec 05 20:49:06 crc kubenswrapper[4904]: I1205 20:49:06.589149 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lg4xz" event={"ID":"4c4ba21b-37da-4e26-a5a5-13b704664331","Type":"ContainerStarted","Data":"dd1e618cc0dbe874b0cbd41ed916fe73cbb3a4f6bdff30b0a8eef971c7a6f48a"} Dec 05 20:49:07 crc kubenswrapper[4904]: I1205 20:49:07.599918 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lg4xz" event={"ID":"4c4ba21b-37da-4e26-a5a5-13b704664331","Type":"ContainerStarted","Data":"8ea1e2452f9fef0584898ad4783e67f53fbfdf6e779e1dc3c629880d1655a136"} Dec 05 20:49:08 crc kubenswrapper[4904]: I1205 20:49:08.610478 4904 generic.go:334] "Generic (PLEG): container finished" podID="4c4ba21b-37da-4e26-a5a5-13b704664331" containerID="8ea1e2452f9fef0584898ad4783e67f53fbfdf6e779e1dc3c629880d1655a136" exitCode=0 Dec 05 20:49:08 crc kubenswrapper[4904]: I1205 20:49:08.610522 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lg4xz" event={"ID":"4c4ba21b-37da-4e26-a5a5-13b704664331","Type":"ContainerDied","Data":"8ea1e2452f9fef0584898ad4783e67f53fbfdf6e779e1dc3c629880d1655a136"} Dec 05 20:49:09 crc kubenswrapper[4904]: I1205 20:49:09.622496 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lg4xz" event={"ID":"4c4ba21b-37da-4e26-a5a5-13b704664331","Type":"ContainerStarted","Data":"f3c6176c475b35c1a89f4731e726c54ee27ea483ea37a147d963d3e72f45446e"} Dec 05 20:49:09 crc kubenswrapper[4904]: I1205 20:49:09.640483 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lg4xz" 
podStartSLOduration=2.013896233 podStartE2EDuration="4.64044631s" podCreationTimestamp="2025-12-05 20:49:05 +0000 UTC" firstStartedPulling="2025-12-05 20:49:06.591730006 +0000 UTC m=+2245.402946115" lastFinishedPulling="2025-12-05 20:49:09.218280083 +0000 UTC m=+2248.029496192" observedRunningTime="2025-12-05 20:49:09.63895639 +0000 UTC m=+2248.450172509" watchObservedRunningTime="2025-12-05 20:49:09.64044631 +0000 UTC m=+2248.451662419" Dec 05 20:49:15 crc kubenswrapper[4904]: I1205 20:49:15.347719 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lg4xz" Dec 05 20:49:15 crc kubenswrapper[4904]: I1205 20:49:15.348103 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lg4xz" Dec 05 20:49:15 crc kubenswrapper[4904]: I1205 20:49:15.419883 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lg4xz" Dec 05 20:49:15 crc kubenswrapper[4904]: I1205 20:49:15.748132 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lg4xz" Dec 05 20:49:15 crc kubenswrapper[4904]: I1205 20:49:15.800090 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lg4xz"] Dec 05 20:49:17 crc kubenswrapper[4904]: I1205 20:49:17.696654 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lg4xz" podUID="4c4ba21b-37da-4e26-a5a5-13b704664331" containerName="registry-server" containerID="cri-o://f3c6176c475b35c1a89f4731e726c54ee27ea483ea37a147d963d3e72f45446e" gracePeriod=2 Dec 05 20:49:18 crc kubenswrapper[4904]: I1205 20:49:18.708291 4904 generic.go:334] "Generic (PLEG): container finished" podID="4c4ba21b-37da-4e26-a5a5-13b704664331" containerID="f3c6176c475b35c1a89f4731e726c54ee27ea483ea37a147d963d3e72f45446e" exitCode=0 Dec 05 20:49:18 crc kubenswrapper[4904]: I1205 20:49:18.708625 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lg4xz" event={"ID":"4c4ba21b-37da-4e26-a5a5-13b704664331","Type":"ContainerDied","Data":"f3c6176c475b35c1a89f4731e726c54ee27ea483ea37a147d963d3e72f45446e"} Dec 05 20:49:18 crc kubenswrapper[4904]: I1205 20:49:18.708658 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lg4xz" event={"ID":"4c4ba21b-37da-4e26-a5a5-13b704664331","Type":"ContainerDied","Data":"dd1e618cc0dbe874b0cbd41ed916fe73cbb3a4f6bdff30b0a8eef971c7a6f48a"} Dec 05 20:49:18 crc kubenswrapper[4904]: I1205 20:49:18.708674 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd1e618cc0dbe874b0cbd41ed916fe73cbb3a4f6bdff30b0a8eef971c7a6f48a" Dec 05 20:49:18 crc kubenswrapper[4904]: I1205 20:49:18.727329 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lg4xz" Dec 05 20:49:18 crc kubenswrapper[4904]: I1205 20:49:18.772696 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6nks\" (UniqueName: \"kubernetes.io/projected/4c4ba21b-37da-4e26-a5a5-13b704664331-kube-api-access-b6nks\") pod \"4c4ba21b-37da-4e26-a5a5-13b704664331\" (UID: \"4c4ba21b-37da-4e26-a5a5-13b704664331\") " Dec 05 20:49:18 crc kubenswrapper[4904]: I1205 20:49:18.772743 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c4ba21b-37da-4e26-a5a5-13b704664331-utilities\") pod \"4c4ba21b-37da-4e26-a5a5-13b704664331\" (UID: \"4c4ba21b-37da-4e26-a5a5-13b704664331\") " Dec 05 20:49:18 crc kubenswrapper[4904]: I1205 20:49:18.772848 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c4ba21b-37da-4e26-a5a5-13b704664331-catalog-content\") pod \"4c4ba21b-37da-4e26-a5a5-13b704664331\" (UID: \"4c4ba21b-37da-4e26-a5a5-13b704664331\") " Dec 05 20:49:18 crc kubenswrapper[4904]: I1205 20:49:18.781390 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c4ba21b-37da-4e26-a5a5-13b704664331-kube-api-access-b6nks" (OuterVolumeSpecName: "kube-api-access-b6nks") pod "4c4ba21b-37da-4e26-a5a5-13b704664331" (UID: "4c4ba21b-37da-4e26-a5a5-13b704664331"). InnerVolumeSpecName "kube-api-access-b6nks". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:49:18 crc kubenswrapper[4904]: I1205 20:49:18.789926 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c4ba21b-37da-4e26-a5a5-13b704664331-utilities" (OuterVolumeSpecName: "utilities") pod "4c4ba21b-37da-4e26-a5a5-13b704664331" (UID: "4c4ba21b-37da-4e26-a5a5-13b704664331"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:49:18 crc kubenswrapper[4904]: I1205 20:49:18.830838 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c4ba21b-37da-4e26-a5a5-13b704664331-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4c4ba21b-37da-4e26-a5a5-13b704664331" (UID: "4c4ba21b-37da-4e26-a5a5-13b704664331"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:49:18 crc kubenswrapper[4904]: I1205 20:49:18.875791 4904 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c4ba21b-37da-4e26-a5a5-13b704664331-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:49:18 crc kubenswrapper[4904]: I1205 20:49:18.875822 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6nks\" (UniqueName: \"kubernetes.io/projected/4c4ba21b-37da-4e26-a5a5-13b704664331-kube-api-access-b6nks\") on node \"crc\" DevicePath \"\"" Dec 05 20:49:18 crc kubenswrapper[4904]: I1205 20:49:18.875835 4904 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c4ba21b-37da-4e26-a5a5-13b704664331-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:49:19 crc kubenswrapper[4904]: I1205 20:49:19.715778 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lg4xz" Dec 05 20:49:19 crc kubenswrapper[4904]: I1205 20:49:19.743141 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lg4xz"] Dec 05 20:49:19 crc kubenswrapper[4904]: I1205 20:49:19.753873 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lg4xz"] Dec 05 20:49:21 crc kubenswrapper[4904]: I1205 20:49:21.695367 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c4ba21b-37da-4e26-a5a5-13b704664331" path="/var/lib/kubelet/pods/4c4ba21b-37da-4e26-a5a5-13b704664331/volumes" Dec 05 20:49:55 crc kubenswrapper[4904]: I1205 20:49:55.065649 4904 generic.go:334] "Generic (PLEG): container finished" podID="aee65db4-60e1-4a83-80d0-81e90f6f4f07" containerID="ea1a69c52983c7897db3f7b9d1bd47db030ff0b0bc0f219c69623cd801fe0795" exitCode=0 Dec 05 20:49:55 crc kubenswrapper[4904]: I1205 20:49:55.065702 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr" event={"ID":"aee65db4-60e1-4a83-80d0-81e90f6f4f07","Type":"ContainerDied","Data":"ea1a69c52983c7897db3f7b9d1bd47db030ff0b0bc0f219c69623cd801fe0795"} Dec 05 20:49:56 crc kubenswrapper[4904]: I1205 20:49:56.518973 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr" Dec 05 20:49:56 crc kubenswrapper[4904]: I1205 20:49:56.600170 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aee65db4-60e1-4a83-80d0-81e90f6f4f07-inventory\") pod \"aee65db4-60e1-4a83-80d0-81e90f6f4f07\" (UID: \"aee65db4-60e1-4a83-80d0-81e90f6f4f07\") " Dec 05 20:49:56 crc kubenswrapper[4904]: I1205 20:49:56.600385 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/aee65db4-60e1-4a83-80d0-81e90f6f4f07-neutron-ovn-metadata-agent-neutron-config-0\") pod \"aee65db4-60e1-4a83-80d0-81e90f6f4f07\" (UID: \"aee65db4-60e1-4a83-80d0-81e90f6f4f07\") " Dec 05 20:49:56 crc kubenswrapper[4904]: I1205 20:49:56.600469 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aee65db4-60e1-4a83-80d0-81e90f6f4f07-ssh-key\") pod \"aee65db4-60e1-4a83-80d0-81e90f6f4f07\" (UID: \"aee65db4-60e1-4a83-80d0-81e90f6f4f07\") " Dec 05 20:49:56 crc kubenswrapper[4904]: I1205 20:49:56.600502 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/aee65db4-60e1-4a83-80d0-81e90f6f4f07-nova-metadata-neutron-config-0\") pod \"aee65db4-60e1-4a83-80d0-81e90f6f4f07\" (UID: \"aee65db4-60e1-4a83-80d0-81e90f6f4f07\") " Dec 05 20:49:56 crc kubenswrapper[4904]: I1205 20:49:56.600534 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aee65db4-60e1-4a83-80d0-81e90f6f4f07-neutron-metadata-combined-ca-bundle\") pod \"aee65db4-60e1-4a83-80d0-81e90f6f4f07\" (UID: \"aee65db4-60e1-4a83-80d0-81e90f6f4f07\") " Dec 05 20:49:56 crc kubenswrapper[4904]: I1205 20:49:56.601144 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tr8z\" (UniqueName: 
\"kubernetes.io/projected/aee65db4-60e1-4a83-80d0-81e90f6f4f07-kube-api-access-6tr8z\") pod \"aee65db4-60e1-4a83-80d0-81e90f6f4f07\" (UID: \"aee65db4-60e1-4a83-80d0-81e90f6f4f07\") " Dec 05 20:49:56 crc kubenswrapper[4904]: I1205 20:49:56.636293 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aee65db4-60e1-4a83-80d0-81e90f6f4f07-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "aee65db4-60e1-4a83-80d0-81e90f6f4f07" (UID: "aee65db4-60e1-4a83-80d0-81e90f6f4f07"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:49:56 crc kubenswrapper[4904]: I1205 20:49:56.636956 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aee65db4-60e1-4a83-80d0-81e90f6f4f07-kube-api-access-6tr8z" (OuterVolumeSpecName: "kube-api-access-6tr8z") pod "aee65db4-60e1-4a83-80d0-81e90f6f4f07" (UID: "aee65db4-60e1-4a83-80d0-81e90f6f4f07"). InnerVolumeSpecName "kube-api-access-6tr8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:49:56 crc kubenswrapper[4904]: I1205 20:49:56.707611 4904 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aee65db4-60e1-4a83-80d0-81e90f6f4f07-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:49:56 crc kubenswrapper[4904]: I1205 20:49:56.707647 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tr8z\" (UniqueName: \"kubernetes.io/projected/aee65db4-60e1-4a83-80d0-81e90f6f4f07-kube-api-access-6tr8z\") on node \"crc\" DevicePath \"\"" Dec 05 20:49:56 crc kubenswrapper[4904]: I1205 20:49:56.709174 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aee65db4-60e1-4a83-80d0-81e90f6f4f07-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "aee65db4-60e1-4a83-80d0-81e90f6f4f07" (UID: "aee65db4-60e1-4a83-80d0-81e90f6f4f07"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:49:56 crc kubenswrapper[4904]: I1205 20:49:56.726792 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aee65db4-60e1-4a83-80d0-81e90f6f4f07-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "aee65db4-60e1-4a83-80d0-81e90f6f4f07" (UID: "aee65db4-60e1-4a83-80d0-81e90f6f4f07"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:49:56 crc kubenswrapper[4904]: I1205 20:49:56.727391 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aee65db4-60e1-4a83-80d0-81e90f6f4f07-inventory" (OuterVolumeSpecName: "inventory") pod "aee65db4-60e1-4a83-80d0-81e90f6f4f07" (UID: "aee65db4-60e1-4a83-80d0-81e90f6f4f07"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:49:56 crc kubenswrapper[4904]: I1205 20:49:56.727808 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aee65db4-60e1-4a83-80d0-81e90f6f4f07-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "aee65db4-60e1-4a83-80d0-81e90f6f4f07" (UID: "aee65db4-60e1-4a83-80d0-81e90f6f4f07"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:49:56 crc kubenswrapper[4904]: I1205 20:49:56.810624 4904 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/aee65db4-60e1-4a83-80d0-81e90f6f4f07-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 20:49:56 crc kubenswrapper[4904]: I1205 20:49:56.810655 4904 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aee65db4-60e1-4a83-80d0-81e90f6f4f07-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 20:49:56 crc kubenswrapper[4904]: I1205 20:49:56.810664 4904 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/aee65db4-60e1-4a83-80d0-81e90f6f4f07-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 20:49:56 crc kubenswrapper[4904]: I1205 20:49:56.810676 4904 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aee65db4-60e1-4a83-80d0-81e90f6f4f07-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 20:49:57 crc kubenswrapper[4904]: I1205 20:49:57.086786 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr" event={"ID":"aee65db4-60e1-4a83-80d0-81e90f6f4f07","Type":"ContainerDied","Data":"1b0c0d49c03dee422e37905e30b447d0f4f2d5d350ed7c8491a16e5b96b45ff7"} Dec 05 20:49:57 crc kubenswrapper[4904]: I1205 20:49:57.086830 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b0c0d49c03dee422e37905e30b447d0f4f2d5d350ed7c8491a16e5b96b45ff7" Dec 05 20:49:57 crc kubenswrapper[4904]: I1205 20:49:57.086863 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr" Dec 05 20:49:57 crc kubenswrapper[4904]: I1205 20:49:57.242542 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5548g"] Dec 05 20:49:57 crc kubenswrapper[4904]: E1205 20:49:57.242923 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c4ba21b-37da-4e26-a5a5-13b704664331" containerName="registry-server" Dec 05 20:49:57 crc kubenswrapper[4904]: I1205 20:49:57.242939 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c4ba21b-37da-4e26-a5a5-13b704664331" containerName="registry-server" Dec 05 20:49:57 crc kubenswrapper[4904]: E1205 20:49:57.242954 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c4ba21b-37da-4e26-a5a5-13b704664331" containerName="extract-utilities" Dec 05 20:49:57 crc kubenswrapper[4904]: I1205 20:49:57.242962 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c4ba21b-37da-4e26-a5a5-13b704664331" containerName="extract-utilities" Dec 05 20:49:57 crc kubenswrapper[4904]: E1205 20:49:57.242980 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aee65db4-60e1-4a83-80d0-81e90f6f4f07" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 05 20:49:57 crc kubenswrapper[4904]: I1205 20:49:57.242988 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="aee65db4-60e1-4a83-80d0-81e90f6f4f07" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 05 20:49:57 crc kubenswrapper[4904]: E1205 20:49:57.242996 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c4ba21b-37da-4e26-a5a5-13b704664331" containerName="extract-content" Dec 05 20:49:57 crc kubenswrapper[4904]: I1205 20:49:57.243002 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c4ba21b-37da-4e26-a5a5-13b704664331" containerName="extract-content" Dec 05 20:49:57 crc kubenswrapper[4904]: I1205 20:49:57.243214 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c4ba21b-37da-4e26-a5a5-13b704664331" containerName="registry-server" Dec 05 20:49:57 crc kubenswrapper[4904]: I1205 20:49:57.243235 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="aee65db4-60e1-4a83-80d0-81e90f6f4f07" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 05 20:49:57 crc kubenswrapper[4904]: I1205 20:49:57.243851 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5548g" Dec 05 20:49:57 crc kubenswrapper[4904]: I1205 20:49:57.281525 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pf9pw" Dec 05 20:49:57 crc kubenswrapper[4904]: I1205 20:49:57.281689 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 05 20:49:57 crc kubenswrapper[4904]: I1205 20:49:57.281969 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 20:49:57 crc kubenswrapper[4904]: I1205 20:49:57.282224 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 20:49:57 crc kubenswrapper[4904]: I1205 20:49:57.282720 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 20:49:57 crc kubenswrapper[4904]: I1205 20:49:57.292636 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5548g"] Dec 05 20:49:57 crc kubenswrapper[4904]: I1205 20:49:57.319760 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11479a0b-d4c6-4770-bdf1-dbb8a417384d-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5548g\" (UID: \"11479a0b-d4c6-4770-bdf1-dbb8a417384d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5548g" Dec 05 20:49:57 crc kubenswrapper[4904]: I1205 20:49:57.320083 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/11479a0b-d4c6-4770-bdf1-dbb8a417384d-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5548g\" (UID: \"11479a0b-d4c6-4770-bdf1-dbb8a417384d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5548g" Dec 05 20:49:57 crc kubenswrapper[4904]: I1205 20:49:57.320272 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vltxg\" (UniqueName: \"kubernetes.io/projected/11479a0b-d4c6-4770-bdf1-dbb8a417384d-kube-api-access-vltxg\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5548g\" (UID: \"11479a0b-d4c6-4770-bdf1-dbb8a417384d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5548g" Dec 05 20:49:57 crc kubenswrapper[4904]: I1205 20:49:57.320407 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11479a0b-d4c6-4770-bdf1-dbb8a417384d-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5548g\" (UID: \"11479a0b-d4c6-4770-bdf1-dbb8a417384d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5548g" Dec 05 20:49:57 crc kubenswrapper[4904]: I1205 20:49:57.320622 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/11479a0b-d4c6-4770-bdf1-dbb8a417384d-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5548g\" (UID: \"11479a0b-d4c6-4770-bdf1-dbb8a417384d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5548g" Dec 05 20:49:57 crc kubenswrapper[4904]: I1205 20:49:57.422578 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11479a0b-d4c6-4770-bdf1-dbb8a417384d-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5548g\" (UID: \"11479a0b-d4c6-4770-bdf1-dbb8a417384d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5548g" Dec 05 20:49:57 crc kubenswrapper[4904]: I1205 20:49:57.422624 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/11479a0b-d4c6-4770-bdf1-dbb8a417384d-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5548g\" (UID: \"11479a0b-d4c6-4770-bdf1-dbb8a417384d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5548g" Dec 05 20:49:57 crc kubenswrapper[4904]: I1205 20:49:57.422674 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vltxg\" (UniqueName: \"kubernetes.io/projected/11479a0b-d4c6-4770-bdf1-dbb8a417384d-kube-api-access-vltxg\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5548g\" (UID: \"11479a0b-d4c6-4770-bdf1-dbb8a417384d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5548g" Dec 05 20:49:57 crc kubenswrapper[4904]: I1205 20:49:57.422700 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11479a0b-d4c6-4770-bdf1-dbb8a417384d-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5548g\" (UID: \"11479a0b-d4c6-4770-bdf1-dbb8a417384d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5548g" Dec 05 20:49:57 crc kubenswrapper[4904]: I1205 20:49:57.422755 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/11479a0b-d4c6-4770-bdf1-dbb8a417384d-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5548g\" (UID: \"11479a0b-d4c6-4770-bdf1-dbb8a417384d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5548g" Dec 05 20:49:57 crc kubenswrapper[4904]: I1205 20:49:57.429540 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/11479a0b-d4c6-4770-bdf1-dbb8a417384d-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5548g\" (UID: \"11479a0b-d4c6-4770-bdf1-dbb8a417384d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5548g" Dec 05 20:49:57 crc kubenswrapper[4904]: I1205 20:49:57.429835 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11479a0b-d4c6-4770-bdf1-dbb8a417384d-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5548g\" (UID: \"11479a0b-d4c6-4770-bdf1-dbb8a417384d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5548g" Dec 05 20:49:57 crc kubenswrapper[4904]: I1205 20:49:57.430674 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11479a0b-d4c6-4770-bdf1-dbb8a417384d-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5548g\" (UID: \"11479a0b-d4c6-4770-bdf1-dbb8a417384d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5548g" Dec 05 20:49:57 crc kubenswrapper[4904]: I1205 20:49:57.430675 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/11479a0b-d4c6-4770-bdf1-dbb8a417384d-ssh-key\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-5548g\" (UID: \"11479a0b-d4c6-4770-bdf1-dbb8a417384d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5548g" Dec 05 20:49:57 crc kubenswrapper[4904]: I1205 20:49:57.448651 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vltxg\" (UniqueName: \"kubernetes.io/projected/11479a0b-d4c6-4770-bdf1-dbb8a417384d-kube-api-access-vltxg\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5548g\" (UID: \"11479a0b-d4c6-4770-bdf1-dbb8a417384d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5548g" Dec 05 20:49:57 crc kubenswrapper[4904]: I1205 20:49:57.606220 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5548g" Dec 05 20:49:58 crc kubenswrapper[4904]: I1205 20:49:58.167027 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5548g"] Dec 05 20:49:59 crc kubenswrapper[4904]: I1205 20:49:59.105431 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5548g" event={"ID":"11479a0b-d4c6-4770-bdf1-dbb8a417384d","Type":"ContainerStarted","Data":"4be5d33997d22dbadb8341100525fe1d9962d44704e93cce783007d63640d6c8"} Dec 05 20:49:59 crc kubenswrapper[4904]: I1205 20:49:59.105825 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5548g" event={"ID":"11479a0b-d4c6-4770-bdf1-dbb8a417384d","Type":"ContainerStarted","Data":"95d40f408d0a3ee2dcf5d86ddc03e7917d1d4cf7c795adafe890d2edf811ad7f"} Dec 05 20:49:59 crc kubenswrapper[4904]: I1205 20:49:59.123121 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5548g" podStartSLOduration=1.654478928 podStartE2EDuration="2.12309365s" podCreationTimestamp="2025-12-05 20:49:57 +0000 UTC" firstStartedPulling="2025-12-05 20:49:58.174773373 +0000 UTC m=+2296.985989482" lastFinishedPulling="2025-12-05 20:49:58.643388095 +0000 UTC m=+2297.454604204" observedRunningTime="2025-12-05 20:49:59.118952558 +0000 UTC m=+2297.930168677" watchObservedRunningTime="2025-12-05 20:49:59.12309365 +0000 UTC m=+2297.934309759" Dec 05 20:50:59 crc kubenswrapper[4904]: I1205 20:50:59.965615 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:50:59 crc kubenswrapper[4904]: I1205 20:50:59.966202 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:51:29 crc kubenswrapper[4904]: I1205 20:51:29.955872 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:51:29 crc kubenswrapper[4904]: I1205 20:51:29.956652 4904 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:51:59 crc kubenswrapper[4904]: I1205 20:51:59.957730 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:51:59 crc kubenswrapper[4904]: I1205 20:51:59.958335 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:51:59 crc kubenswrapper[4904]: I1205 20:51:59.958387 4904 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" Dec 05 20:51:59 crc kubenswrapper[4904]: I1205 20:51:59.959002 4904 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3ff9529892815f6974fa95e3944e54016af616270f5e88c8ba81e4e82b52d881"} pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 20:51:59 crc kubenswrapper[4904]: I1205 20:51:59.959095 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" containerID="cri-o://3ff9529892815f6974fa95e3944e54016af616270f5e88c8ba81e4e82b52d881" gracePeriod=600 Dec 05 20:52:00 crc kubenswrapper[4904]: E1205 20:52:00.103586 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:52:00 crc kubenswrapper[4904]: I1205 20:52:00.472403 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" event={"ID":"1cc24b64-e25f-4b55-9123-295388685e7a","Type":"ContainerDied","Data":"3ff9529892815f6974fa95e3944e54016af616270f5e88c8ba81e4e82b52d881"} Dec 05 20:52:00 crc kubenswrapper[4904]: I1205 20:52:00.472537 4904 scope.go:117] "RemoveContainer" containerID="bb0e1451e9627bb70e29cee9dabc1d09d0d2166e98c624a8840a2d2703a502b5" Dec 05 20:52:00 crc kubenswrapper[4904]: I1205 20:52:00.472425 4904 generic.go:334] "Generic (PLEG): container finished" podID="1cc24b64-e25f-4b55-9123-295388685e7a" containerID="3ff9529892815f6974fa95e3944e54016af616270f5e88c8ba81e4e82b52d881" exitCode=0 Dec 05 20:52:00 crc kubenswrapper[4904]: I1205 20:52:00.473257 4904 scope.go:117] "RemoveContainer" containerID="3ff9529892815f6974fa95e3944e54016af616270f5e88c8ba81e4e82b52d881" Dec 05 20:52:00 crc kubenswrapper[4904]: E1205 
20:52:00.473674 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:52:10 crc kubenswrapper[4904]: I1205 20:52:10.681597 4904 scope.go:117] "RemoveContainer" containerID="3ff9529892815f6974fa95e3944e54016af616270f5e88c8ba81e4e82b52d881" Dec 05 20:52:10 crc kubenswrapper[4904]: E1205 20:52:10.683279 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:52:23 crc kubenswrapper[4904]: I1205 20:52:23.681140 4904 scope.go:117] "RemoveContainer" containerID="3ff9529892815f6974fa95e3944e54016af616270f5e88c8ba81e4e82b52d881" Dec 05 20:52:23 crc kubenswrapper[4904]: E1205 20:52:23.682000 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:52:36 crc kubenswrapper[4904]: I1205 20:52:36.681960 4904 scope.go:117] "RemoveContainer" containerID="3ff9529892815f6974fa95e3944e54016af616270f5e88c8ba81e4e82b52d881" Dec 05 20:52:36 crc kubenswrapper[4904]: E1205 20:52:36.682866 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:52:49 crc kubenswrapper[4904]: I1205 20:52:49.681051 4904 scope.go:117] "RemoveContainer" containerID="3ff9529892815f6974fa95e3944e54016af616270f5e88c8ba81e4e82b52d881" Dec 05 20:52:49 crc kubenswrapper[4904]: E1205 20:52:49.681717 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:53:03 crc kubenswrapper[4904]: I1205 20:53:03.681327 4904 scope.go:117] "RemoveContainer" containerID="3ff9529892815f6974fa95e3944e54016af616270f5e88c8ba81e4e82b52d881" Dec 05 20:53:03 crc kubenswrapper[4904]: E1205 20:53:03.685192 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:53:14 crc kubenswrapper[4904]: I1205 20:53:14.681211 4904 scope.go:117] "RemoveContainer" containerID="3ff9529892815f6974fa95e3944e54016af616270f5e88c8ba81e4e82b52d881" Dec 05 20:53:14 crc kubenswrapper[4904]: E1205 20:53:14.682220 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:53:28 crc kubenswrapper[4904]: I1205 20:53:28.682016 4904 scope.go:117] "RemoveContainer" containerID="3ff9529892815f6974fa95e3944e54016af616270f5e88c8ba81e4e82b52d881" Dec 05 20:53:28 crc kubenswrapper[4904]: E1205 20:53:28.682957 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:53:41 crc kubenswrapper[4904]: I1205 20:53:41.688011 4904 scope.go:117] "RemoveContainer" containerID="3ff9529892815f6974fa95e3944e54016af616270f5e88c8ba81e4e82b52d881" Dec 05 20:53:41 crc kubenswrapper[4904]: E1205 20:53:41.688735 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:53:52 crc kubenswrapper[4904]: I1205 20:53:52.681911 4904 scope.go:117] "RemoveContainer" containerID="3ff9529892815f6974fa95e3944e54016af616270f5e88c8ba81e4e82b52d881" Dec 05 20:53:52 crc kubenswrapper[4904]: E1205 20:53:52.683157 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:54:00 crc kubenswrapper[4904]: I1205 20:54:00.844691 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-p8xrz"] Dec 05 20:54:00 crc kubenswrapper[4904]: I1205 20:54:00.848538 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p8xrz" Dec 05 20:54:00 crc kubenswrapper[4904]: I1205 20:54:00.897509 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p8xrz"] Dec 05 20:54:00 crc kubenswrapper[4904]: I1205 20:54:00.991691 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/712f269c-ea33-41c9-84f8-1d998d4ee803-utilities\") pod \"redhat-marketplace-p8xrz\" (UID: \"712f269c-ea33-41c9-84f8-1d998d4ee803\") " pod="openshift-marketplace/redhat-marketplace-p8xrz" Dec 05 20:54:00 crc kubenswrapper[4904]: I1205 20:54:00.992201 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/712f269c-ea33-41c9-84f8-1d998d4ee803-catalog-content\") pod \"redhat-marketplace-p8xrz\" (UID: \"712f269c-ea33-41c9-84f8-1d998d4ee803\") " pod="openshift-marketplace/redhat-marketplace-p8xrz" Dec 05 20:54:00 crc kubenswrapper[4904]: I1205 20:54:00.992726 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xklnc\" (UniqueName: \"kubernetes.io/projected/712f269c-ea33-41c9-84f8-1d998d4ee803-kube-api-access-xklnc\") pod \"redhat-marketplace-p8xrz\" (UID: \"712f269c-ea33-41c9-84f8-1d998d4ee803\") " pod="openshift-marketplace/redhat-marketplace-p8xrz" Dec 05 20:54:01 crc kubenswrapper[4904]: I1205 20:54:01.094985 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/712f269c-ea33-41c9-84f8-1d998d4ee803-catalog-content\") pod \"redhat-marketplace-p8xrz\" (UID: \"712f269c-ea33-41c9-84f8-1d998d4ee803\") " pod="openshift-marketplace/redhat-marketplace-p8xrz" Dec 05 20:54:01 crc kubenswrapper[4904]: I1205 20:54:01.095111 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xklnc\" (UniqueName: \"kubernetes.io/projected/712f269c-ea33-41c9-84f8-1d998d4ee803-kube-api-access-xklnc\") pod \"redhat-marketplace-p8xrz\" (UID: \"712f269c-ea33-41c9-84f8-1d998d4ee803\") " pod="openshift-marketplace/redhat-marketplace-p8xrz" Dec 05 20:54:01 crc kubenswrapper[4904]: I1205 20:54:01.095166 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/712f269c-ea33-41c9-84f8-1d998d4ee803-utilities\") pod \"redhat-marketplace-p8xrz\" (UID: \"712f269c-ea33-41c9-84f8-1d998d4ee803\") " pod="openshift-marketplace/redhat-marketplace-p8xrz" Dec 05 20:54:01 crc kubenswrapper[4904]: I1205 20:54:01.095636 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/712f269c-ea33-41c9-84f8-1d998d4ee803-catalog-content\") pod \"redhat-marketplace-p8xrz\" (UID: \"712f269c-ea33-41c9-84f8-1d998d4ee803\") " pod="openshift-marketplace/redhat-marketplace-p8xrz" Dec 05 20:54:01 crc kubenswrapper[4904]: I1205 20:54:01.095707 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/712f269c-ea33-41c9-84f8-1d998d4ee803-utilities\") pod \"redhat-marketplace-p8xrz\" (UID: \"712f269c-ea33-41c9-84f8-1d998d4ee803\") " pod="openshift-marketplace/redhat-marketplace-p8xrz" Dec 05 20:54:01 crc kubenswrapper[4904]: I1205 20:54:01.128783 4904 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-xklnc\" (UniqueName: \"kubernetes.io/projected/712f269c-ea33-41c9-84f8-1d998d4ee803-kube-api-access-xklnc\") pod \"redhat-marketplace-p8xrz\" (UID: \"712f269c-ea33-41c9-84f8-1d998d4ee803\") " pod="openshift-marketplace/redhat-marketplace-p8xrz" Dec 05 20:54:01 crc kubenswrapper[4904]: I1205 20:54:01.168878 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p8xrz" Dec 05 20:54:01 crc kubenswrapper[4904]: I1205 20:54:01.661217 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p8xrz"] Dec 05 20:54:01 crc kubenswrapper[4904]: I1205 20:54:01.762161 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p8xrz" event={"ID":"712f269c-ea33-41c9-84f8-1d998d4ee803","Type":"ContainerStarted","Data":"80831d979d10266fc05b327e18998ff6bff9c4f9f904736925709259bc2f4992"} Dec 05 20:54:02 crc kubenswrapper[4904]: I1205 20:54:02.776717 4904 generic.go:334] "Generic (PLEG): container finished" podID="712f269c-ea33-41c9-84f8-1d998d4ee803" containerID="0ab9ca8e14670ddcdb549d83910ec94a80e98725d4e6b460510afd20ff41a863" exitCode=0 Dec 05 20:54:02 crc kubenswrapper[4904]: I1205 20:54:02.776838 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p8xrz" event={"ID":"712f269c-ea33-41c9-84f8-1d998d4ee803","Type":"ContainerDied","Data":"0ab9ca8e14670ddcdb549d83910ec94a80e98725d4e6b460510afd20ff41a863"} Dec 05 20:54:02 crc kubenswrapper[4904]: I1205 20:54:02.779876 4904 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 20:54:03 crc kubenswrapper[4904]: I1205 20:54:03.682045 4904 scope.go:117] "RemoveContainer" containerID="3ff9529892815f6974fa95e3944e54016af616270f5e88c8ba81e4e82b52d881" Dec 05 20:54:03 crc kubenswrapper[4904]: E1205 20:54:03.682583 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:54:03 crc kubenswrapper[4904]: I1205 20:54:03.790786 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p8xrz" event={"ID":"712f269c-ea33-41c9-84f8-1d998d4ee803","Type":"ContainerStarted","Data":"69bb0eda8492968f994fd457a924e85cb655c5f07f78fafb64fa72dfd3a913a3"} Dec 05 20:54:04 crc kubenswrapper[4904]: I1205 20:54:04.801248 4904 generic.go:334] "Generic (PLEG): container finished" podID="712f269c-ea33-41c9-84f8-1d998d4ee803" containerID="69bb0eda8492968f994fd457a924e85cb655c5f07f78fafb64fa72dfd3a913a3" exitCode=0 Dec 05 20:54:04 crc kubenswrapper[4904]: I1205 20:54:04.801569 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p8xrz" event={"ID":"712f269c-ea33-41c9-84f8-1d998d4ee803","Type":"ContainerDied","Data":"69bb0eda8492968f994fd457a924e85cb655c5f07f78fafb64fa72dfd3a913a3"} Dec 05 20:54:05 crc kubenswrapper[4904]: I1205 20:54:05.818505 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p8xrz" 
event={"ID":"712f269c-ea33-41c9-84f8-1d998d4ee803","Type":"ContainerStarted","Data":"dc183fea06e0bc15bc1fc530f0fd57396ddc805e52a9cf33e90b6d9fbbf1afa5"} Dec 05 20:54:05 crc kubenswrapper[4904]: I1205 20:54:05.838282 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-p8xrz" podStartSLOduration=3.333928088 podStartE2EDuration="5.838253685s" podCreationTimestamp="2025-12-05 20:54:00 +0000 UTC" firstStartedPulling="2025-12-05 20:54:02.779472879 +0000 UTC m=+2541.590689008" lastFinishedPulling="2025-12-05 20:54:05.283798496 +0000 UTC m=+2544.095014605" observedRunningTime="2025-12-05 20:54:05.834104292 +0000 UTC m=+2544.645320421" watchObservedRunningTime="2025-12-05 20:54:05.838253685 +0000 UTC m=+2544.649469794" Dec 05 20:54:11 crc kubenswrapper[4904]: I1205 20:54:11.170347 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-p8xrz" Dec 05 20:54:11 crc kubenswrapper[4904]: I1205 20:54:11.171282 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-p8xrz" Dec 05 20:54:11 crc kubenswrapper[4904]: I1205 20:54:11.220029 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-p8xrz" Dec 05 20:54:11 crc kubenswrapper[4904]: I1205 20:54:11.931786 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-p8xrz" Dec 05 20:54:11 crc kubenswrapper[4904]: I1205 20:54:11.995285 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p8xrz"] Dec 05 20:54:13 crc kubenswrapper[4904]: I1205 20:54:13.872799 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s4hsh"] Dec 05 20:54:13 crc kubenswrapper[4904]: I1205 20:54:13.875840 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s4hsh" Dec 05 20:54:13 crc kubenswrapper[4904]: I1205 20:54:13.893806 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s4hsh"] Dec 05 20:54:13 crc kubenswrapper[4904]: I1205 20:54:13.929300 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-p8xrz" podUID="712f269c-ea33-41c9-84f8-1d998d4ee803" containerName="registry-server" containerID="cri-o://dc183fea06e0bc15bc1fc530f0fd57396ddc805e52a9cf33e90b6d9fbbf1afa5" gracePeriod=2 Dec 05 20:54:14 crc kubenswrapper[4904]: I1205 20:54:14.029643 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17758ae4-e064-49a9-99a2-e374e8c3ecae-catalog-content\") pod \"certified-operators-s4hsh\" (UID: \"17758ae4-e064-49a9-99a2-e374e8c3ecae\") " pod="openshift-marketplace/certified-operators-s4hsh" Dec 05 20:54:14 crc kubenswrapper[4904]: I1205 20:54:14.029715 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwl9c\" (UniqueName: \"kubernetes.io/projected/17758ae4-e064-49a9-99a2-e374e8c3ecae-kube-api-access-mwl9c\") pod \"certified-operators-s4hsh\" (UID: \"17758ae4-e064-49a9-99a2-e374e8c3ecae\") " pod="openshift-marketplace/certified-operators-s4hsh" Dec 05 20:54:14 crc kubenswrapper[4904]: I1205 20:54:14.030455 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17758ae4-e064-49a9-99a2-e374e8c3ecae-utilities\") pod \"certified-operators-s4hsh\" (UID: \"17758ae4-e064-49a9-99a2-e374e8c3ecae\") " pod="openshift-marketplace/certified-operators-s4hsh" Dec 05 20:54:14 crc kubenswrapper[4904]: I1205 20:54:14.132500 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17758ae4-e064-49a9-99a2-e374e8c3ecae-utilities\") pod \"certified-operators-s4hsh\" (UID: \"17758ae4-e064-49a9-99a2-e374e8c3ecae\") " pod="openshift-marketplace/certified-operators-s4hsh" Dec 05 20:54:14 crc kubenswrapper[4904]: I1205 20:54:14.133020 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17758ae4-e064-49a9-99a2-e374e8c3ecae-catalog-content\") pod \"certified-operators-s4hsh\" (UID: \"17758ae4-e064-49a9-99a2-e374e8c3ecae\") " pod="openshift-marketplace/certified-operators-s4hsh" Dec 05 20:54:14 crc kubenswrapper[4904]: I1205 20:54:14.133084 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwl9c\" (UniqueName: \"kubernetes.io/projected/17758ae4-e064-49a9-99a2-e374e8c3ecae-kube-api-access-mwl9c\") pod \"certified-operators-s4hsh\" (UID: \"17758ae4-e064-49a9-99a2-e374e8c3ecae\") " pod="openshift-marketplace/certified-operators-s4hsh" Dec 05 20:54:14 crc kubenswrapper[4904]: I1205 20:54:14.133108 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17758ae4-e064-49a9-99a2-e374e8c3ecae-utilities\") pod \"certified-operators-s4hsh\" (UID: \"17758ae4-e064-49a9-99a2-e374e8c3ecae\") " pod="openshift-marketplace/certified-operators-s4hsh" Dec 05 20:54:14 crc kubenswrapper[4904]: I1205 20:54:14.133423 4904 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17758ae4-e064-49a9-99a2-e374e8c3ecae-catalog-content\") pod \"certified-operators-s4hsh\" (UID: \"17758ae4-e064-49a9-99a2-e374e8c3ecae\") " pod="openshift-marketplace/certified-operators-s4hsh" Dec 05 20:54:14 crc kubenswrapper[4904]: I1205 20:54:14.157018 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwl9c\" (UniqueName: \"kubernetes.io/projected/17758ae4-e064-49a9-99a2-e374e8c3ecae-kube-api-access-mwl9c\") pod \"certified-operators-s4hsh\" (UID: \"17758ae4-e064-49a9-99a2-e374e8c3ecae\") " pod="openshift-marketplace/certified-operators-s4hsh" Dec 05 20:54:14 crc kubenswrapper[4904]: I1205 20:54:14.245768 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s4hsh" Dec 05 20:54:14 crc kubenswrapper[4904]: I1205 20:54:14.442847 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p8xrz" Dec 05 20:54:14 crc kubenswrapper[4904]: I1205 20:54:14.572366 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xklnc\" (UniqueName: \"kubernetes.io/projected/712f269c-ea33-41c9-84f8-1d998d4ee803-kube-api-access-xklnc\") pod \"712f269c-ea33-41c9-84f8-1d998d4ee803\" (UID: \"712f269c-ea33-41c9-84f8-1d998d4ee803\") " Dec 05 20:54:14 crc kubenswrapper[4904]: I1205 20:54:14.573183 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/712f269c-ea33-41c9-84f8-1d998d4ee803-utilities\") pod \"712f269c-ea33-41c9-84f8-1d998d4ee803\" (UID: \"712f269c-ea33-41c9-84f8-1d998d4ee803\") " Dec 05 20:54:14 crc kubenswrapper[4904]: I1205 20:54:14.573485 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/712f269c-ea33-41c9-84f8-1d998d4ee803-catalog-content\") pod \"712f269c-ea33-41c9-84f8-1d998d4ee803\" (UID: \"712f269c-ea33-41c9-84f8-1d998d4ee803\") " Dec 05 20:54:14 crc kubenswrapper[4904]: I1205 20:54:14.573920 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/712f269c-ea33-41c9-84f8-1d998d4ee803-utilities" (OuterVolumeSpecName: "utilities") pod "712f269c-ea33-41c9-84f8-1d998d4ee803" (UID: "712f269c-ea33-41c9-84f8-1d998d4ee803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:54:14 crc kubenswrapper[4904]: I1205 20:54:14.579253 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/712f269c-ea33-41c9-84f8-1d998d4ee803-kube-api-access-xklnc" (OuterVolumeSpecName: "kube-api-access-xklnc") pod "712f269c-ea33-41c9-84f8-1d998d4ee803" (UID: "712f269c-ea33-41c9-84f8-1d998d4ee803"). InnerVolumeSpecName "kube-api-access-xklnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:54:14 crc kubenswrapper[4904]: I1205 20:54:14.607994 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/712f269c-ea33-41c9-84f8-1d998d4ee803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "712f269c-ea33-41c9-84f8-1d998d4ee803" (UID: "712f269c-ea33-41c9-84f8-1d998d4ee803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:54:14 crc kubenswrapper[4904]: I1205 20:54:14.675629 4904 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/712f269c-ea33-41c9-84f8-1d998d4ee803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:54:14 crc kubenswrapper[4904]: I1205 20:54:14.675673 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xklnc\" (UniqueName: \"kubernetes.io/projected/712f269c-ea33-41c9-84f8-1d998d4ee803-kube-api-access-xklnc\") on node \"crc\" DevicePath \"\"" Dec 05 20:54:14 crc kubenswrapper[4904]: I1205 20:54:14.675689 4904 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/712f269c-ea33-41c9-84f8-1d998d4ee803-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:54:14 crc kubenswrapper[4904]: I1205 20:54:14.781982 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s4hsh"] Dec 05 20:54:14 crc kubenswrapper[4904]: I1205 20:54:14.936708 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s4hsh" event={"ID":"17758ae4-e064-49a9-99a2-e374e8c3ecae","Type":"ContainerStarted","Data":"3b29ffa20c3b2f8b42bb384fb1a636df2d4b87ec3a0a63b152abda12bb9ca9d4"} Dec 05 20:54:14 crc kubenswrapper[4904]: I1205 20:54:14.938785 4904 generic.go:334] "Generic (PLEG): container finished" podID="712f269c-ea33-41c9-84f8-1d998d4ee803" containerID="dc183fea06e0bc15bc1fc530f0fd57396ddc805e52a9cf33e90b6d9fbbf1afa5" exitCode=0 Dec 05 20:54:14 crc kubenswrapper[4904]: I1205 20:54:14.938803 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p8xrz" event={"ID":"712f269c-ea33-41c9-84f8-1d998d4ee803","Type":"ContainerDied","Data":"dc183fea06e0bc15bc1fc530f0fd57396ddc805e52a9cf33e90b6d9fbbf1afa5"} Dec 05 20:54:14 crc kubenswrapper[4904]: I1205 20:54:14.938826 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p8xrz" event={"ID":"712f269c-ea33-41c9-84f8-1d998d4ee803","Type":"ContainerDied","Data":"80831d979d10266fc05b327e18998ff6bff9c4f9f904736925709259bc2f4992"} Dec 05 20:54:14 crc kubenswrapper[4904]: I1205 20:54:14.938843 4904 scope.go:117] "RemoveContainer" containerID="dc183fea06e0bc15bc1fc530f0fd57396ddc805e52a9cf33e90b6d9fbbf1afa5" Dec 05 20:54:14 crc kubenswrapper[4904]: I1205 20:54:14.938879 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p8xrz" Dec 05 20:54:14 crc kubenswrapper[4904]: I1205 20:54:14.970784 4904 scope.go:117] "RemoveContainer" containerID="69bb0eda8492968f994fd457a924e85cb655c5f07f78fafb64fa72dfd3a913a3" Dec 05 20:54:14 crc kubenswrapper[4904]: I1205 20:54:14.982410 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p8xrz"] Dec 05 20:54:14 crc kubenswrapper[4904]: I1205 20:54:14.993295 4904 scope.go:117] "RemoveContainer" containerID="0ab9ca8e14670ddcdb549d83910ec94a80e98725d4e6b460510afd20ff41a863" Dec 05 20:54:14 crc kubenswrapper[4904]: I1205 20:54:14.999524 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-p8xrz"] Dec 05 20:54:15 crc kubenswrapper[4904]: I1205 20:54:15.017268 4904 scope.go:117] "RemoveContainer" containerID="dc183fea06e0bc15bc1fc530f0fd57396ddc805e52a9cf33e90b6d9fbbf1afa5" Dec 05 20:54:15 crc kubenswrapper[4904]: E1205 20:54:15.017805 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc183fea06e0bc15bc1fc530f0fd57396ddc805e52a9cf33e90b6d9fbbf1afa5\": container with ID starting with dc183fea06e0bc15bc1fc530f0fd57396ddc805e52a9cf33e90b6d9fbbf1afa5 not found: ID does not exist" containerID="dc183fea06e0bc15bc1fc530f0fd57396ddc805e52a9cf33e90b6d9fbbf1afa5" Dec 05 20:54:15 crc kubenswrapper[4904]: I1205 20:54:15.017843 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc183fea06e0bc15bc1fc530f0fd57396ddc805e52a9cf33e90b6d9fbbf1afa5"} err="failed to get container status \"dc183fea06e0bc15bc1fc530f0fd57396ddc805e52a9cf33e90b6d9fbbf1afa5\": rpc error: code = NotFound desc = could not find container \"dc183fea06e0bc15bc1fc530f0fd57396ddc805e52a9cf33e90b6d9fbbf1afa5\": container with ID starting with dc183fea06e0bc15bc1fc530f0fd57396ddc805e52a9cf33e90b6d9fbbf1afa5 not found: ID does not exist" Dec 05 20:54:15 crc kubenswrapper[4904]: I1205 20:54:15.017868 4904 scope.go:117] "RemoveContainer" containerID="69bb0eda8492968f994fd457a924e85cb655c5f07f78fafb64fa72dfd3a913a3" Dec 05 20:54:15 crc kubenswrapper[4904]: E1205 20:54:15.018233 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69bb0eda8492968f994fd457a924e85cb655c5f07f78fafb64fa72dfd3a913a3\": container with ID starting with 69bb0eda8492968f994fd457a924e85cb655c5f07f78fafb64fa72dfd3a913a3 not found: ID does not exist" containerID="69bb0eda8492968f994fd457a924e85cb655c5f07f78fafb64fa72dfd3a913a3" Dec 05 20:54:15 crc kubenswrapper[4904]: I1205 20:54:15.018253 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69bb0eda8492968f994fd457a924e85cb655c5f07f78fafb64fa72dfd3a913a3"} err="failed to get container status \"69bb0eda8492968f994fd457a924e85cb655c5f07f78fafb64fa72dfd3a913a3\": rpc error: code = NotFound desc = could not find container \"69bb0eda8492968f994fd457a924e85cb655c5f07f78fafb64fa72dfd3a913a3\": container with ID starting with 69bb0eda8492968f994fd457a924e85cb655c5f07f78fafb64fa72dfd3a913a3 not found: ID does not exist" Dec 05 20:54:15 crc kubenswrapper[4904]: I1205 20:54:15.018266 4904 scope.go:117] "RemoveContainer" containerID="0ab9ca8e14670ddcdb549d83910ec94a80e98725d4e6b460510afd20ff41a863" Dec 05 20:54:15 crc kubenswrapper[4904]: E1205 20:54:15.018567 4904 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"0ab9ca8e14670ddcdb549d83910ec94a80e98725d4e6b460510afd20ff41a863\": container with ID starting with 0ab9ca8e14670ddcdb549d83910ec94a80e98725d4e6b460510afd20ff41a863 not found: ID does not exist" containerID="0ab9ca8e14670ddcdb549d83910ec94a80e98725d4e6b460510afd20ff41a863" Dec 05 20:54:15 crc kubenswrapper[4904]: I1205 20:54:15.018607 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ab9ca8e14670ddcdb549d83910ec94a80e98725d4e6b460510afd20ff41a863"} err="failed to get container status \"0ab9ca8e14670ddcdb549d83910ec94a80e98725d4e6b460510afd20ff41a863\": rpc error: code = NotFound desc = could not find container \"0ab9ca8e14670ddcdb549d83910ec94a80e98725d4e6b460510afd20ff41a863\": container with ID starting with 0ab9ca8e14670ddcdb549d83910ec94a80e98725d4e6b460510afd20ff41a863 not found: ID does not exist" Dec 05 20:54:15 crc kubenswrapper[4904]: I1205 20:54:15.694503 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="712f269c-ea33-41c9-84f8-1d998d4ee803" path="/var/lib/kubelet/pods/712f269c-ea33-41c9-84f8-1d998d4ee803/volumes" Dec 05 20:54:15 crc kubenswrapper[4904]: I1205 20:54:15.959541 4904 generic.go:334] "Generic (PLEG): container finished" podID="17758ae4-e064-49a9-99a2-e374e8c3ecae" containerID="dba9eb62a33acdc84445f9794287e7958e78a573192d06eb2fb2f921cebc0c72" exitCode=0 Dec 05 20:54:15 crc kubenswrapper[4904]: I1205 20:54:15.959719 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s4hsh" event={"ID":"17758ae4-e064-49a9-99a2-e374e8c3ecae","Type":"ContainerDied","Data":"dba9eb62a33acdc84445f9794287e7958e78a573192d06eb2fb2f921cebc0c72"} Dec 05 20:54:16 crc kubenswrapper[4904]: I1205 20:54:16.682011 4904 scope.go:117] "RemoveContainer" containerID="3ff9529892815f6974fa95e3944e54016af616270f5e88c8ba81e4e82b52d881" Dec 05 20:54:16 crc kubenswrapper[4904]: E1205 20:54:16.682532 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:54:17 crc kubenswrapper[4904]: I1205 20:54:17.981154 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s4hsh" event={"ID":"17758ae4-e064-49a9-99a2-e374e8c3ecae","Type":"ContainerStarted","Data":"5dc3f89f56fe46af1abb4338a621a838f57efdf9aedf7ebef6cc8875e0d3de86"} Dec 05 20:54:18 crc kubenswrapper[4904]: I1205 20:54:18.997348 4904 generic.go:334] "Generic (PLEG): container finished" podID="17758ae4-e064-49a9-99a2-e374e8c3ecae" containerID="5dc3f89f56fe46af1abb4338a621a838f57efdf9aedf7ebef6cc8875e0d3de86" exitCode=0 Dec 05 20:54:18 crc kubenswrapper[4904]: I1205 20:54:18.997391 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s4hsh" event={"ID":"17758ae4-e064-49a9-99a2-e374e8c3ecae","Type":"ContainerDied","Data":"5dc3f89f56fe46af1abb4338a621a838f57efdf9aedf7ebef6cc8875e0d3de86"} Dec 05 20:54:20 crc kubenswrapper[4904]: I1205 20:54:20.009647 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s4hsh" 
event={"ID":"17758ae4-e064-49a9-99a2-e374e8c3ecae","Type":"ContainerStarted","Data":"e244874d2c9525164f2e503073fdcfe37afd68ccfc43717032d9423b15761d1f"} Dec 05 20:54:20 crc kubenswrapper[4904]: I1205 20:54:20.039417 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s4hsh" podStartSLOduration=3.574226207 podStartE2EDuration="7.039397937s" podCreationTimestamp="2025-12-05 20:54:13 +0000 UTC" firstStartedPulling="2025-12-05 20:54:15.962175884 +0000 UTC m=+2554.773391993" lastFinishedPulling="2025-12-05 20:54:19.427347594 +0000 UTC m=+2558.238563723" observedRunningTime="2025-12-05 20:54:20.032392926 +0000 UTC m=+2558.843609035" watchObservedRunningTime="2025-12-05 20:54:20.039397937 +0000 UTC m=+2558.850614046" Dec 05 20:54:24 crc kubenswrapper[4904]: I1205 20:54:24.246793 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s4hsh" Dec 05 20:54:24 crc kubenswrapper[4904]: I1205 20:54:24.247099 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s4hsh" Dec 05 20:54:24 crc kubenswrapper[4904]: I1205 20:54:24.297441 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-s4hsh" Dec 05 20:54:25 crc kubenswrapper[4904]: I1205 20:54:25.106658 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s4hsh" Dec 05 20:54:25 crc kubenswrapper[4904]: I1205 20:54:25.182282 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s4hsh"] Dec 05 20:54:27 crc kubenswrapper[4904]: I1205 20:54:27.069346 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-s4hsh" podUID="17758ae4-e064-49a9-99a2-e374e8c3ecae" containerName="registry-server" containerID="cri-o://e244874d2c9525164f2e503073fdcfe37afd68ccfc43717032d9423b15761d1f" gracePeriod=2 Dec 05 20:54:27 crc kubenswrapper[4904]: I1205 20:54:27.548813 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s4hsh" Dec 05 20:54:27 crc kubenswrapper[4904]: I1205 20:54:27.604570 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17758ae4-e064-49a9-99a2-e374e8c3ecae-utilities\") pod \"17758ae4-e064-49a9-99a2-e374e8c3ecae\" (UID: \"17758ae4-e064-49a9-99a2-e374e8c3ecae\") " Dec 05 20:54:27 crc kubenswrapper[4904]: I1205 20:54:27.604628 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17758ae4-e064-49a9-99a2-e374e8c3ecae-catalog-content\") pod \"17758ae4-e064-49a9-99a2-e374e8c3ecae\" (UID: \"17758ae4-e064-49a9-99a2-e374e8c3ecae\") " Dec 05 20:54:27 crc kubenswrapper[4904]: I1205 20:54:27.604834 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwl9c\" (UniqueName: \"kubernetes.io/projected/17758ae4-e064-49a9-99a2-e374e8c3ecae-kube-api-access-mwl9c\") pod \"17758ae4-e064-49a9-99a2-e374e8c3ecae\" (UID: \"17758ae4-e064-49a9-99a2-e374e8c3ecae\") " Dec 05 20:54:27 crc kubenswrapper[4904]: I1205 20:54:27.606787 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17758ae4-e064-49a9-99a2-e374e8c3ecae-utilities" (OuterVolumeSpecName: "utilities") pod "17758ae4-e064-49a9-99a2-e374e8c3ecae" (UID: "17758ae4-e064-49a9-99a2-e374e8c3ecae"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:54:27 crc kubenswrapper[4904]: I1205 20:54:27.610728 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17758ae4-e064-49a9-99a2-e374e8c3ecae-kube-api-access-mwl9c" (OuterVolumeSpecName: "kube-api-access-mwl9c") pod "17758ae4-e064-49a9-99a2-e374e8c3ecae" (UID: "17758ae4-e064-49a9-99a2-e374e8c3ecae"). InnerVolumeSpecName "kube-api-access-mwl9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:54:27 crc kubenswrapper[4904]: I1205 20:54:27.658472 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17758ae4-e064-49a9-99a2-e374e8c3ecae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "17758ae4-e064-49a9-99a2-e374e8c3ecae" (UID: "17758ae4-e064-49a9-99a2-e374e8c3ecae"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:54:27 crc kubenswrapper[4904]: I1205 20:54:27.706922 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwl9c\" (UniqueName: \"kubernetes.io/projected/17758ae4-e064-49a9-99a2-e374e8c3ecae-kube-api-access-mwl9c\") on node \"crc\" DevicePath \"\"" Dec 05 20:54:27 crc kubenswrapper[4904]: I1205 20:54:27.706950 4904 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17758ae4-e064-49a9-99a2-e374e8c3ecae-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:54:27 crc kubenswrapper[4904]: I1205 20:54:27.706960 4904 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17758ae4-e064-49a9-99a2-e374e8c3ecae-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:54:28 crc kubenswrapper[4904]: I1205 20:54:28.084769 4904 generic.go:334] "Generic (PLEG): container finished" podID="17758ae4-e064-49a9-99a2-e374e8c3ecae" containerID="e244874d2c9525164f2e503073fdcfe37afd68ccfc43717032d9423b15761d1f" exitCode=0 Dec 05 20:54:28 crc kubenswrapper[4904]: I1205 20:54:28.084816 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s4hsh" event={"ID":"17758ae4-e064-49a9-99a2-e374e8c3ecae","Type":"ContainerDied","Data":"e244874d2c9525164f2e503073fdcfe37afd68ccfc43717032d9423b15761d1f"} Dec 05 20:54:28 crc kubenswrapper[4904]: I1205 20:54:28.084832 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s4hsh" Dec 05 20:54:28 crc kubenswrapper[4904]: I1205 20:54:28.084849 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s4hsh" event={"ID":"17758ae4-e064-49a9-99a2-e374e8c3ecae","Type":"ContainerDied","Data":"3b29ffa20c3b2f8b42bb384fb1a636df2d4b87ec3a0a63b152abda12bb9ca9d4"} Dec 05 20:54:28 crc kubenswrapper[4904]: I1205 20:54:28.084870 4904 scope.go:117] "RemoveContainer" containerID="e244874d2c9525164f2e503073fdcfe37afd68ccfc43717032d9423b15761d1f" Dec 05 20:54:28 crc kubenswrapper[4904]: I1205 20:54:28.117757 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s4hsh"] Dec 05 20:54:28 crc kubenswrapper[4904]: I1205 20:54:28.120305 4904 scope.go:117] "RemoveContainer" containerID="5dc3f89f56fe46af1abb4338a621a838f57efdf9aedf7ebef6cc8875e0d3de86" Dec 05 20:54:28 crc kubenswrapper[4904]: I1205 20:54:28.138019 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-s4hsh"] Dec 05 20:54:28 crc kubenswrapper[4904]: I1205 20:54:28.146603 4904 scope.go:117] "RemoveContainer" containerID="dba9eb62a33acdc84445f9794287e7958e78a573192d06eb2fb2f921cebc0c72" Dec 05 20:54:28 crc kubenswrapper[4904]: I1205 20:54:28.188763 4904 scope.go:117] "RemoveContainer" containerID="e244874d2c9525164f2e503073fdcfe37afd68ccfc43717032d9423b15761d1f" Dec 05 20:54:28 crc kubenswrapper[4904]: E1205 20:54:28.189087 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e244874d2c9525164f2e503073fdcfe37afd68ccfc43717032d9423b15761d1f\": container with ID starting with e244874d2c9525164f2e503073fdcfe37afd68ccfc43717032d9423b15761d1f not found: ID does not exist" containerID="e244874d2c9525164f2e503073fdcfe37afd68ccfc43717032d9423b15761d1f" Dec 05 20:54:28 crc kubenswrapper[4904]: I1205 20:54:28.189119 
4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e244874d2c9525164f2e503073fdcfe37afd68ccfc43717032d9423b15761d1f"} err="failed to get container status \"e244874d2c9525164f2e503073fdcfe37afd68ccfc43717032d9423b15761d1f\": rpc error: code = NotFound desc = could not find container \"e244874d2c9525164f2e503073fdcfe37afd68ccfc43717032d9423b15761d1f\": container with ID starting with e244874d2c9525164f2e503073fdcfe37afd68ccfc43717032d9423b15761d1f not found: ID does not exist" Dec 05 20:54:28 crc kubenswrapper[4904]: I1205 20:54:28.189138 4904 scope.go:117] "RemoveContainer" containerID="5dc3f89f56fe46af1abb4338a621a838f57efdf9aedf7ebef6cc8875e0d3de86" Dec 05 20:54:28 crc kubenswrapper[4904]: E1205 20:54:28.189318 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dc3f89f56fe46af1abb4338a621a838f57efdf9aedf7ebef6cc8875e0d3de86\": container with ID starting with 5dc3f89f56fe46af1abb4338a621a838f57efdf9aedf7ebef6cc8875e0d3de86 not found: ID does not exist" containerID="5dc3f89f56fe46af1abb4338a621a838f57efdf9aedf7ebef6cc8875e0d3de86" Dec 05 20:54:28 crc kubenswrapper[4904]: I1205 20:54:28.189337 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dc3f89f56fe46af1abb4338a621a838f57efdf9aedf7ebef6cc8875e0d3de86"} err="failed to get container status \"5dc3f89f56fe46af1abb4338a621a838f57efdf9aedf7ebef6cc8875e0d3de86\": rpc error: code = NotFound desc = could not find container \"5dc3f89f56fe46af1abb4338a621a838f57efdf9aedf7ebef6cc8875e0d3de86\": container with ID starting with 5dc3f89f56fe46af1abb4338a621a838f57efdf9aedf7ebef6cc8875e0d3de86 not found: ID does not exist" Dec 05 20:54:28 crc kubenswrapper[4904]: I1205 20:54:28.189349 4904 scope.go:117] "RemoveContainer" containerID="dba9eb62a33acdc84445f9794287e7958e78a573192d06eb2fb2f921cebc0c72" Dec 05 20:54:28 crc kubenswrapper[4904]: E1205 20:54:28.189494 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dba9eb62a33acdc84445f9794287e7958e78a573192d06eb2fb2f921cebc0c72\": container with ID starting with dba9eb62a33acdc84445f9794287e7958e78a573192d06eb2fb2f921cebc0c72 not found: ID does not exist" containerID="dba9eb62a33acdc84445f9794287e7958e78a573192d06eb2fb2f921cebc0c72" Dec 05 20:54:28 crc kubenswrapper[4904]: I1205 20:54:28.189513 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dba9eb62a33acdc84445f9794287e7958e78a573192d06eb2fb2f921cebc0c72"} err="failed to get container status \"dba9eb62a33acdc84445f9794287e7958e78a573192d06eb2fb2f921cebc0c72\": rpc error: code = NotFound desc = could not find container \"dba9eb62a33acdc84445f9794287e7958e78a573192d06eb2fb2f921cebc0c72\": container with ID starting with dba9eb62a33acdc84445f9794287e7958e78a573192d06eb2fb2f921cebc0c72 not found: ID does not exist" Dec 05 20:54:29 crc kubenswrapper[4904]: I1205 20:54:29.683134 4904 scope.go:117] "RemoveContainer" containerID="3ff9529892815f6974fa95e3944e54016af616270f5e88c8ba81e4e82b52d881" Dec 05 20:54:29 crc kubenswrapper[4904]: E1205 20:54:29.683538 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:54:29 crc kubenswrapper[4904]: I1205 20:54:29.695850 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17758ae4-e064-49a9-99a2-e374e8c3ecae" path="/var/lib/kubelet/pods/17758ae4-e064-49a9-99a2-e374e8c3ecae/volumes" Dec 05 20:54:40 crc kubenswrapper[4904]: I1205 20:54:40.216499 4904 generic.go:334] "Generic (PLEG): container finished" podID="11479a0b-d4c6-4770-bdf1-dbb8a417384d" containerID="4be5d33997d22dbadb8341100525fe1d9962d44704e93cce783007d63640d6c8" exitCode=0 Dec 05 20:54:40 crc kubenswrapper[4904]: I1205 20:54:40.217291 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5548g" event={"ID":"11479a0b-d4c6-4770-bdf1-dbb8a417384d","Type":"ContainerDied","Data":"4be5d33997d22dbadb8341100525fe1d9962d44704e93cce783007d63640d6c8"} Dec 05 20:54:41 crc kubenswrapper[4904]: I1205 20:54:41.646940 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5548g" Dec 05 20:54:41 crc kubenswrapper[4904]: I1205 20:54:41.679435 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vltxg\" (UniqueName: \"kubernetes.io/projected/11479a0b-d4c6-4770-bdf1-dbb8a417384d-kube-api-access-vltxg\") pod \"11479a0b-d4c6-4770-bdf1-dbb8a417384d\" (UID: \"11479a0b-d4c6-4770-bdf1-dbb8a417384d\") " Dec 05 20:54:41 crc kubenswrapper[4904]: I1205 20:54:41.679520 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/11479a0b-d4c6-4770-bdf1-dbb8a417384d-libvirt-secret-0\") pod \"11479a0b-d4c6-4770-bdf1-dbb8a417384d\" (UID: \"11479a0b-d4c6-4770-bdf1-dbb8a417384d\") " Dec 05 20:54:41 crc kubenswrapper[4904]: I1205 20:54:41.679547 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11479a0b-d4c6-4770-bdf1-dbb8a417384d-inventory\") pod \"11479a0b-d4c6-4770-bdf1-dbb8a417384d\" (UID: \"11479a0b-d4c6-4770-bdf1-dbb8a417384d\") " Dec 05 20:54:41 crc kubenswrapper[4904]: I1205 20:54:41.679634 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11479a0b-d4c6-4770-bdf1-dbb8a417384d-libvirt-combined-ca-bundle\") pod \"11479a0b-d4c6-4770-bdf1-dbb8a417384d\" (UID: \"11479a0b-d4c6-4770-bdf1-dbb8a417384d\") " Dec 05 20:54:41 crc kubenswrapper[4904]: I1205 20:54:41.679768 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/11479a0b-d4c6-4770-bdf1-dbb8a417384d-ssh-key\") pod \"11479a0b-d4c6-4770-bdf1-dbb8a417384d\" (UID: \"11479a0b-d4c6-4770-bdf1-dbb8a417384d\") " Dec 05 20:54:41 crc kubenswrapper[4904]: I1205 20:54:41.720034 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11479a0b-d4c6-4770-bdf1-dbb8a417384d-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "11479a0b-d4c6-4770-bdf1-dbb8a417384d" (UID: "11479a0b-d4c6-4770-bdf1-dbb8a417384d"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:54:41 crc kubenswrapper[4904]: I1205 20:54:41.720374 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11479a0b-d4c6-4770-bdf1-dbb8a417384d-kube-api-access-vltxg" (OuterVolumeSpecName: "kube-api-access-vltxg") pod "11479a0b-d4c6-4770-bdf1-dbb8a417384d" (UID: "11479a0b-d4c6-4770-bdf1-dbb8a417384d"). InnerVolumeSpecName "kube-api-access-vltxg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:54:41 crc kubenswrapper[4904]: I1205 20:54:41.742902 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11479a0b-d4c6-4770-bdf1-dbb8a417384d-inventory" (OuterVolumeSpecName: "inventory") pod "11479a0b-d4c6-4770-bdf1-dbb8a417384d" (UID: "11479a0b-d4c6-4770-bdf1-dbb8a417384d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:54:41 crc kubenswrapper[4904]: I1205 20:54:41.755390 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11479a0b-d4c6-4770-bdf1-dbb8a417384d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "11479a0b-d4c6-4770-bdf1-dbb8a417384d" (UID: "11479a0b-d4c6-4770-bdf1-dbb8a417384d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:54:41 crc kubenswrapper[4904]: I1205 20:54:41.757963 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11479a0b-d4c6-4770-bdf1-dbb8a417384d-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "11479a0b-d4c6-4770-bdf1-dbb8a417384d" (UID: "11479a0b-d4c6-4770-bdf1-dbb8a417384d"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:54:41 crc kubenswrapper[4904]: I1205 20:54:41.782360 4904 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/11479a0b-d4c6-4770-bdf1-dbb8a417384d-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 20:54:41 crc kubenswrapper[4904]: I1205 20:54:41.782401 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vltxg\" (UniqueName: \"kubernetes.io/projected/11479a0b-d4c6-4770-bdf1-dbb8a417384d-kube-api-access-vltxg\") on node \"crc\" DevicePath \"\"" Dec 05 20:54:41 crc kubenswrapper[4904]: I1205 20:54:41.782418 4904 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/11479a0b-d4c6-4770-bdf1-dbb8a417384d-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 05 20:54:41 crc kubenswrapper[4904]: I1205 20:54:41.782429 4904 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11479a0b-d4c6-4770-bdf1-dbb8a417384d-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 20:54:41 crc kubenswrapper[4904]: I1205 20:54:41.782440 4904 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11479a0b-d4c6-4770-bdf1-dbb8a417384d-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.242610 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5548g" event={"ID":"11479a0b-d4c6-4770-bdf1-dbb8a417384d","Type":"ContainerDied","Data":"95d40f408d0a3ee2dcf5d86ddc03e7917d1d4cf7c795adafe890d2edf811ad7f"} Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.242680 4904 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95d40f408d0a3ee2dcf5d86ddc03e7917d1d4cf7c795adafe890d2edf811ad7f" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.242708 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5548g" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.350020 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-6jljb"] Dec 05 20:54:42 crc kubenswrapper[4904]: E1205 20:54:42.350441 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11479a0b-d4c6-4770-bdf1-dbb8a417384d" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.350460 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="11479a0b-d4c6-4770-bdf1-dbb8a417384d" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 05 20:54:42 crc kubenswrapper[4904]: E1205 20:54:42.350474 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17758ae4-e064-49a9-99a2-e374e8c3ecae" containerName="extract-content" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.350481 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="17758ae4-e064-49a9-99a2-e374e8c3ecae" containerName="extract-content" Dec 05 20:54:42 crc kubenswrapper[4904]: E1205 20:54:42.350501 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="712f269c-ea33-41c9-84f8-1d998d4ee803" containerName="extract-content" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.350507 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="712f269c-ea33-41c9-84f8-1d998d4ee803" containerName="extract-content" Dec 05 20:54:42 crc kubenswrapper[4904]: E1205 20:54:42.350517 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="712f269c-ea33-41c9-84f8-1d998d4ee803" containerName="registry-server" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.350523 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="712f269c-ea33-41c9-84f8-1d998d4ee803" containerName="registry-server" Dec 05 20:54:42 crc kubenswrapper[4904]: E1205 20:54:42.350534 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17758ae4-e064-49a9-99a2-e374e8c3ecae" containerName="extract-utilities" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.350540 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="17758ae4-e064-49a9-99a2-e374e8c3ecae" containerName="extract-utilities" Dec 05 20:54:42 crc kubenswrapper[4904]: E1205 20:54:42.350547 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17758ae4-e064-49a9-99a2-e374e8c3ecae" containerName="registry-server" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.350553 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="17758ae4-e064-49a9-99a2-e374e8c3ecae" containerName="registry-server" Dec 05 20:54:42 crc kubenswrapper[4904]: E1205 20:54:42.350569 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="712f269c-ea33-41c9-84f8-1d998d4ee803" containerName="extract-utilities" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.350576 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="712f269c-ea33-41c9-84f8-1d998d4ee803" containerName="extract-utilities" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.350759 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="712f269c-ea33-41c9-84f8-1d998d4ee803" 
containerName="registry-server" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.350784 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="11479a0b-d4c6-4770-bdf1-dbb8a417384d" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.350800 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="17758ae4-e064-49a9-99a2-e374e8c3ecae" containerName="registry-server" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.352526 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6jljb" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.356656 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.356686 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.357312 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.358624 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pf9pw" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.358663 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.358759 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.359646 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.360135 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-6jljb"] Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.397155 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d9ee9825-e991-495c-bbd8-30ae1e7b0780-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6jljb\" (UID: \"d9ee9825-e991-495c-bbd8-30ae1e7b0780\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6jljb" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.397218 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d9ee9825-e991-495c-bbd8-30ae1e7b0780-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6jljb\" (UID: \"d9ee9825-e991-495c-bbd8-30ae1e7b0780\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6jljb" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.397254 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d9ee9825-e991-495c-bbd8-30ae1e7b0780-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6jljb\" (UID: \"d9ee9825-e991-495c-bbd8-30ae1e7b0780\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6jljb" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.397309 4904 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d9ee9825-e991-495c-bbd8-30ae1e7b0780-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6jljb\" (UID: \"d9ee9825-e991-495c-bbd8-30ae1e7b0780\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6jljb" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.397371 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdfsm\" (UniqueName: \"kubernetes.io/projected/d9ee9825-e991-495c-bbd8-30ae1e7b0780-kube-api-access-kdfsm\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6jljb\" (UID: \"d9ee9825-e991-495c-bbd8-30ae1e7b0780\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6jljb" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.397403 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d9ee9825-e991-495c-bbd8-30ae1e7b0780-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6jljb\" (UID: \"d9ee9825-e991-495c-bbd8-30ae1e7b0780\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6jljb" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.397507 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d9ee9825-e991-495c-bbd8-30ae1e7b0780-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6jljb\" (UID: \"d9ee9825-e991-495c-bbd8-30ae1e7b0780\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6jljb" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.397531 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d9ee9825-e991-495c-bbd8-30ae1e7b0780-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6jljb\" (UID: \"d9ee9825-e991-495c-bbd8-30ae1e7b0780\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6jljb" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.397569 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9ee9825-e991-495c-bbd8-30ae1e7b0780-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6jljb\" (UID: \"d9ee9825-e991-495c-bbd8-30ae1e7b0780\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6jljb" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.499539 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d9ee9825-e991-495c-bbd8-30ae1e7b0780-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6jljb\" (UID: \"d9ee9825-e991-495c-bbd8-30ae1e7b0780\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6jljb" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.499595 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d9ee9825-e991-495c-bbd8-30ae1e7b0780-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6jljb\" (UID: \"d9ee9825-e991-495c-bbd8-30ae1e7b0780\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6jljb" Dec 05 20:54:42 crc 
kubenswrapper[4904]: I1205 20:54:42.499638 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9ee9825-e991-495c-bbd8-30ae1e7b0780-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6jljb\" (UID: \"d9ee9825-e991-495c-bbd8-30ae1e7b0780\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6jljb" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.499706 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d9ee9825-e991-495c-bbd8-30ae1e7b0780-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6jljb\" (UID: \"d9ee9825-e991-495c-bbd8-30ae1e7b0780\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6jljb" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.499742 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d9ee9825-e991-495c-bbd8-30ae1e7b0780-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6jljb\" (UID: \"d9ee9825-e991-495c-bbd8-30ae1e7b0780\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6jljb" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.499769 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d9ee9825-e991-495c-bbd8-30ae1e7b0780-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6jljb\" (UID: \"d9ee9825-e991-495c-bbd8-30ae1e7b0780\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6jljb" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.499812 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d9ee9825-e991-495c-bbd8-30ae1e7b0780-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6jljb\" (UID: \"d9ee9825-e991-495c-bbd8-30ae1e7b0780\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6jljb" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.499858 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdfsm\" (UniqueName: \"kubernetes.io/projected/d9ee9825-e991-495c-bbd8-30ae1e7b0780-kube-api-access-kdfsm\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6jljb\" (UID: \"d9ee9825-e991-495c-bbd8-30ae1e7b0780\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6jljb" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.499888 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d9ee9825-e991-495c-bbd8-30ae1e7b0780-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6jljb\" (UID: \"d9ee9825-e991-495c-bbd8-30ae1e7b0780\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6jljb" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.500886 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d9ee9825-e991-495c-bbd8-30ae1e7b0780-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6jljb\" (UID: \"d9ee9825-e991-495c-bbd8-30ae1e7b0780\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6jljb" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.504184 4904 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d9ee9825-e991-495c-bbd8-30ae1e7b0780-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6jljb\" (UID: \"d9ee9825-e991-495c-bbd8-30ae1e7b0780\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6jljb" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.504265 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d9ee9825-e991-495c-bbd8-30ae1e7b0780-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6jljb\" (UID: \"d9ee9825-e991-495c-bbd8-30ae1e7b0780\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6jljb" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.505049 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d9ee9825-e991-495c-bbd8-30ae1e7b0780-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6jljb\" (UID: \"d9ee9825-e991-495c-bbd8-30ae1e7b0780\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6jljb" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.507998 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9ee9825-e991-495c-bbd8-30ae1e7b0780-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6jljb\" (UID: \"d9ee9825-e991-495c-bbd8-30ae1e7b0780\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6jljb" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.508774 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d9ee9825-e991-495c-bbd8-30ae1e7b0780-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6jljb\" (UID: \"d9ee9825-e991-495c-bbd8-30ae1e7b0780\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6jljb" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.510316 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d9ee9825-e991-495c-bbd8-30ae1e7b0780-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6jljb\" (UID: \"d9ee9825-e991-495c-bbd8-30ae1e7b0780\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6jljb" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.517235 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d9ee9825-e991-495c-bbd8-30ae1e7b0780-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6jljb\" (UID: \"d9ee9825-e991-495c-bbd8-30ae1e7b0780\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6jljb" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.521522 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdfsm\" (UniqueName: \"kubernetes.io/projected/d9ee9825-e991-495c-bbd8-30ae1e7b0780-kube-api-access-kdfsm\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6jljb\" (UID: \"d9ee9825-e991-495c-bbd8-30ae1e7b0780\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6jljb" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.674264 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6jljb" Dec 05 20:54:42 crc kubenswrapper[4904]: I1205 20:54:42.680972 4904 scope.go:117] "RemoveContainer" containerID="3ff9529892815f6974fa95e3944e54016af616270f5e88c8ba81e4e82b52d881" Dec 05 20:54:42 crc kubenswrapper[4904]: E1205 20:54:42.681248 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:54:43 crc kubenswrapper[4904]: I1205 20:54:43.250460 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-6jljb"] Dec 05 20:54:44 crc kubenswrapper[4904]: I1205 20:54:44.264477 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6jljb" event={"ID":"d9ee9825-e991-495c-bbd8-30ae1e7b0780","Type":"ContainerStarted","Data":"a21fe932805e4f37e6123125133228e2f097603c69596260dfc918b77c45e3ed"} Dec 05 20:54:44 crc kubenswrapper[4904]: I1205 20:54:44.265019 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6jljb" event={"ID":"d9ee9825-e991-495c-bbd8-30ae1e7b0780","Type":"ContainerStarted","Data":"b99870a7f8e71cf7dadd49cb2341bf55e1e38e8c7d179631d8b65b9a1467b563"} Dec 05 20:54:44 crc kubenswrapper[4904]: I1205 20:54:44.292337 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6jljb" podStartSLOduration=1.8154448680000002 podStartE2EDuration="2.292301507s" podCreationTimestamp="2025-12-05 20:54:42 +0000 UTC" firstStartedPulling="2025-12-05 20:54:43.255487098 +0000 UTC m=+2582.066703207" lastFinishedPulling="2025-12-05 20:54:43.732343737 +0000 UTC m=+2582.543559846" observedRunningTime="2025-12-05 20:54:44.283545638 +0000 UTC m=+2583.094761777" watchObservedRunningTime="2025-12-05 20:54:44.292301507 +0000 UTC m=+2583.103517616" Dec 05 20:54:54 crc kubenswrapper[4904]: I1205 20:54:54.681642 4904 scope.go:117] "RemoveContainer" containerID="3ff9529892815f6974fa95e3944e54016af616270f5e88c8ba81e4e82b52d881" Dec 05 20:54:54 crc kubenswrapper[4904]: E1205 20:54:54.682392 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:55:09 crc kubenswrapper[4904]: I1205 20:55:09.681396 4904 scope.go:117] "RemoveContainer" containerID="3ff9529892815f6974fa95e3944e54016af616270f5e88c8ba81e4e82b52d881" Dec 05 20:55:09 crc kubenswrapper[4904]: E1205 20:55:09.682121 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:55:19 crc kubenswrapper[4904]: I1205 20:55:19.881221 4904 scope.go:117] "RemoveContainer" containerID="f3c6176c475b35c1a89f4731e726c54ee27ea483ea37a147d963d3e72f45446e" Dec 05 20:55:19 crc kubenswrapper[4904]: I1205 20:55:19.914607 4904 scope.go:117] "RemoveContainer" containerID="8ea1e2452f9fef0584898ad4783e67f53fbfdf6e779e1dc3c629880d1655a136" Dec 05 20:55:19 crc kubenswrapper[4904]: I1205 20:55:19.960116 4904 scope.go:117] "RemoveContainer" containerID="46627ceb535cd7523b84ad827915b0feff2570d96a7fe4c633b4669a061c42d2" Dec 05 20:55:23 crc kubenswrapper[4904]: I1205 20:55:23.681644 4904 scope.go:117] "RemoveContainer" containerID="3ff9529892815f6974fa95e3944e54016af616270f5e88c8ba81e4e82b52d881" Dec 05 20:55:23 crc kubenswrapper[4904]: E1205 20:55:23.682466 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:55:38 crc kubenswrapper[4904]: I1205 20:55:38.681801 4904 scope.go:117] "RemoveContainer" containerID="3ff9529892815f6974fa95e3944e54016af616270f5e88c8ba81e4e82b52d881" Dec 05 20:55:38 crc kubenswrapper[4904]: E1205 20:55:38.683453 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:55:52 crc kubenswrapper[4904]: I1205 20:55:52.681717 4904 scope.go:117] "RemoveContainer" containerID="3ff9529892815f6974fa95e3944e54016af616270f5e88c8ba81e4e82b52d881" Dec 05 20:55:52 crc kubenswrapper[4904]: E1205 20:55:52.682694 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:56:07 crc kubenswrapper[4904]: I1205 20:56:07.681688 4904 scope.go:117] "RemoveContainer" containerID="3ff9529892815f6974fa95e3944e54016af616270f5e88c8ba81e4e82b52d881" Dec 05 20:56:07 crc kubenswrapper[4904]: E1205 20:56:07.682421 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:56:20 crc kubenswrapper[4904]: I1205 20:56:20.681125 4904 scope.go:117] "RemoveContainer" containerID="3ff9529892815f6974fa95e3944e54016af616270f5e88c8ba81e4e82b52d881" Dec 05 20:56:20 crc 
kubenswrapper[4904]: E1205 20:56:20.681917 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:56:32 crc kubenswrapper[4904]: I1205 20:56:32.681521 4904 scope.go:117] "RemoveContainer" containerID="3ff9529892815f6974fa95e3944e54016af616270f5e88c8ba81e4e82b52d881" Dec 05 20:56:32 crc kubenswrapper[4904]: E1205 20:56:32.682272 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:56:44 crc kubenswrapper[4904]: I1205 20:56:44.682633 4904 scope.go:117] "RemoveContainer" containerID="3ff9529892815f6974fa95e3944e54016af616270f5e88c8ba81e4e82b52d881" Dec 05 20:56:44 crc kubenswrapper[4904]: E1205 20:56:44.683743 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:56:57 crc kubenswrapper[4904]: I1205 20:56:57.683978 4904 scope.go:117] "RemoveContainer" containerID="3ff9529892815f6974fa95e3944e54016af616270f5e88c8ba81e4e82b52d881" Dec 05 20:56:57 crc kubenswrapper[4904]: E1205 20:56:57.685103 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 20:57:09 crc kubenswrapper[4904]: I1205 20:57:09.682136 4904 scope.go:117] "RemoveContainer" containerID="3ff9529892815f6974fa95e3944e54016af616270f5e88c8ba81e4e82b52d881" Dec 05 20:57:10 crc kubenswrapper[4904]: I1205 20:57:10.782406 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" event={"ID":"1cc24b64-e25f-4b55-9123-295388685e7a","Type":"ContainerStarted","Data":"d929e21438f177c4fed1d386cdca4a13bf9f262bc1b1ea9e1ccc79cc3095e408"} Dec 05 20:57:16 crc kubenswrapper[4904]: I1205 20:57:16.762513 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wkmj2"] Dec 05 20:57:16 crc kubenswrapper[4904]: I1205 20:57:16.765651 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wkmj2" Dec 05 20:57:16 crc kubenswrapper[4904]: I1205 20:57:16.803000 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wkmj2"] Dec 05 20:57:16 crc kubenswrapper[4904]: I1205 20:57:16.852593 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/909889c0-d778-4876-a91b-59fd617e23bd-utilities\") pod \"redhat-operators-wkmj2\" (UID: \"909889c0-d778-4876-a91b-59fd617e23bd\") " pod="openshift-marketplace/redhat-operators-wkmj2" Dec 05 20:57:16 crc kubenswrapper[4904]: I1205 20:57:16.852651 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/909889c0-d778-4876-a91b-59fd617e23bd-catalog-content\") pod \"redhat-operators-wkmj2\" (UID: \"909889c0-d778-4876-a91b-59fd617e23bd\") " pod="openshift-marketplace/redhat-operators-wkmj2" Dec 05 20:57:16 crc kubenswrapper[4904]: I1205 20:57:16.852679 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z76lt\" (UniqueName: \"kubernetes.io/projected/909889c0-d778-4876-a91b-59fd617e23bd-kube-api-access-z76lt\") pod \"redhat-operators-wkmj2\" (UID: \"909889c0-d778-4876-a91b-59fd617e23bd\") " pod="openshift-marketplace/redhat-operators-wkmj2" Dec 05 20:57:16 crc kubenswrapper[4904]: I1205 20:57:16.954521 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/909889c0-d778-4876-a91b-59fd617e23bd-utilities\") pod \"redhat-operators-wkmj2\" (UID: \"909889c0-d778-4876-a91b-59fd617e23bd\") " pod="openshift-marketplace/redhat-operators-wkmj2" Dec 05 20:57:16 crc kubenswrapper[4904]: I1205 20:57:16.954564 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/909889c0-d778-4876-a91b-59fd617e23bd-catalog-content\") pod \"redhat-operators-wkmj2\" (UID: \"909889c0-d778-4876-a91b-59fd617e23bd\") " pod="openshift-marketplace/redhat-operators-wkmj2" Dec 05 20:57:16 crc kubenswrapper[4904]: I1205 20:57:16.954582 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z76lt\" (UniqueName: \"kubernetes.io/projected/909889c0-d778-4876-a91b-59fd617e23bd-kube-api-access-z76lt\") pod \"redhat-operators-wkmj2\" (UID: \"909889c0-d778-4876-a91b-59fd617e23bd\") " pod="openshift-marketplace/redhat-operators-wkmj2" Dec 05 20:57:16 crc kubenswrapper[4904]: I1205 20:57:16.955379 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/909889c0-d778-4876-a91b-59fd617e23bd-catalog-content\") pod \"redhat-operators-wkmj2\" (UID: \"909889c0-d778-4876-a91b-59fd617e23bd\") " pod="openshift-marketplace/redhat-operators-wkmj2" Dec 05 20:57:16 crc kubenswrapper[4904]: I1205 20:57:16.955378 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/909889c0-d778-4876-a91b-59fd617e23bd-utilities\") pod \"redhat-operators-wkmj2\" (UID: \"909889c0-d778-4876-a91b-59fd617e23bd\") " pod="openshift-marketplace/redhat-operators-wkmj2" Dec 05 20:57:16 crc kubenswrapper[4904]: I1205 20:57:16.975825 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-z76lt\" (UniqueName: \"kubernetes.io/projected/909889c0-d778-4876-a91b-59fd617e23bd-kube-api-access-z76lt\") pod \"redhat-operators-wkmj2\" (UID: \"909889c0-d778-4876-a91b-59fd617e23bd\") " pod="openshift-marketplace/redhat-operators-wkmj2" Dec 05 20:57:17 crc kubenswrapper[4904]: I1205 20:57:17.088398 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wkmj2" Dec 05 20:57:17 crc kubenswrapper[4904]: I1205 20:57:17.656162 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wkmj2"] Dec 05 20:57:17 crc kubenswrapper[4904]: I1205 20:57:17.851660 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wkmj2" event={"ID":"909889c0-d778-4876-a91b-59fd617e23bd","Type":"ContainerStarted","Data":"b4f4308dac3705eba4d9a155c85e15b319e2a0aae147d2ee8ec97ac41697e09c"} Dec 05 20:57:18 crc kubenswrapper[4904]: I1205 20:57:18.864238 4904 generic.go:334] "Generic (PLEG): container finished" podID="909889c0-d778-4876-a91b-59fd617e23bd" containerID="68c14d8d80be59321e0d7ef23a5b97b0931f54b9498a696337b3512a9bc49dba" exitCode=0 Dec 05 20:57:18 crc kubenswrapper[4904]: I1205 20:57:18.864290 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wkmj2" event={"ID":"909889c0-d778-4876-a91b-59fd617e23bd","Type":"ContainerDied","Data":"68c14d8d80be59321e0d7ef23a5b97b0931f54b9498a696337b3512a9bc49dba"} Dec 05 20:57:19 crc kubenswrapper[4904]: I1205 20:57:19.876142 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wkmj2" event={"ID":"909889c0-d778-4876-a91b-59fd617e23bd","Type":"ContainerStarted","Data":"246c58e712404d974bd5120c16b4cdf16b6780e4d278cadacab96942c12f6818"} Dec 05 20:57:23 crc kubenswrapper[4904]: I1205 20:57:23.920521 4904 generic.go:334] "Generic (PLEG): container finished" podID="909889c0-d778-4876-a91b-59fd617e23bd" containerID="246c58e712404d974bd5120c16b4cdf16b6780e4d278cadacab96942c12f6818" exitCode=0 Dec 05 20:57:23 crc kubenswrapper[4904]: I1205 20:57:23.920630 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wkmj2" event={"ID":"909889c0-d778-4876-a91b-59fd617e23bd","Type":"ContainerDied","Data":"246c58e712404d974bd5120c16b4cdf16b6780e4d278cadacab96942c12f6818"} Dec 05 20:57:24 crc kubenswrapper[4904]: I1205 20:57:24.937486 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wkmj2" event={"ID":"909889c0-d778-4876-a91b-59fd617e23bd","Type":"ContainerStarted","Data":"93850ab423a49f1329c70a01ba100ad009cc00febd207e93e88ac2feeb202f2b"} Dec 05 20:57:24 crc kubenswrapper[4904]: I1205 20:57:24.978297 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wkmj2" podStartSLOduration=3.542216688 podStartE2EDuration="8.978281566s" podCreationTimestamp="2025-12-05 20:57:16 +0000 UTC" firstStartedPulling="2025-12-05 20:57:18.870399701 +0000 UTC m=+2737.681615810" lastFinishedPulling="2025-12-05 20:57:24.306464579 +0000 UTC m=+2743.117680688" observedRunningTime="2025-12-05 20:57:24.973516507 +0000 UTC m=+2743.784732626" watchObservedRunningTime="2025-12-05 20:57:24.978281566 +0000 UTC m=+2743.789497675" Dec 05 20:57:27 crc kubenswrapper[4904]: I1205 20:57:27.088830 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wkmj2" 
Dec 05 20:57:27 crc kubenswrapper[4904]: I1205 20:57:27.089237 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wkmj2" Dec 05 20:57:28 crc kubenswrapper[4904]: I1205 20:57:28.151031 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wkmj2" podUID="909889c0-d778-4876-a91b-59fd617e23bd" containerName="registry-server" probeResult="failure" output=< Dec 05 20:57:28 crc kubenswrapper[4904]: timeout: failed to connect service ":50051" within 1s Dec 05 20:57:28 crc kubenswrapper[4904]: > Dec 05 20:57:37 crc kubenswrapper[4904]: I1205 20:57:37.137549 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wkmj2" Dec 05 20:57:37 crc kubenswrapper[4904]: I1205 20:57:37.195620 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wkmj2" Dec 05 20:57:37 crc kubenswrapper[4904]: I1205 20:57:37.380576 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wkmj2"] Dec 05 20:57:39 crc kubenswrapper[4904]: I1205 20:57:39.084757 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wkmj2" podUID="909889c0-d778-4876-a91b-59fd617e23bd" containerName="registry-server" containerID="cri-o://93850ab423a49f1329c70a01ba100ad009cc00febd207e93e88ac2feeb202f2b" gracePeriod=2 Dec 05 20:57:39 crc kubenswrapper[4904]: I1205 20:57:39.539868 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wkmj2" Dec 05 20:57:39 crc kubenswrapper[4904]: I1205 20:57:39.712208 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/909889c0-d778-4876-a91b-59fd617e23bd-utilities\") pod \"909889c0-d778-4876-a91b-59fd617e23bd\" (UID: \"909889c0-d778-4876-a91b-59fd617e23bd\") " Dec 05 20:57:39 crc kubenswrapper[4904]: I1205 20:57:39.712342 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/909889c0-d778-4876-a91b-59fd617e23bd-catalog-content\") pod \"909889c0-d778-4876-a91b-59fd617e23bd\" (UID: \"909889c0-d778-4876-a91b-59fd617e23bd\") " Dec 05 20:57:39 crc kubenswrapper[4904]: I1205 20:57:39.712531 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z76lt\" (UniqueName: \"kubernetes.io/projected/909889c0-d778-4876-a91b-59fd617e23bd-kube-api-access-z76lt\") pod \"909889c0-d778-4876-a91b-59fd617e23bd\" (UID: \"909889c0-d778-4876-a91b-59fd617e23bd\") " Dec 05 20:57:39 crc kubenswrapper[4904]: I1205 20:57:39.713034 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/909889c0-d778-4876-a91b-59fd617e23bd-utilities" (OuterVolumeSpecName: "utilities") pod "909889c0-d778-4876-a91b-59fd617e23bd" (UID: "909889c0-d778-4876-a91b-59fd617e23bd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:57:39 crc kubenswrapper[4904]: I1205 20:57:39.718944 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/909889c0-d778-4876-a91b-59fd617e23bd-kube-api-access-z76lt" (OuterVolumeSpecName: "kube-api-access-z76lt") pod "909889c0-d778-4876-a91b-59fd617e23bd" (UID: "909889c0-d778-4876-a91b-59fd617e23bd"). InnerVolumeSpecName "kube-api-access-z76lt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:57:39 crc kubenswrapper[4904]: I1205 20:57:39.814946 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z76lt\" (UniqueName: \"kubernetes.io/projected/909889c0-d778-4876-a91b-59fd617e23bd-kube-api-access-z76lt\") on node \"crc\" DevicePath \"\"" Dec 05 20:57:39 crc kubenswrapper[4904]: I1205 20:57:39.815236 4904 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/909889c0-d778-4876-a91b-59fd617e23bd-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:57:39 crc kubenswrapper[4904]: I1205 20:57:39.831620 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/909889c0-d778-4876-a91b-59fd617e23bd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "909889c0-d778-4876-a91b-59fd617e23bd" (UID: "909889c0-d778-4876-a91b-59fd617e23bd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:57:39 crc kubenswrapper[4904]: I1205 20:57:39.916785 4904 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/909889c0-d778-4876-a91b-59fd617e23bd-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:57:40 crc kubenswrapper[4904]: I1205 20:57:40.096339 4904 generic.go:334] "Generic (PLEG): container finished" podID="909889c0-d778-4876-a91b-59fd617e23bd" containerID="93850ab423a49f1329c70a01ba100ad009cc00febd207e93e88ac2feeb202f2b" exitCode=0 Dec 05 20:57:40 crc kubenswrapper[4904]: I1205 20:57:40.096386 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wkmj2" event={"ID":"909889c0-d778-4876-a91b-59fd617e23bd","Type":"ContainerDied","Data":"93850ab423a49f1329c70a01ba100ad009cc00febd207e93e88ac2feeb202f2b"} Dec 05 20:57:40 crc kubenswrapper[4904]: I1205 20:57:40.096432 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wkmj2" event={"ID":"909889c0-d778-4876-a91b-59fd617e23bd","Type":"ContainerDied","Data":"b4f4308dac3705eba4d9a155c85e15b319e2a0aae147d2ee8ec97ac41697e09c"} Dec 05 20:57:40 crc kubenswrapper[4904]: I1205 20:57:40.096450 4904 scope.go:117] "RemoveContainer" containerID="93850ab423a49f1329c70a01ba100ad009cc00febd207e93e88ac2feeb202f2b" Dec 05 20:57:40 crc kubenswrapper[4904]: I1205 20:57:40.096459 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wkmj2" Dec 05 20:57:40 crc kubenswrapper[4904]: I1205 20:57:40.118620 4904 scope.go:117] "RemoveContainer" containerID="246c58e712404d974bd5120c16b4cdf16b6780e4d278cadacab96942c12f6818" Dec 05 20:57:40 crc kubenswrapper[4904]: I1205 20:57:40.135032 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wkmj2"] Dec 05 20:57:40 crc kubenswrapper[4904]: I1205 20:57:40.144305 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wkmj2"] Dec 05 20:57:40 crc kubenswrapper[4904]: I1205 20:57:40.156999 4904 scope.go:117] "RemoveContainer" containerID="68c14d8d80be59321e0d7ef23a5b97b0931f54b9498a696337b3512a9bc49dba" Dec 05 20:57:40 crc kubenswrapper[4904]: I1205 20:57:40.189960 4904 scope.go:117] "RemoveContainer" containerID="93850ab423a49f1329c70a01ba100ad009cc00febd207e93e88ac2feeb202f2b" Dec 05 20:57:40 crc kubenswrapper[4904]: E1205 20:57:40.190634 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93850ab423a49f1329c70a01ba100ad009cc00febd207e93e88ac2feeb202f2b\": container with ID starting with 93850ab423a49f1329c70a01ba100ad009cc00febd207e93e88ac2feeb202f2b not found: ID does not exist" containerID="93850ab423a49f1329c70a01ba100ad009cc00febd207e93e88ac2feeb202f2b" Dec 05 20:57:40 crc kubenswrapper[4904]: I1205 20:57:40.190669 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93850ab423a49f1329c70a01ba100ad009cc00febd207e93e88ac2feeb202f2b"} err="failed to get container status \"93850ab423a49f1329c70a01ba100ad009cc00febd207e93e88ac2feeb202f2b\": rpc error: code = NotFound desc = could not find container \"93850ab423a49f1329c70a01ba100ad009cc00febd207e93e88ac2feeb202f2b\": container with ID starting with 93850ab423a49f1329c70a01ba100ad009cc00febd207e93e88ac2feeb202f2b not found: ID does not exist" Dec 05 20:57:40 crc kubenswrapper[4904]: I1205 20:57:40.190691 4904 scope.go:117] "RemoveContainer" containerID="246c58e712404d974bd5120c16b4cdf16b6780e4d278cadacab96942c12f6818" Dec 05 20:57:40 crc kubenswrapper[4904]: E1205 20:57:40.191035 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"246c58e712404d974bd5120c16b4cdf16b6780e4d278cadacab96942c12f6818\": container with ID starting with 246c58e712404d974bd5120c16b4cdf16b6780e4d278cadacab96942c12f6818 not found: ID does not exist" containerID="246c58e712404d974bd5120c16b4cdf16b6780e4d278cadacab96942c12f6818" Dec 05 20:57:40 crc kubenswrapper[4904]: I1205 20:57:40.191053 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"246c58e712404d974bd5120c16b4cdf16b6780e4d278cadacab96942c12f6818"} err="failed to get container status \"246c58e712404d974bd5120c16b4cdf16b6780e4d278cadacab96942c12f6818\": rpc error: code = NotFound desc = could not find container \"246c58e712404d974bd5120c16b4cdf16b6780e4d278cadacab96942c12f6818\": container with ID starting with 246c58e712404d974bd5120c16b4cdf16b6780e4d278cadacab96942c12f6818 not found: ID does not exist" Dec 05 20:57:40 crc kubenswrapper[4904]: I1205 20:57:40.191092 4904 scope.go:117] "RemoveContainer" containerID="68c14d8d80be59321e0d7ef23a5b97b0931f54b9498a696337b3512a9bc49dba" Dec 05 20:57:40 crc kubenswrapper[4904]: E1205 20:57:40.191390 4904 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"68c14d8d80be59321e0d7ef23a5b97b0931f54b9498a696337b3512a9bc49dba\": container with ID starting with 68c14d8d80be59321e0d7ef23a5b97b0931f54b9498a696337b3512a9bc49dba not found: ID does not exist" containerID="68c14d8d80be59321e0d7ef23a5b97b0931f54b9498a696337b3512a9bc49dba" Dec 05 20:57:40 crc kubenswrapper[4904]: I1205 20:57:40.191414 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68c14d8d80be59321e0d7ef23a5b97b0931f54b9498a696337b3512a9bc49dba"} err="failed to get container status \"68c14d8d80be59321e0d7ef23a5b97b0931f54b9498a696337b3512a9bc49dba\": rpc error: code = NotFound desc = could not find container \"68c14d8d80be59321e0d7ef23a5b97b0931f54b9498a696337b3512a9bc49dba\": container with ID starting with 68c14d8d80be59321e0d7ef23a5b97b0931f54b9498a696337b3512a9bc49dba not found: ID does not exist" Dec 05 20:57:41 crc kubenswrapper[4904]: I1205 20:57:41.697905 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="909889c0-d778-4876-a91b-59fd617e23bd" path="/var/lib/kubelet/pods/909889c0-d778-4876-a91b-59fd617e23bd/volumes" Dec 05 20:57:51 crc kubenswrapper[4904]: I1205 20:57:51.207706 4904 generic.go:334] "Generic (PLEG): container finished" podID="d9ee9825-e991-495c-bbd8-30ae1e7b0780" containerID="a21fe932805e4f37e6123125133228e2f097603c69596260dfc918b77c45e3ed" exitCode=0 Dec 05 20:57:51 crc kubenswrapper[4904]: I1205 20:57:51.207846 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6jljb" event={"ID":"d9ee9825-e991-495c-bbd8-30ae1e7b0780","Type":"ContainerDied","Data":"a21fe932805e4f37e6123125133228e2f097603c69596260dfc918b77c45e3ed"} Dec 05 20:57:52 crc kubenswrapper[4904]: I1205 20:57:52.642229 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6jljb" Dec 05 20:57:52 crc kubenswrapper[4904]: I1205 20:57:52.793578 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d9ee9825-e991-495c-bbd8-30ae1e7b0780-nova-extra-config-0\") pod \"d9ee9825-e991-495c-bbd8-30ae1e7b0780\" (UID: \"d9ee9825-e991-495c-bbd8-30ae1e7b0780\") " Dec 05 20:57:52 crc kubenswrapper[4904]: I1205 20:57:52.793749 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d9ee9825-e991-495c-bbd8-30ae1e7b0780-inventory\") pod \"d9ee9825-e991-495c-bbd8-30ae1e7b0780\" (UID: \"d9ee9825-e991-495c-bbd8-30ae1e7b0780\") " Dec 05 20:57:52 crc kubenswrapper[4904]: I1205 20:57:52.793861 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d9ee9825-e991-495c-bbd8-30ae1e7b0780-nova-migration-ssh-key-1\") pod \"d9ee9825-e991-495c-bbd8-30ae1e7b0780\" (UID: \"d9ee9825-e991-495c-bbd8-30ae1e7b0780\") " Dec 05 20:57:52 crc kubenswrapper[4904]: I1205 20:57:52.793909 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d9ee9825-e991-495c-bbd8-30ae1e7b0780-nova-migration-ssh-key-0\") pod \"d9ee9825-e991-495c-bbd8-30ae1e7b0780\" (UID: \"d9ee9825-e991-495c-bbd8-30ae1e7b0780\") " Dec 05 20:57:52 crc kubenswrapper[4904]: I1205 20:57:52.793953 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d9ee9825-e991-495c-bbd8-30ae1e7b0780-nova-cell1-compute-config-0\") pod \"d9ee9825-e991-495c-bbd8-30ae1e7b0780\" (UID: \"d9ee9825-e991-495c-bbd8-30ae1e7b0780\") " Dec 05 20:57:52 crc kubenswrapper[4904]: I1205 20:57:52.794008 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d9ee9825-e991-495c-bbd8-30ae1e7b0780-ssh-key\") pod \"d9ee9825-e991-495c-bbd8-30ae1e7b0780\" (UID: \"d9ee9825-e991-495c-bbd8-30ae1e7b0780\") " Dec 05 20:57:52 crc kubenswrapper[4904]: I1205 20:57:52.794079 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d9ee9825-e991-495c-bbd8-30ae1e7b0780-nova-cell1-compute-config-1\") pod \"d9ee9825-e991-495c-bbd8-30ae1e7b0780\" (UID: \"d9ee9825-e991-495c-bbd8-30ae1e7b0780\") " Dec 05 20:57:52 crc kubenswrapper[4904]: I1205 20:57:52.794097 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdfsm\" (UniqueName: \"kubernetes.io/projected/d9ee9825-e991-495c-bbd8-30ae1e7b0780-kube-api-access-kdfsm\") pod \"d9ee9825-e991-495c-bbd8-30ae1e7b0780\" (UID: \"d9ee9825-e991-495c-bbd8-30ae1e7b0780\") " Dec 05 20:57:52 crc kubenswrapper[4904]: I1205 20:57:52.794132 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9ee9825-e991-495c-bbd8-30ae1e7b0780-nova-combined-ca-bundle\") pod \"d9ee9825-e991-495c-bbd8-30ae1e7b0780\" (UID: \"d9ee9825-e991-495c-bbd8-30ae1e7b0780\") " Dec 05 20:57:52 crc kubenswrapper[4904]: I1205 20:57:52.801028 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/d9ee9825-e991-495c-bbd8-30ae1e7b0780-kube-api-access-kdfsm" (OuterVolumeSpecName: "kube-api-access-kdfsm") pod "d9ee9825-e991-495c-bbd8-30ae1e7b0780" (UID: "d9ee9825-e991-495c-bbd8-30ae1e7b0780"). InnerVolumeSpecName "kube-api-access-kdfsm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:57:52 crc kubenswrapper[4904]: I1205 20:57:52.801043 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9ee9825-e991-495c-bbd8-30ae1e7b0780-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "d9ee9825-e991-495c-bbd8-30ae1e7b0780" (UID: "d9ee9825-e991-495c-bbd8-30ae1e7b0780"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:57:52 crc kubenswrapper[4904]: I1205 20:57:52.826546 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9ee9825-e991-495c-bbd8-30ae1e7b0780-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "d9ee9825-e991-495c-bbd8-30ae1e7b0780" (UID: "d9ee9825-e991-495c-bbd8-30ae1e7b0780"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:57:52 crc kubenswrapper[4904]: I1205 20:57:52.828087 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9ee9825-e991-495c-bbd8-30ae1e7b0780-inventory" (OuterVolumeSpecName: "inventory") pod "d9ee9825-e991-495c-bbd8-30ae1e7b0780" (UID: "d9ee9825-e991-495c-bbd8-30ae1e7b0780"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:57:52 crc kubenswrapper[4904]: I1205 20:57:52.828994 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9ee9825-e991-495c-bbd8-30ae1e7b0780-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "d9ee9825-e991-495c-bbd8-30ae1e7b0780" (UID: "d9ee9825-e991-495c-bbd8-30ae1e7b0780"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:57:52 crc kubenswrapper[4904]: I1205 20:57:52.830339 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9ee9825-e991-495c-bbd8-30ae1e7b0780-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "d9ee9825-e991-495c-bbd8-30ae1e7b0780" (UID: "d9ee9825-e991-495c-bbd8-30ae1e7b0780"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:57:52 crc kubenswrapper[4904]: I1205 20:57:52.841833 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9ee9825-e991-495c-bbd8-30ae1e7b0780-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "d9ee9825-e991-495c-bbd8-30ae1e7b0780" (UID: "d9ee9825-e991-495c-bbd8-30ae1e7b0780"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:57:52 crc kubenswrapper[4904]: I1205 20:57:52.841902 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9ee9825-e991-495c-bbd8-30ae1e7b0780-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "d9ee9825-e991-495c-bbd8-30ae1e7b0780" (UID: "d9ee9825-e991-495c-bbd8-30ae1e7b0780"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:57:52 crc kubenswrapper[4904]: I1205 20:57:52.852900 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9ee9825-e991-495c-bbd8-30ae1e7b0780-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d9ee9825-e991-495c-bbd8-30ae1e7b0780" (UID: "d9ee9825-e991-495c-bbd8-30ae1e7b0780"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:57:52 crc kubenswrapper[4904]: I1205 20:57:52.897564 4904 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9ee9825-e991-495c-bbd8-30ae1e7b0780-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:57:52 crc kubenswrapper[4904]: I1205 20:57:52.897605 4904 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d9ee9825-e991-495c-bbd8-30ae1e7b0780-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 20:57:52 crc kubenswrapper[4904]: I1205 20:57:52.897616 4904 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d9ee9825-e991-495c-bbd8-30ae1e7b0780-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 20:57:52 crc kubenswrapper[4904]: I1205 20:57:52.897625 4904 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d9ee9825-e991-495c-bbd8-30ae1e7b0780-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 05 20:57:52 crc kubenswrapper[4904]: I1205 20:57:52.897635 4904 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d9ee9825-e991-495c-bbd8-30ae1e7b0780-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 05 20:57:52 crc kubenswrapper[4904]: I1205 20:57:52.897644 4904 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d9ee9825-e991-495c-bbd8-30ae1e7b0780-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 20:57:52 crc kubenswrapper[4904]: I1205 20:57:52.897653 4904 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d9ee9825-e991-495c-bbd8-30ae1e7b0780-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 20:57:52 crc kubenswrapper[4904]: I1205 20:57:52.897747 4904 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d9ee9825-e991-495c-bbd8-30ae1e7b0780-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 05 20:57:52 crc kubenswrapper[4904]: I1205 20:57:52.897759 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdfsm\" (UniqueName: \"kubernetes.io/projected/d9ee9825-e991-495c-bbd8-30ae1e7b0780-kube-api-access-kdfsm\") on node \"crc\" DevicePath \"\"" Dec 05 20:57:53 crc kubenswrapper[4904]: I1205 20:57:53.230194 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6jljb" event={"ID":"d9ee9825-e991-495c-bbd8-30ae1e7b0780","Type":"ContainerDied","Data":"b99870a7f8e71cf7dadd49cb2341bf55e1e38e8c7d179631d8b65b9a1467b563"} Dec 05 20:57:53 crc kubenswrapper[4904]: I1205 20:57:53.230253 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b99870a7f8e71cf7dadd49cb2341bf55e1e38e8c7d179631d8b65b9a1467b563" Dec 05 20:57:53 crc 
Dec 05 20:57:53 crc kubenswrapper[4904]: I1205 20:57:53.331357 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k"]
Dec 05 20:57:53 crc kubenswrapper[4904]: E1205 20:57:53.331937 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="909889c0-d778-4876-a91b-59fd617e23bd" containerName="extract-utilities"
Dec 05 20:57:53 crc kubenswrapper[4904]: I1205 20:57:53.331968 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="909889c0-d778-4876-a91b-59fd617e23bd" containerName="extract-utilities"
Dec 05 20:57:53 crc kubenswrapper[4904]: E1205 20:57:53.332001 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9ee9825-e991-495c-bbd8-30ae1e7b0780" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Dec 05 20:57:53 crc kubenswrapper[4904]: I1205 20:57:53.332016 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9ee9825-e991-495c-bbd8-30ae1e7b0780" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Dec 05 20:57:53 crc kubenswrapper[4904]: E1205 20:57:53.332039 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="909889c0-d778-4876-a91b-59fd617e23bd" containerName="registry-server"
Dec 05 20:57:53 crc kubenswrapper[4904]: I1205 20:57:53.332051 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="909889c0-d778-4876-a91b-59fd617e23bd" containerName="registry-server"
Dec 05 20:57:53 crc kubenswrapper[4904]: E1205 20:57:53.332141 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="909889c0-d778-4876-a91b-59fd617e23bd" containerName="extract-content"
Dec 05 20:57:53 crc kubenswrapper[4904]: I1205 20:57:53.332153 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="909889c0-d778-4876-a91b-59fd617e23bd" containerName="extract-content"
Dec 05 20:57:53 crc kubenswrapper[4904]: I1205 20:57:53.332462 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="909889c0-d778-4876-a91b-59fd617e23bd" containerName="registry-server"
Dec 05 20:57:53 crc kubenswrapper[4904]: I1205 20:57:53.332515 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9ee9825-e991-495c-bbd8-30ae1e7b0780" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Dec 05 20:57:53 crc kubenswrapper[4904]: I1205 20:57:53.333718 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k"
Dec 05 20:57:53 crc kubenswrapper[4904]: I1205 20:57:53.337320 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 05 20:57:53 crc kubenswrapper[4904]: I1205 20:57:53.337555 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data"
Dec 05 20:57:53 crc kubenswrapper[4904]: I1205 20:57:53.338273 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 05 20:57:53 crc kubenswrapper[4904]: I1205 20:57:53.338675 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 05 20:57:53 crc kubenswrapper[4904]: I1205 20:57:53.339212 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pf9pw"
Dec 05 20:57:53 crc kubenswrapper[4904]: I1205 20:57:53.349133 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k"]
Dec 05 20:57:53 crc kubenswrapper[4904]: I1205 20:57:53.507527 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/adfcfa36-4dfd-422c-b1a5-3a2e342ea208-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k\" (UID: \"adfcfa36-4dfd-422c-b1a5-3a2e342ea208\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k"
Dec 05 20:57:53 crc kubenswrapper[4904]: I1205 20:57:53.507609 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ct9l\" (UniqueName: \"kubernetes.io/projected/adfcfa36-4dfd-422c-b1a5-3a2e342ea208-kube-api-access-7ct9l\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k\" (UID: \"adfcfa36-4dfd-422c-b1a5-3a2e342ea208\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k"
Dec 05 20:57:53 crc kubenswrapper[4904]: I1205 20:57:53.507666 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/adfcfa36-4dfd-422c-b1a5-3a2e342ea208-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k\" (UID: \"adfcfa36-4dfd-422c-b1a5-3a2e342ea208\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k"
Dec 05 20:57:53 crc kubenswrapper[4904]: I1205 20:57:53.507716 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/adfcfa36-4dfd-422c-b1a5-3a2e342ea208-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k\" (UID: \"adfcfa36-4dfd-422c-b1a5-3a2e342ea208\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k"
Dec 05 20:57:53 crc kubenswrapper[4904]: I1205 20:57:53.507747 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adfcfa36-4dfd-422c-b1a5-3a2e342ea208-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k\" (UID: \"adfcfa36-4dfd-422c-b1a5-3a2e342ea208\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k"
Dec 05 20:57:53 crc kubenswrapper[4904]: I1205 20:57:53.507840 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/adfcfa36-4dfd-422c-b1a5-3a2e342ea208-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k\" (UID: \"adfcfa36-4dfd-422c-b1a5-3a2e342ea208\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k"
Dec 05 20:57:53 crc kubenswrapper[4904]: I1205 20:57:53.507886 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/adfcfa36-4dfd-422c-b1a5-3a2e342ea208-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k\" (UID: \"adfcfa36-4dfd-422c-b1a5-3a2e342ea208\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k"
Dec 05 20:57:53 crc kubenswrapper[4904]: I1205 20:57:53.610429 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/adfcfa36-4dfd-422c-b1a5-3a2e342ea208-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k\" (UID: \"adfcfa36-4dfd-422c-b1a5-3a2e342ea208\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k"
Dec 05 20:57:53 crc kubenswrapper[4904]: I1205 20:57:53.610524 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ct9l\" (UniqueName: \"kubernetes.io/projected/adfcfa36-4dfd-422c-b1a5-3a2e342ea208-kube-api-access-7ct9l\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k\" (UID: \"adfcfa36-4dfd-422c-b1a5-3a2e342ea208\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k"
Dec 05 20:57:53 crc kubenswrapper[4904]: I1205 20:57:53.610585 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/adfcfa36-4dfd-422c-b1a5-3a2e342ea208-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k\" (UID: \"adfcfa36-4dfd-422c-b1a5-3a2e342ea208\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k"
Dec 05 20:57:53 crc kubenswrapper[4904]: I1205 20:57:53.610641 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/adfcfa36-4dfd-422c-b1a5-3a2e342ea208-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k\" (UID: \"adfcfa36-4dfd-422c-b1a5-3a2e342ea208\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k"
Dec 05 20:57:53 crc kubenswrapper[4904]: I1205 20:57:53.610665 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adfcfa36-4dfd-422c-b1a5-3a2e342ea208-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k\" (UID: \"adfcfa36-4dfd-422c-b1a5-3a2e342ea208\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k"
Dec 05 20:57:53 crc kubenswrapper[4904]: I1205 20:57:53.610733 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/adfcfa36-4dfd-422c-b1a5-3a2e342ea208-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k\" (UID: \"adfcfa36-4dfd-422c-b1a5-3a2e342ea208\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k"
Dec 05 20:57:53 crc kubenswrapper[4904]: I1205 20:57:53.610760 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/adfcfa36-4dfd-422c-b1a5-3a2e342ea208-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k\" (UID: \"adfcfa36-4dfd-422c-b1a5-3a2e342ea208\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k"
Dec 05 20:57:53 crc kubenswrapper[4904]: I1205 20:57:53.615890 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/adfcfa36-4dfd-422c-b1a5-3a2e342ea208-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k\" (UID: \"adfcfa36-4dfd-422c-b1a5-3a2e342ea208\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k"
Dec 05 20:57:53 crc kubenswrapper[4904]: I1205 20:57:53.616766 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/adfcfa36-4dfd-422c-b1a5-3a2e342ea208-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k\" (UID: \"adfcfa36-4dfd-422c-b1a5-3a2e342ea208\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k"
Dec 05 20:57:53 crc kubenswrapper[4904]: I1205 20:57:53.617342 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/adfcfa36-4dfd-422c-b1a5-3a2e342ea208-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k\" (UID: \"adfcfa36-4dfd-422c-b1a5-3a2e342ea208\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k"
Dec 05 20:57:53 crc kubenswrapper[4904]: I1205 20:57:53.618632 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adfcfa36-4dfd-422c-b1a5-3a2e342ea208-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k\" (UID: \"adfcfa36-4dfd-422c-b1a5-3a2e342ea208\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k"
Dec 05 20:57:53 crc kubenswrapper[4904]: I1205 20:57:53.620648 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/adfcfa36-4dfd-422c-b1a5-3a2e342ea208-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k\" (UID: \"adfcfa36-4dfd-422c-b1a5-3a2e342ea208\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k"
Dec 05 20:57:53 crc kubenswrapper[4904]: I1205 20:57:53.620840 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/adfcfa36-4dfd-422c-b1a5-3a2e342ea208-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k\" (UID: \"adfcfa36-4dfd-422c-b1a5-3a2e342ea208\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k"
Dec 05 20:57:53 crc kubenswrapper[4904]: I1205 20:57:53.630031 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ct9l\" (UniqueName: \"kubernetes.io/projected/adfcfa36-4dfd-422c-b1a5-3a2e342ea208-kube-api-access-7ct9l\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k\" (UID: \"adfcfa36-4dfd-422c-b1a5-3a2e342ea208\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k"
\"adfcfa36-4dfd-422c-b1a5-3a2e342ea208\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k" Dec 05 20:57:53 crc kubenswrapper[4904]: I1205 20:57:53.654123 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k" Dec 05 20:57:54 crc kubenswrapper[4904]: I1205 20:57:54.248284 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k"] Dec 05 20:57:55 crc kubenswrapper[4904]: I1205 20:57:55.249494 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k" event={"ID":"adfcfa36-4dfd-422c-b1a5-3a2e342ea208","Type":"ContainerStarted","Data":"4b72bc1b1ab76573e7db7539d0f3c457f459b2e20eac1db357caead0fcc6ee4a"} Dec 05 20:57:55 crc kubenswrapper[4904]: I1205 20:57:55.249778 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k" event={"ID":"adfcfa36-4dfd-422c-b1a5-3a2e342ea208","Type":"ContainerStarted","Data":"a008c557ee3b9766cd6e584f87b94949d5e1dce46d77913af97e3149cc38fbd2"} Dec 05 20:57:55 crc kubenswrapper[4904]: I1205 20:57:55.276434 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k" podStartSLOduration=1.807558936 podStartE2EDuration="2.276417018s" podCreationTimestamp="2025-12-05 20:57:53 +0000 UTC" firstStartedPulling="2025-12-05 20:57:54.248507727 +0000 UTC m=+2773.059723836" lastFinishedPulling="2025-12-05 20:57:54.717365789 +0000 UTC m=+2773.528581918" observedRunningTime="2025-12-05 20:57:55.267710681 +0000 UTC m=+2774.078926790" watchObservedRunningTime="2025-12-05 20:57:55.276417018 +0000 UTC m=+2774.087633127" Dec 05 20:59:29 crc kubenswrapper[4904]: I1205 20:59:29.955861 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:59:29 crc kubenswrapper[4904]: I1205 20:59:29.956390 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:59:59 crc kubenswrapper[4904]: I1205 20:59:59.957254 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:59:59 crc kubenswrapper[4904]: I1205 20:59:59.957858 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:00:00 crc kubenswrapper[4904]: I1205 21:00:00.158940 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416140-kqqf4"] Dec 05 21:00:00 crc 
kubenswrapper[4904]: I1205 21:00:00.161097 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-kqqf4" Dec 05 21:00:00 crc kubenswrapper[4904]: I1205 21:00:00.163980 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 21:00:00 crc kubenswrapper[4904]: I1205 21:00:00.167108 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 21:00:00 crc kubenswrapper[4904]: I1205 21:00:00.177100 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416140-kqqf4"] Dec 05 21:00:00 crc kubenswrapper[4904]: I1205 21:00:00.263514 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0-config-volume\") pod \"collect-profiles-29416140-kqqf4\" (UID: \"d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-kqqf4" Dec 05 21:00:00 crc kubenswrapper[4904]: I1205 21:00:00.263607 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbjt5\" (UniqueName: \"kubernetes.io/projected/d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0-kube-api-access-tbjt5\") pod \"collect-profiles-29416140-kqqf4\" (UID: \"d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-kqqf4" Dec 05 21:00:00 crc kubenswrapper[4904]: I1205 21:00:00.263719 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0-secret-volume\") pod \"collect-profiles-29416140-kqqf4\" (UID: \"d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-kqqf4" Dec 05 21:00:00 crc kubenswrapper[4904]: I1205 21:00:00.365687 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbjt5\" (UniqueName: \"kubernetes.io/projected/d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0-kube-api-access-tbjt5\") pod \"collect-profiles-29416140-kqqf4\" (UID: \"d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-kqqf4" Dec 05 21:00:00 crc kubenswrapper[4904]: I1205 21:00:00.365797 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0-secret-volume\") pod \"collect-profiles-29416140-kqqf4\" (UID: \"d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-kqqf4" Dec 05 21:00:00 crc kubenswrapper[4904]: I1205 21:00:00.365975 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0-config-volume\") pod \"collect-profiles-29416140-kqqf4\" (UID: \"d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-kqqf4" Dec 05 21:00:00 crc kubenswrapper[4904]: I1205 21:00:00.367095 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" 
(UniqueName: \"kubernetes.io/configmap/d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0-config-volume\") pod \"collect-profiles-29416140-kqqf4\" (UID: \"d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-kqqf4" Dec 05 21:00:00 crc kubenswrapper[4904]: I1205 21:00:00.375097 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0-secret-volume\") pod \"collect-profiles-29416140-kqqf4\" (UID: \"d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-kqqf4" Dec 05 21:00:00 crc kubenswrapper[4904]: I1205 21:00:00.387972 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbjt5\" (UniqueName: \"kubernetes.io/projected/d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0-kube-api-access-tbjt5\") pod \"collect-profiles-29416140-kqqf4\" (UID: \"d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-kqqf4" Dec 05 21:00:00 crc kubenswrapper[4904]: I1205 21:00:00.483156 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-kqqf4" Dec 05 21:00:00 crc kubenswrapper[4904]: I1205 21:00:00.927403 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416140-kqqf4"] Dec 05 21:00:00 crc kubenswrapper[4904]: W1205 21:00:00.935365 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd43cc048_7fbb_482a_9ffd_b0ce8c7df3d0.slice/crio-09d5361d4bf51294c9731693de47e7612f15a9a1f0fa668b23271519644ad9c4 WatchSource:0}: Error finding container 09d5361d4bf51294c9731693de47e7612f15a9a1f0fa668b23271519644ad9c4: Status 404 returned error can't find the container with id 09d5361d4bf51294c9731693de47e7612f15a9a1f0fa668b23271519644ad9c4 Dec 05 21:00:01 crc kubenswrapper[4904]: I1205 21:00:01.524860 4904 generic.go:334] "Generic (PLEG): container finished" podID="d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0" containerID="a2f3f42c506991ec0e78577b8e40e1ba8b240ab8631d1fa0816cb8800d2faedf" exitCode=0 Dec 05 21:00:01 crc kubenswrapper[4904]: I1205 21:00:01.524911 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-kqqf4" event={"ID":"d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0","Type":"ContainerDied","Data":"a2f3f42c506991ec0e78577b8e40e1ba8b240ab8631d1fa0816cb8800d2faedf"} Dec 05 21:00:01 crc kubenswrapper[4904]: I1205 21:00:01.524990 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-kqqf4" event={"ID":"d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0","Type":"ContainerStarted","Data":"09d5361d4bf51294c9731693de47e7612f15a9a1f0fa668b23271519644ad9c4"} Dec 05 21:00:02 crc kubenswrapper[4904]: I1205 21:00:02.881482 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-kqqf4" Dec 05 21:00:03 crc kubenswrapper[4904]: I1205 21:00:03.021171 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0-secret-volume\") pod \"d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0\" (UID: \"d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0\") " Dec 05 21:00:03 crc kubenswrapper[4904]: I1205 21:00:03.021364 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0-config-volume\") pod \"d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0\" (UID: \"d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0\") " Dec 05 21:00:03 crc kubenswrapper[4904]: I1205 21:00:03.021535 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbjt5\" (UniqueName: \"kubernetes.io/projected/d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0-kube-api-access-tbjt5\") pod \"d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0\" (UID: \"d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0\") " Dec 05 21:00:03 crc kubenswrapper[4904]: I1205 21:00:03.023635 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0-config-volume" (OuterVolumeSpecName: "config-volume") pod "d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0" (UID: "d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:00:03 crc kubenswrapper[4904]: I1205 21:00:03.029223 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0" (UID: "d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:00:03 crc kubenswrapper[4904]: I1205 21:00:03.030390 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0-kube-api-access-tbjt5" (OuterVolumeSpecName: "kube-api-access-tbjt5") pod "d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0" (UID: "d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0"). InnerVolumeSpecName "kube-api-access-tbjt5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:00:03 crc kubenswrapper[4904]: I1205 21:00:03.123693 4904 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:03 crc kubenswrapper[4904]: I1205 21:00:03.123728 4904 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:03 crc kubenswrapper[4904]: I1205 21:00:03.123739 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbjt5\" (UniqueName: \"kubernetes.io/projected/d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0-kube-api-access-tbjt5\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:03 crc kubenswrapper[4904]: I1205 21:00:03.548675 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-kqqf4" event={"ID":"d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0","Type":"ContainerDied","Data":"09d5361d4bf51294c9731693de47e7612f15a9a1f0fa668b23271519644ad9c4"} Dec 05 21:00:03 crc kubenswrapper[4904]: I1205 21:00:03.548742 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09d5361d4bf51294c9731693de47e7612f15a9a1f0fa668b23271519644ad9c4" Dec 05 21:00:03 crc kubenswrapper[4904]: I1205 21:00:03.548783 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-kqqf4" Dec 05 21:00:03 crc kubenswrapper[4904]: I1205 21:00:03.959584 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416095-g7dzx"] Dec 05 21:00:03 crc kubenswrapper[4904]: I1205 21:00:03.970563 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416095-g7dzx"] Dec 05 21:00:05 crc kubenswrapper[4904]: I1205 21:00:05.634358 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-p5m9t"] Dec 05 21:00:05 crc kubenswrapper[4904]: E1205 21:00:05.635103 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0" containerName="collect-profiles" Dec 05 21:00:05 crc kubenswrapper[4904]: I1205 21:00:05.635120 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0" containerName="collect-profiles" Dec 05 21:00:05 crc kubenswrapper[4904]: I1205 21:00:05.635308 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0" containerName="collect-profiles" Dec 05 21:00:05 crc kubenswrapper[4904]: I1205 21:00:05.636826 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p5m9t" Dec 05 21:00:05 crc kubenswrapper[4904]: I1205 21:00:05.644614 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p5m9t"] Dec 05 21:00:05 crc kubenswrapper[4904]: I1205 21:00:05.693946 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28256b97-09a9-4ddf-a730-d2fc4d926310" path="/var/lib/kubelet/pods/28256b97-09a9-4ddf-a730-d2fc4d926310/volumes" Dec 05 21:00:05 crc kubenswrapper[4904]: I1205 21:00:05.780792 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84d152ab-956d-4172-897f-f51dd957e2de-catalog-content\") pod \"community-operators-p5m9t\" (UID: \"84d152ab-956d-4172-897f-f51dd957e2de\") " pod="openshift-marketplace/community-operators-p5m9t" Dec 05 21:00:05 crc kubenswrapper[4904]: I1205 21:00:05.780892 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84d152ab-956d-4172-897f-f51dd957e2de-utilities\") pod \"community-operators-p5m9t\" (UID: \"84d152ab-956d-4172-897f-f51dd957e2de\") " pod="openshift-marketplace/community-operators-p5m9t" Dec 05 21:00:05 crc kubenswrapper[4904]: I1205 21:00:05.781016 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n627w\" (UniqueName: \"kubernetes.io/projected/84d152ab-956d-4172-897f-f51dd957e2de-kube-api-access-n627w\") pod \"community-operators-p5m9t\" (UID: \"84d152ab-956d-4172-897f-f51dd957e2de\") " pod="openshift-marketplace/community-operators-p5m9t" Dec 05 21:00:05 crc kubenswrapper[4904]: I1205 21:00:05.884592 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84d152ab-956d-4172-897f-f51dd957e2de-catalog-content\") pod \"community-operators-p5m9t\" (UID: \"84d152ab-956d-4172-897f-f51dd957e2de\") " pod="openshift-marketplace/community-operators-p5m9t" Dec 05 21:00:05 crc kubenswrapper[4904]: I1205 21:00:05.884665 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84d152ab-956d-4172-897f-f51dd957e2de-utilities\") pod \"community-operators-p5m9t\" (UID: \"84d152ab-956d-4172-897f-f51dd957e2de\") " pod="openshift-marketplace/community-operators-p5m9t" Dec 05 21:00:05 crc kubenswrapper[4904]: I1205 21:00:05.884738 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n627w\" (UniqueName: \"kubernetes.io/projected/84d152ab-956d-4172-897f-f51dd957e2de-kube-api-access-n627w\") pod \"community-operators-p5m9t\" (UID: \"84d152ab-956d-4172-897f-f51dd957e2de\") " pod="openshift-marketplace/community-operators-p5m9t" Dec 05 21:00:05 crc kubenswrapper[4904]: I1205 21:00:05.885797 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84d152ab-956d-4172-897f-f51dd957e2de-catalog-content\") pod \"community-operators-p5m9t\" (UID: \"84d152ab-956d-4172-897f-f51dd957e2de\") " pod="openshift-marketplace/community-operators-p5m9t" Dec 05 21:00:05 crc kubenswrapper[4904]: I1205 21:00:05.886044 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/84d152ab-956d-4172-897f-f51dd957e2de-utilities\") pod \"community-operators-p5m9t\" (UID: \"84d152ab-956d-4172-897f-f51dd957e2de\") " pod="openshift-marketplace/community-operators-p5m9t" Dec 05 21:00:05 crc kubenswrapper[4904]: I1205 21:00:05.903972 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n627w\" (UniqueName: \"kubernetes.io/projected/84d152ab-956d-4172-897f-f51dd957e2de-kube-api-access-n627w\") pod \"community-operators-p5m9t\" (UID: \"84d152ab-956d-4172-897f-f51dd957e2de\") " pod="openshift-marketplace/community-operators-p5m9t" Dec 05 21:00:05 crc kubenswrapper[4904]: I1205 21:00:05.962276 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p5m9t" Dec 05 21:00:06 crc kubenswrapper[4904]: I1205 21:00:06.522093 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p5m9t"] Dec 05 21:00:06 crc kubenswrapper[4904]: I1205 21:00:06.576514 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p5m9t" event={"ID":"84d152ab-956d-4172-897f-f51dd957e2de","Type":"ContainerStarted","Data":"2c85be7549e1614f90ba5c4c38235bf722f4e63235a0b2f2ea49188be21e866d"} Dec 05 21:00:07 crc kubenswrapper[4904]: I1205 21:00:07.589944 4904 generic.go:334] "Generic (PLEG): container finished" podID="84d152ab-956d-4172-897f-f51dd957e2de" containerID="98399cdba85eba10f33212811bebc69a4fbf0c9d135d9e021b17cc130d926eca" exitCode=0 Dec 05 21:00:07 crc kubenswrapper[4904]: I1205 21:00:07.590135 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p5m9t" event={"ID":"84d152ab-956d-4172-897f-f51dd957e2de","Type":"ContainerDied","Data":"98399cdba85eba10f33212811bebc69a4fbf0c9d135d9e021b17cc130d926eca"} Dec 05 21:00:07 crc kubenswrapper[4904]: I1205 21:00:07.592664 4904 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 21:00:08 crc kubenswrapper[4904]: I1205 21:00:08.604283 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p5m9t" event={"ID":"84d152ab-956d-4172-897f-f51dd957e2de","Type":"ContainerStarted","Data":"d02f703889d363e72373c8fb4a124547465c83c6d211a4fa85a098f493634df0"} Dec 05 21:00:09 crc kubenswrapper[4904]: I1205 21:00:09.617736 4904 generic.go:334] "Generic (PLEG): container finished" podID="84d152ab-956d-4172-897f-f51dd957e2de" containerID="d02f703889d363e72373c8fb4a124547465c83c6d211a4fa85a098f493634df0" exitCode=0 Dec 05 21:00:09 crc kubenswrapper[4904]: I1205 21:00:09.618034 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p5m9t" event={"ID":"84d152ab-956d-4172-897f-f51dd957e2de","Type":"ContainerDied","Data":"d02f703889d363e72373c8fb4a124547465c83c6d211a4fa85a098f493634df0"} Dec 05 21:00:10 crc kubenswrapper[4904]: I1205 21:00:10.629303 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p5m9t" event={"ID":"84d152ab-956d-4172-897f-f51dd957e2de","Type":"ContainerStarted","Data":"01414db944baa6048b9a7d7a112438627af4fcf11060c66f9d270415e8ae4dd7"} Dec 05 21:00:10 crc kubenswrapper[4904]: I1205 21:00:10.651575 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-p5m9t" podStartSLOduration=3.220178684 podStartE2EDuration="5.651557001s" 
Dec 05 21:00:15 crc kubenswrapper[4904]: I1205 21:00:15.962755 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-p5m9t"
Dec 05 21:00:15 crc kubenswrapper[4904]: I1205 21:00:15.963553 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-p5m9t"
Dec 05 21:00:16 crc kubenswrapper[4904]: I1205 21:00:16.058314 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-p5m9t"
Dec 05 21:00:16 crc kubenswrapper[4904]: I1205 21:00:16.736470 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-p5m9t"
Dec 05 21:00:16 crc kubenswrapper[4904]: I1205 21:00:16.799241 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p5m9t"]
Dec 05 21:00:17 crc kubenswrapper[4904]: I1205 21:00:17.730326 4904 generic.go:334] "Generic (PLEG): container finished" podID="adfcfa36-4dfd-422c-b1a5-3a2e342ea208" containerID="4b72bc1b1ab76573e7db7539d0f3c457f459b2e20eac1db357caead0fcc6ee4a" exitCode=0
Dec 05 21:00:17 crc kubenswrapper[4904]: I1205 21:00:17.746046 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k" event={"ID":"adfcfa36-4dfd-422c-b1a5-3a2e342ea208","Type":"ContainerDied","Data":"4b72bc1b1ab76573e7db7539d0f3c457f459b2e20eac1db357caead0fcc6ee4a"}
Dec 05 21:00:18 crc kubenswrapper[4904]: I1205 21:00:18.740668 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-p5m9t" podUID="84d152ab-956d-4172-897f-f51dd957e2de" containerName="registry-server" containerID="cri-o://01414db944baa6048b9a7d7a112438627af4fcf11060c66f9d270415e8ae4dd7" gracePeriod=2
Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.172797 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k"
Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.263097 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adfcfa36-4dfd-422c-b1a5-3a2e342ea208-telemetry-combined-ca-bundle\") pod \"adfcfa36-4dfd-422c-b1a5-3a2e342ea208\" (UID: \"adfcfa36-4dfd-422c-b1a5-3a2e342ea208\") "
Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.263327 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/adfcfa36-4dfd-422c-b1a5-3a2e342ea208-ceilometer-compute-config-data-2\") pod \"adfcfa36-4dfd-422c-b1a5-3a2e342ea208\" (UID: \"adfcfa36-4dfd-422c-b1a5-3a2e342ea208\") "
Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.263412 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/adfcfa36-4dfd-422c-b1a5-3a2e342ea208-ceilometer-compute-config-data-1\") pod \"adfcfa36-4dfd-422c-b1a5-3a2e342ea208\" (UID: \"adfcfa36-4dfd-422c-b1a5-3a2e342ea208\") "
Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.263465 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/adfcfa36-4dfd-422c-b1a5-3a2e342ea208-inventory\") pod \"adfcfa36-4dfd-422c-b1a5-3a2e342ea208\" (UID: \"adfcfa36-4dfd-422c-b1a5-3a2e342ea208\") "
Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.263549 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/adfcfa36-4dfd-422c-b1a5-3a2e342ea208-ssh-key\") pod \"adfcfa36-4dfd-422c-b1a5-3a2e342ea208\" (UID: \"adfcfa36-4dfd-422c-b1a5-3a2e342ea208\") "
Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.263591 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/adfcfa36-4dfd-422c-b1a5-3a2e342ea208-ceilometer-compute-config-data-0\") pod \"adfcfa36-4dfd-422c-b1a5-3a2e342ea208\" (UID: \"adfcfa36-4dfd-422c-b1a5-3a2e342ea208\") "
Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.263735 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ct9l\" (UniqueName: \"kubernetes.io/projected/adfcfa36-4dfd-422c-b1a5-3a2e342ea208-kube-api-access-7ct9l\") pod \"adfcfa36-4dfd-422c-b1a5-3a2e342ea208\" (UID: \"adfcfa36-4dfd-422c-b1a5-3a2e342ea208\") "
Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.276205 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adfcfa36-4dfd-422c-b1a5-3a2e342ea208-kube-api-access-7ct9l" (OuterVolumeSpecName: "kube-api-access-7ct9l") pod "adfcfa36-4dfd-422c-b1a5-3a2e342ea208" (UID: "adfcfa36-4dfd-422c-b1a5-3a2e342ea208"). InnerVolumeSpecName "kube-api-access-7ct9l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.292631 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adfcfa36-4dfd-422c-b1a5-3a2e342ea208-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "adfcfa36-4dfd-422c-b1a5-3a2e342ea208" (UID: "adfcfa36-4dfd-422c-b1a5-3a2e342ea208"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.299035 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adfcfa36-4dfd-422c-b1a5-3a2e342ea208-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "adfcfa36-4dfd-422c-b1a5-3a2e342ea208" (UID: "adfcfa36-4dfd-422c-b1a5-3a2e342ea208"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.302387 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adfcfa36-4dfd-422c-b1a5-3a2e342ea208-inventory" (OuterVolumeSpecName: "inventory") pod "adfcfa36-4dfd-422c-b1a5-3a2e342ea208" (UID: "adfcfa36-4dfd-422c-b1a5-3a2e342ea208"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.312010 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adfcfa36-4dfd-422c-b1a5-3a2e342ea208-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "adfcfa36-4dfd-422c-b1a5-3a2e342ea208" (UID: "adfcfa36-4dfd-422c-b1a5-3a2e342ea208"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.313948 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adfcfa36-4dfd-422c-b1a5-3a2e342ea208-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "adfcfa36-4dfd-422c-b1a5-3a2e342ea208" (UID: "adfcfa36-4dfd-422c-b1a5-3a2e342ea208"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.314234 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adfcfa36-4dfd-422c-b1a5-3a2e342ea208-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "adfcfa36-4dfd-422c-b1a5-3a2e342ea208" (UID: "adfcfa36-4dfd-422c-b1a5-3a2e342ea208"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.366201 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ct9l\" (UniqueName: \"kubernetes.io/projected/adfcfa36-4dfd-422c-b1a5-3a2e342ea208-kube-api-access-7ct9l\") on node \"crc\" DevicePath \"\""
Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.366236 4904 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adfcfa36-4dfd-422c-b1a5-3a2e342ea208-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.366247 4904 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/adfcfa36-4dfd-422c-b1a5-3a2e342ea208-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\""
Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.366257 4904 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/adfcfa36-4dfd-422c-b1a5-3a2e342ea208-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\""
Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.366269 4904 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/adfcfa36-4dfd-422c-b1a5-3a2e342ea208-inventory\") on node \"crc\" DevicePath \"\""
Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.366278 4904 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/adfcfa36-4dfd-422c-b1a5-3a2e342ea208-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.366286 4904 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/adfcfa36-4dfd-422c-b1a5-3a2e342ea208-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\""
Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.600490 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p5m9t"
Need to start a new one" pod="openshift-marketplace/community-operators-p5m9t" Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.670825 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84d152ab-956d-4172-897f-f51dd957e2de-catalog-content\") pod \"84d152ab-956d-4172-897f-f51dd957e2de\" (UID: \"84d152ab-956d-4172-897f-f51dd957e2de\") " Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.671114 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n627w\" (UniqueName: \"kubernetes.io/projected/84d152ab-956d-4172-897f-f51dd957e2de-kube-api-access-n627w\") pod \"84d152ab-956d-4172-897f-f51dd957e2de\" (UID: \"84d152ab-956d-4172-897f-f51dd957e2de\") " Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.671166 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84d152ab-956d-4172-897f-f51dd957e2de-utilities\") pod \"84d152ab-956d-4172-897f-f51dd957e2de\" (UID: \"84d152ab-956d-4172-897f-f51dd957e2de\") " Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.671918 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84d152ab-956d-4172-897f-f51dd957e2de-utilities" (OuterVolumeSpecName: "utilities") pod "84d152ab-956d-4172-897f-f51dd957e2de" (UID: "84d152ab-956d-4172-897f-f51dd957e2de"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.677283 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84d152ab-956d-4172-897f-f51dd957e2de-kube-api-access-n627w" (OuterVolumeSpecName: "kube-api-access-n627w") pod "84d152ab-956d-4172-897f-f51dd957e2de" (UID: "84d152ab-956d-4172-897f-f51dd957e2de"). InnerVolumeSpecName "kube-api-access-n627w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.722734 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84d152ab-956d-4172-897f-f51dd957e2de-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "84d152ab-956d-4172-897f-f51dd957e2de" (UID: "84d152ab-956d-4172-897f-f51dd957e2de"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.751619 4904 generic.go:334] "Generic (PLEG): container finished" podID="84d152ab-956d-4172-897f-f51dd957e2de" containerID="01414db944baa6048b9a7d7a112438627af4fcf11060c66f9d270415e8ae4dd7" exitCode=0 Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.751673 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p5m9t" event={"ID":"84d152ab-956d-4172-897f-f51dd957e2de","Type":"ContainerDied","Data":"01414db944baa6048b9a7d7a112438627af4fcf11060c66f9d270415e8ae4dd7"} Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.751696 4904 util.go:48] "No ready sandbox for pod can be found. 
Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.751696 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p5m9t"
Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.751716 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p5m9t" event={"ID":"84d152ab-956d-4172-897f-f51dd957e2de","Type":"ContainerDied","Data":"2c85be7549e1614f90ba5c4c38235bf722f4e63235a0b2f2ea49188be21e866d"}
Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.751770 4904 scope.go:117] "RemoveContainer" containerID="01414db944baa6048b9a7d7a112438627af4fcf11060c66f9d270415e8ae4dd7"
Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.754531 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k" event={"ID":"adfcfa36-4dfd-422c-b1a5-3a2e342ea208","Type":"ContainerDied","Data":"a008c557ee3b9766cd6e584f87b94949d5e1dce46d77913af97e3149cc38fbd2"}
Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.754555 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a008c557ee3b9766cd6e584f87b94949d5e1dce46d77913af97e3149cc38fbd2"
Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.754603 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k"
Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.774598 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n627w\" (UniqueName: \"kubernetes.io/projected/84d152ab-956d-4172-897f-f51dd957e2de-kube-api-access-n627w\") on node \"crc\" DevicePath \"\""
Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.775334 4904 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84d152ab-956d-4172-897f-f51dd957e2de-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.775349 4904 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84d152ab-956d-4172-897f-f51dd957e2de-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.785513 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p5m9t"]
Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.794221 4904 scope.go:117] "RemoveContainer" containerID="d02f703889d363e72373c8fb4a124547465c83c6d211a4fa85a098f493634df0"
Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.795117 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-p5m9t"]
Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.818032 4904 scope.go:117] "RemoveContainer" containerID="98399cdba85eba10f33212811bebc69a4fbf0c9d135d9e021b17cc130d926eca"
Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.838304 4904 scope.go:117] "RemoveContainer" containerID="01414db944baa6048b9a7d7a112438627af4fcf11060c66f9d270415e8ae4dd7"
Dec 05 21:00:19 crc kubenswrapper[4904]: E1205 21:00:19.838756 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01414db944baa6048b9a7d7a112438627af4fcf11060c66f9d270415e8ae4dd7\": container with ID starting with 01414db944baa6048b9a7d7a112438627af4fcf11060c66f9d270415e8ae4dd7 not found: ID does not exist" containerID="01414db944baa6048b9a7d7a112438627af4fcf11060c66f9d270415e8ae4dd7"
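The "Generic (PLEG): container finished" and "SyncLoop (PLEG): event for pod" records above come from the pod lifecycle event generator, which periodically relists container states from CRI-O and turns state transitions into events for the sync loop. A rough sketch of the relist-and-diff idea, with assumed, simplified types (the real PLEG also tracks sandboxes and per-pod records):

    package main

    import "fmt"

    type containerState string

    const (
        stateRunning containerState = "running"
        stateExited  containerState = "exited"
    )

    type event struct {
        Pod, ContainerID, Type string
    }

    // relist diffs two snapshots of container states and emits lifecycle
    // events, e.g. ContainerDied for a running -> exited transition.
    func relist(prev, curr map[string]containerState, pod string) []event {
        var events []event
        for id, now := range curr {
            before, seen := prev[id]
            switch {
            case !seen && now == stateRunning:
                events = append(events, event{pod, id, "ContainerStarted"})
            case seen && before == stateRunning && now == stateExited:
                events = append(events, event{pod, id, "ContainerDied"})
            }
        }
        return events
    }

    func main() {
        prev := map[string]containerState{"01414db944ba": stateRunning}
        curr := map[string]containerState{"01414db944ba": stateExited}
        for _, e := range relist(prev, curr, "openshift-marketplace/community-operators-p5m9t") {
            fmt.Printf("SyncLoop (PLEG): event for pod %q: %s %s\n", e.Pod, e.Type, e.ContainerID)
        }
    }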
Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.838795 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01414db944baa6048b9a7d7a112438627af4fcf11060c66f9d270415e8ae4dd7"} err="failed to get container status \"01414db944baa6048b9a7d7a112438627af4fcf11060c66f9d270415e8ae4dd7\": rpc error: code = NotFound desc = could not find container \"01414db944baa6048b9a7d7a112438627af4fcf11060c66f9d270415e8ae4dd7\": container with ID starting with 01414db944baa6048b9a7d7a112438627af4fcf11060c66f9d270415e8ae4dd7 not found: ID does not exist"
Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.838820 4904 scope.go:117] "RemoveContainer" containerID="d02f703889d363e72373c8fb4a124547465c83c6d211a4fa85a098f493634df0"
Dec 05 21:00:19 crc kubenswrapper[4904]: E1205 21:00:19.839263 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d02f703889d363e72373c8fb4a124547465c83c6d211a4fa85a098f493634df0\": container with ID starting with d02f703889d363e72373c8fb4a124547465c83c6d211a4fa85a098f493634df0 not found: ID does not exist" containerID="d02f703889d363e72373c8fb4a124547465c83c6d211a4fa85a098f493634df0"
Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.839298 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d02f703889d363e72373c8fb4a124547465c83c6d211a4fa85a098f493634df0"} err="failed to get container status \"d02f703889d363e72373c8fb4a124547465c83c6d211a4fa85a098f493634df0\": rpc error: code = NotFound desc = could not find container \"d02f703889d363e72373c8fb4a124547465c83c6d211a4fa85a098f493634df0\": container with ID starting with d02f703889d363e72373c8fb4a124547465c83c6d211a4fa85a098f493634df0 not found: ID does not exist"
Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.839320 4904 scope.go:117] "RemoveContainer" containerID="98399cdba85eba10f33212811bebc69a4fbf0c9d135d9e021b17cc130d926eca"
Dec 05 21:00:19 crc kubenswrapper[4904]: E1205 21:00:19.839929 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98399cdba85eba10f33212811bebc69a4fbf0c9d135d9e021b17cc130d926eca\": container with ID starting with 98399cdba85eba10f33212811bebc69a4fbf0c9d135d9e021b17cc130d926eca not found: ID does not exist" containerID="98399cdba85eba10f33212811bebc69a4fbf0c9d135d9e021b17cc130d926eca"
Dec 05 21:00:19 crc kubenswrapper[4904]: I1205 21:00:19.839981 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98399cdba85eba10f33212811bebc69a4fbf0c9d135d9e021b17cc130d926eca"} err="failed to get container status \"98399cdba85eba10f33212811bebc69a4fbf0c9d135d9e021b17cc130d926eca\": rpc error: code = NotFound desc = could not find container \"98399cdba85eba10f33212811bebc69a4fbf0c9d135d9e021b17cc130d926eca\": container with ID starting with 98399cdba85eba10f33212811bebc69a4fbf0c9d135d9e021b17cc130d926eca not found: ID does not exist"
Dec 05 21:00:20 crc kubenswrapper[4904]: I1205 21:00:20.158044 4904 scope.go:117] "RemoveContainer" containerID="343df926b25a4e8e58ab4ee6a22a996cc3bc6c6988a24dc4b0ef6591839eaebe"
Dec 05 21:00:21 crc kubenswrapper[4904]: I1205 21:00:21.696526 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84d152ab-956d-4172-897f-f51dd957e2de" path="/var/lib/kubelet/pods/84d152ab-956d-4172-897f-f51dd957e2de/volumes"
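The paired "ContainerStatus from runtime service failed ... NotFound" and "DeleteContainer returned error" records above are the benign side of idempotent cleanup: the container was already removed, so the status lookup fails with NotFound and the kubelet logs it and moves on rather than retrying. A small sketch of that treat-NotFound-as-done pattern (the sentinel error and helper names are illustrative, not CRI's real API):

    package main

    import (
        "errors"
        "fmt"
    )

    // errNotFound plays the role of the runtime's NotFound RPC status.
    var errNotFound = errors.New("NotFound: ID does not exist")

    // containerStatus stands in for the CRI ContainerStatus call.
    func containerStatus(id string, live map[string]bool) error {
        if !live[id] {
            return errNotFound
        }
        return nil
    }

    // removeContainer deletes a container, treating "already gone" as
    // success so that repeated cleanup attempts stay idempotent.
    func removeContainer(id string, live map[string]bool) {
        if err := containerStatus(id, live); err != nil {
            if errors.Is(err, errNotFound) {
                fmt.Printf("container %q not found, nothing to remove\n", id)
                return // the desired end state already holds
            }
            fmt.Printf("status check failed for %q: %v\n", id, err)
            return
        }
        delete(live, id)
        fmt.Printf("removed container %q\n", id)
    }

    func main() {
        live := map[string]bool{} // the container was already removed earlier
        removeContainer("01414db944ba", live)
    }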
Dec 05 21:00:29 crc kubenswrapper[4904]: I1205 21:00:29.956036 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 21:00:29 crc kubenswrapper[4904]: I1205 21:00:29.956758 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 21:00:29 crc kubenswrapper[4904]: I1205 21:00:29.956837 4904 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h"
Dec 05 21:00:29 crc kubenswrapper[4904]: I1205 21:00:29.958134 4904 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d929e21438f177c4fed1d386cdca4a13bf9f262bc1b1ea9e1ccc79cc3095e408"} pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 05 21:00:29 crc kubenswrapper[4904]: I1205 21:00:29.958247 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" containerID="cri-o://d929e21438f177c4fed1d386cdca4a13bf9f262bc1b1ea9e1ccc79cc3095e408" gracePeriod=600
Dec 05 21:00:30 crc kubenswrapper[4904]: I1205 21:00:30.870338 4904 generic.go:334] "Generic (PLEG): container finished" podID="1cc24b64-e25f-4b55-9123-295388685e7a" containerID="d929e21438f177c4fed1d386cdca4a13bf9f262bc1b1ea9e1ccc79cc3095e408" exitCode=0
Dec 05 21:00:30 crc kubenswrapper[4904]: I1205 21:00:30.870409 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" event={"ID":"1cc24b64-e25f-4b55-9123-295388685e7a","Type":"ContainerDied","Data":"d929e21438f177c4fed1d386cdca4a13bf9f262bc1b1ea9e1ccc79cc3095e408"}
Dec 05 21:00:30 crc kubenswrapper[4904]: I1205 21:00:30.870819 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" event={"ID":"1cc24b64-e25f-4b55-9123-295388685e7a","Type":"ContainerStarted","Data":"80eef27667d8a24d40a49bd4a7aa8dff94a82a2199c9c1742d158b7a2afd2a6e"}
Dec 05 21:00:30 crc kubenswrapper[4904]: I1205 21:00:30.870841 4904 scope.go:117] "RemoveContainer" containerID="3ff9529892815f6974fa95e3944e54016af616270f5e88c8ba81e4e82b52d881"
Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.068918 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"]
Dec 05 21:00:55 crc kubenswrapper[4904]: E1205 21:00:55.069804 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d152ab-956d-4172-897f-f51dd957e2de" containerName="registry-server"
Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.069817 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d152ab-956d-4172-897f-f51dd957e2de" containerName="registry-server"
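The sequence above is the full liveness-probe restart path: the probe of http://127.0.0.1:8798/health fails with connection refused, the sync loop marks the container unhealthy, and the runtime kills it with gracePeriod=600 before the restart policy starts a replacement (the ContainerDied/ContainerStarted pair that follows). A minimal sketch of an HTTP liveness prober with a consecutive-failure threshold; the threshold and timing here are assumptions, not the machine-config-daemon's actual probe spec:

    package main

    import (
        "fmt"
        "net/http"
        "time"
    )

    const (
        healthURL        = "http://127.0.0.1:8798/health" // endpoint from the log above
        failureThreshold = 3                              // assumed, not the pod's real spec
    )

    // probeOnce performs a single HTTP liveness check.
    func probeOnce(client *http.Client) error {
        resp, err := client.Get(healthURL)
        if err != nil {
            return err // e.g. "connect: connection refused" as in the log
        }
        defer resp.Body.Close()
        if resp.StatusCode != http.StatusOK {
            return fmt.Errorf("unexpected status %d", resp.StatusCode)
        }
        return nil
    }

    func main() {
        client := &http.Client{Timeout: time.Second}
        failures := 0
        for i := 0; i < failureThreshold; i++ {
            if err := probeOnce(client); err != nil {
                failures++
                fmt.Printf("Probe failed: %v (consecutive: %d)\n", err, failures)
            } else {
                failures = 0
            }
            time.Sleep(100 * time.Millisecond) // stand-in for periodSeconds
        }
        if failures >= failureThreshold {
            // The kubelet's equivalent step: kill with a grace period and
            // let the restart policy start a replacement container.
            fmt.Println("container failed liveness probe, will be restarted")
        }
    }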
containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.069848 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="adfcfa36-4dfd-422c-b1a5-3a2e342ea208" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 05 21:00:55 crc kubenswrapper[4904]: E1205 21:00:55.069856 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d152ab-956d-4172-897f-f51dd957e2de" containerName="extract-utilities" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.069862 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d152ab-956d-4172-897f-f51dd957e2de" containerName="extract-utilities" Dec 05 21:00:55 crc kubenswrapper[4904]: E1205 21:00:55.069881 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d152ab-956d-4172-897f-f51dd957e2de" containerName="extract-content" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.069886 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d152ab-956d-4172-897f-f51dd957e2de" containerName="extract-content" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.070071 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="84d152ab-956d-4172-897f-f51dd957e2de" containerName="registry-server" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.070081 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="adfcfa36-4dfd-422c-b1a5-3a2e342ea208" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.071092 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.074997 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.082986 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.168635 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-nfs-0"] Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.170952 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-nfs-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.173436 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-nfs-config-data" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.179306 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-0"] Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.238167 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-nfs-2-0"] Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.239916 4904 util.go:30] "No sandbox for pod can be found. 
Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.239916 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-nfs-2-0"
Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.241163 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/cdca5ac3-4ef8-43e3-8244-438e43e029c4-etc-nvme\") pod \"cinder-backup-0\" (UID: \"cdca5ac3-4ef8-43e3-8244-438e43e029c4\") " pod="openstack/cinder-backup-0"
Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.241220 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdca5ac3-4ef8-43e3-8244-438e43e029c4-scripts\") pod \"cinder-backup-0\" (UID: \"cdca5ac3-4ef8-43e3-8244-438e43e029c4\") " pod="openstack/cinder-backup-0"
Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.241240 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/cdca5ac3-4ef8-43e3-8244-438e43e029c4-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"cdca5ac3-4ef8-43e3-8244-438e43e029c4\") " pod="openstack/cinder-backup-0"
Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.241259 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cdca5ac3-4ef8-43e3-8244-438e43e029c4-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"cdca5ac3-4ef8-43e3-8244-438e43e029c4\") " pod="openstack/cinder-backup-0"
Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.241295 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cdca5ac3-4ef8-43e3-8244-438e43e029c4-lib-modules\") pod \"cinder-backup-0\" (UID: \"cdca5ac3-4ef8-43e3-8244-438e43e029c4\") " pod="openstack/cinder-backup-0"
Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.241317 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cdca5ac3-4ef8-43e3-8244-438e43e029c4-run\") pod \"cinder-backup-0\" (UID: \"cdca5ac3-4ef8-43e3-8244-438e43e029c4\") " pod="openstack/cinder-backup-0"
Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.241334 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdca5ac3-4ef8-43e3-8244-438e43e029c4-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"cdca5ac3-4ef8-43e3-8244-438e43e029c4\") " pod="openstack/cinder-backup-0"
Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.241358 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/cdca5ac3-4ef8-43e3-8244-438e43e029c4-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"cdca5ac3-4ef8-43e3-8244-438e43e029c4\") " pod="openstack/cinder-backup-0"
Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.241376 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cdca5ac3-4ef8-43e3-8244-438e43e029c4-sys\") pod \"cinder-backup-0\" (UID: \"cdca5ac3-4ef8-43e3-8244-438e43e029c4\") " pod="openstack/cinder-backup-0"
Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.241419 4904 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdca5ac3-4ef8-43e3-8244-438e43e029c4-config-data\") pod \"cinder-backup-0\" (UID: \"cdca5ac3-4ef8-43e3-8244-438e43e029c4\") " pod="openstack/cinder-backup-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.241447 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/cdca5ac3-4ef8-43e3-8244-438e43e029c4-dev\") pod \"cinder-backup-0\" (UID: \"cdca5ac3-4ef8-43e3-8244-438e43e029c4\") " pod="openstack/cinder-backup-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.241476 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6868\" (UniqueName: \"kubernetes.io/projected/cdca5ac3-4ef8-43e3-8244-438e43e029c4-kube-api-access-w6868\") pod \"cinder-backup-0\" (UID: \"cdca5ac3-4ef8-43e3-8244-438e43e029c4\") " pod="openstack/cinder-backup-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.241520 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/cdca5ac3-4ef8-43e3-8244-438e43e029c4-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"cdca5ac3-4ef8-43e3-8244-438e43e029c4\") " pod="openstack/cinder-backup-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.241555 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/cdca5ac3-4ef8-43e3-8244-438e43e029c4-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"cdca5ac3-4ef8-43e3-8244-438e43e029c4\") " pod="openstack/cinder-backup-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.241652 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cdca5ac3-4ef8-43e3-8244-438e43e029c4-config-data-custom\") pod \"cinder-backup-0\" (UID: \"cdca5ac3-4ef8-43e3-8244-438e43e029c4\") " pod="openstack/cinder-backup-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.243223 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-nfs-2-config-data" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.260855 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-2-0"] Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.344019 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/cdca5ac3-4ef8-43e3-8244-438e43e029c4-etc-nvme\") pod \"cinder-backup-0\" (UID: \"cdca5ac3-4ef8-43e3-8244-438e43e029c4\") " pod="openstack/cinder-backup-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.344105 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b2237f58-91be-4eae-9feb-94feacffd4a6-etc-machine-id\") pod \"cinder-volume-nfs-0\" (UID: \"b2237f58-91be-4eae-9feb-94feacffd4a6\") " pod="openstack/cinder-volume-nfs-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.344132 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/b3d4e158-45e3-4448-ad27-36e4aa3cb002-var-locks-cinder\") pod 
\"cinder-volume-nfs-2-0\" (UID: \"b3d4e158-45e3-4448-ad27-36e4aa3cb002\") " pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.344172 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3d4e158-45e3-4448-ad27-36e4aa3cb002-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: \"b3d4e158-45e3-4448-ad27-36e4aa3cb002\") " pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.344199 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtjkf\" (UniqueName: \"kubernetes.io/projected/b2237f58-91be-4eae-9feb-94feacffd4a6-kube-api-access-wtjkf\") pod \"cinder-volume-nfs-0\" (UID: \"b2237f58-91be-4eae-9feb-94feacffd4a6\") " pod="openstack/cinder-volume-nfs-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.344466 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/cdca5ac3-4ef8-43e3-8244-438e43e029c4-etc-nvme\") pod \"cinder-backup-0\" (UID: \"cdca5ac3-4ef8-43e3-8244-438e43e029c4\") " pod="openstack/cinder-backup-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.345399 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdca5ac3-4ef8-43e3-8244-438e43e029c4-scripts\") pod \"cinder-backup-0\" (UID: \"cdca5ac3-4ef8-43e3-8244-438e43e029c4\") " pod="openstack/cinder-backup-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.345443 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b2237f58-91be-4eae-9feb-94feacffd4a6-lib-modules\") pod \"cinder-volume-nfs-0\" (UID: \"b2237f58-91be-4eae-9feb-94feacffd4a6\") " pod="openstack/cinder-volume-nfs-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.345485 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/cdca5ac3-4ef8-43e3-8244-438e43e029c4-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"cdca5ac3-4ef8-43e3-8244-438e43e029c4\") " pod="openstack/cinder-backup-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.345535 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b2237f58-91be-4eae-9feb-94feacffd4a6-run\") pod \"cinder-volume-nfs-0\" (UID: \"b2237f58-91be-4eae-9feb-94feacffd4a6\") " pod="openstack/cinder-volume-nfs-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.345572 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cdca5ac3-4ef8-43e3-8244-438e43e029c4-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"cdca5ac3-4ef8-43e3-8244-438e43e029c4\") " pod="openstack/cinder-backup-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.345604 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b3d4e158-45e3-4448-ad27-36e4aa3cb002-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: \"b3d4e158-45e3-4448-ad27-36e4aa3cb002\") " pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.345648 4904 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b3d4e158-45e3-4448-ad27-36e4aa3cb002-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"b3d4e158-45e3-4448-ad27-36e4aa3cb002\") " pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.345690 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/b2237f58-91be-4eae-9feb-94feacffd4a6-var-locks-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"b2237f58-91be-4eae-9feb-94feacffd4a6\") " pod="openstack/cinder-volume-nfs-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.345719 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b2237f58-91be-4eae-9feb-94feacffd4a6-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: \"b2237f58-91be-4eae-9feb-94feacffd4a6\") " pod="openstack/cinder-volume-nfs-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.345753 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b3d4e158-45e3-4448-ad27-36e4aa3cb002-config-data-custom\") pod \"cinder-volume-nfs-2-0\" (UID: \"b3d4e158-45e3-4448-ad27-36e4aa3cb002\") " pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.345804 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cdca5ac3-4ef8-43e3-8244-438e43e029c4-lib-modules\") pod \"cinder-backup-0\" (UID: \"cdca5ac3-4ef8-43e3-8244-438e43e029c4\") " pod="openstack/cinder-backup-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.345854 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b2237f58-91be-4eae-9feb-94feacffd4a6-dev\") pod \"cinder-volume-nfs-0\" (UID: \"b2237f58-91be-4eae-9feb-94feacffd4a6\") " pod="openstack/cinder-volume-nfs-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.345884 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cdca5ac3-4ef8-43e3-8244-438e43e029c4-run\") pod \"cinder-backup-0\" (UID: \"cdca5ac3-4ef8-43e3-8244-438e43e029c4\") " pod="openstack/cinder-backup-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.345910 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdca5ac3-4ef8-43e3-8244-438e43e029c4-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"cdca5ac3-4ef8-43e3-8244-438e43e029c4\") " pod="openstack/cinder-backup-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.345944 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b3d4e158-45e3-4448-ad27-36e4aa3cb002-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: \"b3d4e158-45e3-4448-ad27-36e4aa3cb002\") " pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.345985 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/cdca5ac3-4ef8-43e3-8244-438e43e029c4-etc-iscsi\") pod 
\"cinder-backup-0\" (UID: \"cdca5ac3-4ef8-43e3-8244-438e43e029c4\") " pod="openstack/cinder-backup-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.346015 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cdca5ac3-4ef8-43e3-8244-438e43e029c4-sys\") pod \"cinder-backup-0\" (UID: \"cdca5ac3-4ef8-43e3-8244-438e43e029c4\") " pod="openstack/cinder-backup-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.346045 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2237f58-91be-4eae-9feb-94feacffd4a6-config-data\") pod \"cinder-volume-nfs-0\" (UID: \"b2237f58-91be-4eae-9feb-94feacffd4a6\") " pod="openstack/cinder-volume-nfs-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.346104 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b3d4e158-45e3-4448-ad27-36e4aa3cb002-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: \"b3d4e158-45e3-4448-ad27-36e4aa3cb002\") " pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.346130 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b2237f58-91be-4eae-9feb-94feacffd4a6-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: \"b2237f58-91be-4eae-9feb-94feacffd4a6\") " pod="openstack/cinder-volume-nfs-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.346183 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/b3d4e158-45e3-4448-ad27-36e4aa3cb002-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"b3d4e158-45e3-4448-ad27-36e4aa3cb002\") " pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.346209 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdca5ac3-4ef8-43e3-8244-438e43e029c4-config-data\") pod \"cinder-backup-0\" (UID: \"cdca5ac3-4ef8-43e3-8244-438e43e029c4\") " pod="openstack/cinder-backup-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.346232 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3d4e158-45e3-4448-ad27-36e4aa3cb002-combined-ca-bundle\") pod \"cinder-volume-nfs-2-0\" (UID: \"b3d4e158-45e3-4448-ad27-36e4aa3cb002\") " pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.346257 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3d4e158-45e3-4448-ad27-36e4aa3cb002-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: \"b3d4e158-45e3-4448-ad27-36e4aa3cb002\") " pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.346319 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/cdca5ac3-4ef8-43e3-8244-438e43e029c4-dev\") pod \"cinder-backup-0\" (UID: \"cdca5ac3-4ef8-43e3-8244-438e43e029c4\") " pod="openstack/cinder-backup-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.346360 4904 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cdca5ac3-4ef8-43e3-8244-438e43e029c4-run\") pod \"cinder-backup-0\" (UID: \"cdca5ac3-4ef8-43e3-8244-438e43e029c4\") " pod="openstack/cinder-backup-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.346398 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/cdca5ac3-4ef8-43e3-8244-438e43e029c4-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"cdca5ac3-4ef8-43e3-8244-438e43e029c4\") " pod="openstack/cinder-backup-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.346424 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/cdca5ac3-4ef8-43e3-8244-438e43e029c4-dev\") pod \"cinder-backup-0\" (UID: \"cdca5ac3-4ef8-43e3-8244-438e43e029c4\") " pod="openstack/cinder-backup-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.346444 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cdca5ac3-4ef8-43e3-8244-438e43e029c4-lib-modules\") pod \"cinder-backup-0\" (UID: \"cdca5ac3-4ef8-43e3-8244-438e43e029c4\") " pod="openstack/cinder-backup-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.346506 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cdca5ac3-4ef8-43e3-8244-438e43e029c4-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"cdca5ac3-4ef8-43e3-8244-438e43e029c4\") " pod="openstack/cinder-backup-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.346533 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cdca5ac3-4ef8-43e3-8244-438e43e029c4-sys\") pod \"cinder-backup-0\" (UID: \"cdca5ac3-4ef8-43e3-8244-438e43e029c4\") " pod="openstack/cinder-backup-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.346527 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2237f58-91be-4eae-9feb-94feacffd4a6-scripts\") pod \"cinder-volume-nfs-0\" (UID: \"b2237f58-91be-4eae-9feb-94feacffd4a6\") " pod="openstack/cinder-volume-nfs-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.346605 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/cdca5ac3-4ef8-43e3-8244-438e43e029c4-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"cdca5ac3-4ef8-43e3-8244-438e43e029c4\") " pod="openstack/cinder-backup-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.346750 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6868\" (UniqueName: \"kubernetes.io/projected/cdca5ac3-4ef8-43e3-8244-438e43e029c4-kube-api-access-w6868\") pod \"cinder-backup-0\" (UID: \"cdca5ac3-4ef8-43e3-8244-438e43e029c4\") " pod="openstack/cinder-backup-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.346780 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/cdca5ac3-4ef8-43e3-8244-438e43e029c4-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"cdca5ac3-4ef8-43e3-8244-438e43e029c4\") " pod="openstack/cinder-backup-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.346797 4904 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b2237f58-91be-4eae-9feb-94feacffd4a6-var-locks-brick\") pod \"cinder-volume-nfs-0\" (UID: \"b2237f58-91be-4eae-9feb-94feacffd4a6\") " pod="openstack/cinder-volume-nfs-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.346819 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b3d4e158-45e3-4448-ad27-36e4aa3cb002-dev\") pod \"cinder-volume-nfs-2-0\" (UID: \"b3d4e158-45e3-4448-ad27-36e4aa3cb002\") " pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.346870 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2237f58-91be-4eae-9feb-94feacffd4a6-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"b2237f58-91be-4eae-9feb-94feacffd4a6\") " pod="openstack/cinder-volume-nfs-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.346903 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/cdca5ac3-4ef8-43e3-8244-438e43e029c4-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"cdca5ac3-4ef8-43e3-8244-438e43e029c4\") " pod="openstack/cinder-backup-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.346919 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b2237f58-91be-4eae-9feb-94feacffd4a6-sys\") pod \"cinder-volume-nfs-0\" (UID: \"b2237f58-91be-4eae-9feb-94feacffd4a6\") " pod="openstack/cinder-volume-nfs-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.347003 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/b2237f58-91be-4eae-9feb-94feacffd4a6-var-lib-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"b2237f58-91be-4eae-9feb-94feacffd4a6\") " pod="openstack/cinder-volume-nfs-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.347093 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cdca5ac3-4ef8-43e3-8244-438e43e029c4-config-data-custom\") pod \"cinder-backup-0\" (UID: \"cdca5ac3-4ef8-43e3-8244-438e43e029c4\") " pod="openstack/cinder-backup-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.347110 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b3d4e158-45e3-4448-ad27-36e4aa3cb002-run\") pod \"cinder-volume-nfs-2-0\" (UID: \"b3d4e158-45e3-4448-ad27-36e4aa3cb002\") " pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.347126 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b3d4e158-45e3-4448-ad27-36e4aa3cb002-sys\") pod \"cinder-volume-nfs-2-0\" (UID: \"b3d4e158-45e3-4448-ad27-36e4aa3cb002\") " pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.347152 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpdqs\" (UniqueName: 
\"kubernetes.io/projected/b3d4e158-45e3-4448-ad27-36e4aa3cb002-kube-api-access-vpdqs\") pod \"cinder-volume-nfs-2-0\" (UID: \"b3d4e158-45e3-4448-ad27-36e4aa3cb002\") " pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.347181 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b2237f58-91be-4eae-9feb-94feacffd4a6-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"b2237f58-91be-4eae-9feb-94feacffd4a6\") " pod="openstack/cinder-volume-nfs-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.347219 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b3d4e158-45e3-4448-ad27-36e4aa3cb002-etc-nvme\") pod \"cinder-volume-nfs-2-0\" (UID: \"b3d4e158-45e3-4448-ad27-36e4aa3cb002\") " pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.347627 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/cdca5ac3-4ef8-43e3-8244-438e43e029c4-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"cdca5ac3-4ef8-43e3-8244-438e43e029c4\") " pod="openstack/cinder-backup-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.347730 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/cdca5ac3-4ef8-43e3-8244-438e43e029c4-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"cdca5ac3-4ef8-43e3-8244-438e43e029c4\") " pod="openstack/cinder-backup-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.352647 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdca5ac3-4ef8-43e3-8244-438e43e029c4-scripts\") pod \"cinder-backup-0\" (UID: \"cdca5ac3-4ef8-43e3-8244-438e43e029c4\") " pod="openstack/cinder-backup-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.354259 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdca5ac3-4ef8-43e3-8244-438e43e029c4-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"cdca5ac3-4ef8-43e3-8244-438e43e029c4\") " pod="openstack/cinder-backup-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.357910 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdca5ac3-4ef8-43e3-8244-438e43e029c4-config-data\") pod \"cinder-backup-0\" (UID: \"cdca5ac3-4ef8-43e3-8244-438e43e029c4\") " pod="openstack/cinder-backup-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.358666 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cdca5ac3-4ef8-43e3-8244-438e43e029c4-config-data-custom\") pod \"cinder-backup-0\" (UID: \"cdca5ac3-4ef8-43e3-8244-438e43e029c4\") " pod="openstack/cinder-backup-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.366706 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6868\" (UniqueName: \"kubernetes.io/projected/cdca5ac3-4ef8-43e3-8244-438e43e029c4-kube-api-access-w6868\") pod \"cinder-backup-0\" (UID: \"cdca5ac3-4ef8-43e3-8244-438e43e029c4\") " pod="openstack/cinder-backup-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.403824 4904 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.448891 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b3d4e158-45e3-4448-ad27-36e4aa3cb002-config-data-custom\") pod \"cinder-volume-nfs-2-0\" (UID: \"b3d4e158-45e3-4448-ad27-36e4aa3cb002\") " pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.448954 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b2237f58-91be-4eae-9feb-94feacffd4a6-dev\") pod \"cinder-volume-nfs-0\" (UID: \"b2237f58-91be-4eae-9feb-94feacffd4a6\") " pod="openstack/cinder-volume-nfs-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.448983 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b3d4e158-45e3-4448-ad27-36e4aa3cb002-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: \"b3d4e158-45e3-4448-ad27-36e4aa3cb002\") " pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.449010 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2237f58-91be-4eae-9feb-94feacffd4a6-config-data\") pod \"cinder-volume-nfs-0\" (UID: \"b2237f58-91be-4eae-9feb-94feacffd4a6\") " pod="openstack/cinder-volume-nfs-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.449036 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b3d4e158-45e3-4448-ad27-36e4aa3cb002-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: \"b3d4e158-45e3-4448-ad27-36e4aa3cb002\") " pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.449053 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b2237f58-91be-4eae-9feb-94feacffd4a6-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: \"b2237f58-91be-4eae-9feb-94feacffd4a6\") " pod="openstack/cinder-volume-nfs-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.449099 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/b3d4e158-45e3-4448-ad27-36e4aa3cb002-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"b3d4e158-45e3-4448-ad27-36e4aa3cb002\") " pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.449118 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3d4e158-45e3-4448-ad27-36e4aa3cb002-combined-ca-bundle\") pod \"cinder-volume-nfs-2-0\" (UID: \"b3d4e158-45e3-4448-ad27-36e4aa3cb002\") " pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.449133 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3d4e158-45e3-4448-ad27-36e4aa3cb002-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: \"b3d4e158-45e3-4448-ad27-36e4aa3cb002\") " pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.449159 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2237f58-91be-4eae-9feb-94feacffd4a6-scripts\") pod \"cinder-volume-nfs-0\" (UID: \"b2237f58-91be-4eae-9feb-94feacffd4a6\") " pod="openstack/cinder-volume-nfs-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.449187 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b2237f58-91be-4eae-9feb-94feacffd4a6-var-locks-brick\") pod \"cinder-volume-nfs-0\" (UID: \"b2237f58-91be-4eae-9feb-94feacffd4a6\") " pod="openstack/cinder-volume-nfs-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.449201 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b3d4e158-45e3-4448-ad27-36e4aa3cb002-dev\") pod \"cinder-volume-nfs-2-0\" (UID: \"b3d4e158-45e3-4448-ad27-36e4aa3cb002\") " pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.449226 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2237f58-91be-4eae-9feb-94feacffd4a6-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"b2237f58-91be-4eae-9feb-94feacffd4a6\") " pod="openstack/cinder-volume-nfs-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.449245 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b2237f58-91be-4eae-9feb-94feacffd4a6-sys\") pod \"cinder-volume-nfs-0\" (UID: \"b2237f58-91be-4eae-9feb-94feacffd4a6\") " pod="openstack/cinder-volume-nfs-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.449261 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/b2237f58-91be-4eae-9feb-94feacffd4a6-var-lib-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"b2237f58-91be-4eae-9feb-94feacffd4a6\") " pod="openstack/cinder-volume-nfs-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.449285 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b3d4e158-45e3-4448-ad27-36e4aa3cb002-run\") pod \"cinder-volume-nfs-2-0\" (UID: \"b3d4e158-45e3-4448-ad27-36e4aa3cb002\") " pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.449299 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b3d4e158-45e3-4448-ad27-36e4aa3cb002-sys\") pod \"cinder-volume-nfs-2-0\" (UID: \"b3d4e158-45e3-4448-ad27-36e4aa3cb002\") " pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.449318 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpdqs\" (UniqueName: \"kubernetes.io/projected/b3d4e158-45e3-4448-ad27-36e4aa3cb002-kube-api-access-vpdqs\") pod \"cinder-volume-nfs-2-0\" (UID: \"b3d4e158-45e3-4448-ad27-36e4aa3cb002\") " pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.449333 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b2237f58-91be-4eae-9feb-94feacffd4a6-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"b2237f58-91be-4eae-9feb-94feacffd4a6\") " pod="openstack/cinder-volume-nfs-0" Dec 05 21:00:55 crc kubenswrapper[4904]: 
I1205 21:00:55.449355 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b3d4e158-45e3-4448-ad27-36e4aa3cb002-etc-nvme\") pod \"cinder-volume-nfs-2-0\" (UID: \"b3d4e158-45e3-4448-ad27-36e4aa3cb002\") " pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.449380 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b2237f58-91be-4eae-9feb-94feacffd4a6-etc-machine-id\") pod \"cinder-volume-nfs-0\" (UID: \"b2237f58-91be-4eae-9feb-94feacffd4a6\") " pod="openstack/cinder-volume-nfs-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.449395 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/b3d4e158-45e3-4448-ad27-36e4aa3cb002-var-locks-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"b3d4e158-45e3-4448-ad27-36e4aa3cb002\") " pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.449420 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3d4e158-45e3-4448-ad27-36e4aa3cb002-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: \"b3d4e158-45e3-4448-ad27-36e4aa3cb002\") " pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.449437 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtjkf\" (UniqueName: \"kubernetes.io/projected/b2237f58-91be-4eae-9feb-94feacffd4a6-kube-api-access-wtjkf\") pod \"cinder-volume-nfs-0\" (UID: \"b2237f58-91be-4eae-9feb-94feacffd4a6\") " pod="openstack/cinder-volume-nfs-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.449454 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b2237f58-91be-4eae-9feb-94feacffd4a6-lib-modules\") pod \"cinder-volume-nfs-0\" (UID: \"b2237f58-91be-4eae-9feb-94feacffd4a6\") " pod="openstack/cinder-volume-nfs-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.449471 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b2237f58-91be-4eae-9feb-94feacffd4a6-run\") pod \"cinder-volume-nfs-0\" (UID: \"b2237f58-91be-4eae-9feb-94feacffd4a6\") " pod="openstack/cinder-volume-nfs-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.449490 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b3d4e158-45e3-4448-ad27-36e4aa3cb002-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: \"b3d4e158-45e3-4448-ad27-36e4aa3cb002\") " pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.449538 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b3d4e158-45e3-4448-ad27-36e4aa3cb002-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"b3d4e158-45e3-4448-ad27-36e4aa3cb002\") " pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.449557 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/b2237f58-91be-4eae-9feb-94feacffd4a6-var-locks-cinder\") pod \"cinder-volume-nfs-0\" 
(UID: \"b2237f58-91be-4eae-9feb-94feacffd4a6\") " pod="openstack/cinder-volume-nfs-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.449573 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b2237f58-91be-4eae-9feb-94feacffd4a6-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: \"b2237f58-91be-4eae-9feb-94feacffd4a6\") " pod="openstack/cinder-volume-nfs-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.449657 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b2237f58-91be-4eae-9feb-94feacffd4a6-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: \"b2237f58-91be-4eae-9feb-94feacffd4a6\") " pod="openstack/cinder-volume-nfs-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.449690 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b3d4e158-45e3-4448-ad27-36e4aa3cb002-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: \"b3d4e158-45e3-4448-ad27-36e4aa3cb002\") " pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.449726 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b2237f58-91be-4eae-9feb-94feacffd4a6-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: \"b2237f58-91be-4eae-9feb-94feacffd4a6\") " pod="openstack/cinder-volume-nfs-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.449754 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/b3d4e158-45e3-4448-ad27-36e4aa3cb002-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"b3d4e158-45e3-4448-ad27-36e4aa3cb002\") " pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.449848 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b2237f58-91be-4eae-9feb-94feacffd4a6-dev\") pod \"cinder-volume-nfs-0\" (UID: \"b2237f58-91be-4eae-9feb-94feacffd4a6\") " pod="openstack/cinder-volume-nfs-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.450186 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b3d4e158-45e3-4448-ad27-36e4aa3cb002-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: \"b3d4e158-45e3-4448-ad27-36e4aa3cb002\") " pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.450954 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b2237f58-91be-4eae-9feb-94feacffd4a6-var-locks-brick\") pod \"cinder-volume-nfs-0\" (UID: \"b2237f58-91be-4eae-9feb-94feacffd4a6\") " pod="openstack/cinder-volume-nfs-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.451078 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b3d4e158-45e3-4448-ad27-36e4aa3cb002-etc-nvme\") pod \"cinder-volume-nfs-2-0\" (UID: \"b3d4e158-45e3-4448-ad27-36e4aa3cb002\") " pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.451129 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: 
\"kubernetes.io/host-path/b3d4e158-45e3-4448-ad27-36e4aa3cb002-var-locks-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"b3d4e158-45e3-4448-ad27-36e4aa3cb002\") " pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.451139 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b3d4e158-45e3-4448-ad27-36e4aa3cb002-run\") pod \"cinder-volume-nfs-2-0\" (UID: \"b3d4e158-45e3-4448-ad27-36e4aa3cb002\") " pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.451165 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b2237f58-91be-4eae-9feb-94feacffd4a6-sys\") pod \"cinder-volume-nfs-0\" (UID: \"b2237f58-91be-4eae-9feb-94feacffd4a6\") " pod="openstack/cinder-volume-nfs-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.451166 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b2237f58-91be-4eae-9feb-94feacffd4a6-etc-machine-id\") pod \"cinder-volume-nfs-0\" (UID: \"b2237f58-91be-4eae-9feb-94feacffd4a6\") " pod="openstack/cinder-volume-nfs-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.451183 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b3d4e158-45e3-4448-ad27-36e4aa3cb002-sys\") pod \"cinder-volume-nfs-2-0\" (UID: \"b3d4e158-45e3-4448-ad27-36e4aa3cb002\") " pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.451214 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/b2237f58-91be-4eae-9feb-94feacffd4a6-var-lib-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"b2237f58-91be-4eae-9feb-94feacffd4a6\") " pod="openstack/cinder-volume-nfs-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.451241 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b3d4e158-45e3-4448-ad27-36e4aa3cb002-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: \"b3d4e158-45e3-4448-ad27-36e4aa3cb002\") " pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.451262 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b2237f58-91be-4eae-9feb-94feacffd4a6-lib-modules\") pod \"cinder-volume-nfs-0\" (UID: \"b2237f58-91be-4eae-9feb-94feacffd4a6\") " pod="openstack/cinder-volume-nfs-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.451283 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b2237f58-91be-4eae-9feb-94feacffd4a6-run\") pod \"cinder-volume-nfs-0\" (UID: \"b2237f58-91be-4eae-9feb-94feacffd4a6\") " pod="openstack/cinder-volume-nfs-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.451282 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b3d4e158-45e3-4448-ad27-36e4aa3cb002-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"b3d4e158-45e3-4448-ad27-36e4aa3cb002\") " pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.451328 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: 
\"kubernetes.io/host-path/b2237f58-91be-4eae-9feb-94feacffd4a6-var-locks-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"b2237f58-91be-4eae-9feb-94feacffd4a6\") " pod="openstack/cinder-volume-nfs-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.451533 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b3d4e158-45e3-4448-ad27-36e4aa3cb002-dev\") pod \"cinder-volume-nfs-2-0\" (UID: \"b3d4e158-45e3-4448-ad27-36e4aa3cb002\") " pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.454243 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3d4e158-45e3-4448-ad27-36e4aa3cb002-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: \"b3d4e158-45e3-4448-ad27-36e4aa3cb002\") " pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.457122 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3d4e158-45e3-4448-ad27-36e4aa3cb002-combined-ca-bundle\") pod \"cinder-volume-nfs-2-0\" (UID: \"b3d4e158-45e3-4448-ad27-36e4aa3cb002\") " pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.457454 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2237f58-91be-4eae-9feb-94feacffd4a6-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"b2237f58-91be-4eae-9feb-94feacffd4a6\") " pod="openstack/cinder-volume-nfs-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.457990 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2237f58-91be-4eae-9feb-94feacffd4a6-config-data\") pod \"cinder-volume-nfs-0\" (UID: \"b2237f58-91be-4eae-9feb-94feacffd4a6\") " pod="openstack/cinder-volume-nfs-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.458153 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2237f58-91be-4eae-9feb-94feacffd4a6-scripts\") pod \"cinder-volume-nfs-0\" (UID: \"b2237f58-91be-4eae-9feb-94feacffd4a6\") " pod="openstack/cinder-volume-nfs-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.458690 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b2237f58-91be-4eae-9feb-94feacffd4a6-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"b2237f58-91be-4eae-9feb-94feacffd4a6\") " pod="openstack/cinder-volume-nfs-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.464704 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3d4e158-45e3-4448-ad27-36e4aa3cb002-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: \"b3d4e158-45e3-4448-ad27-36e4aa3cb002\") " pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.470531 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b3d4e158-45e3-4448-ad27-36e4aa3cb002-config-data-custom\") pod \"cinder-volume-nfs-2-0\" (UID: \"b3d4e158-45e3-4448-ad27-36e4aa3cb002\") " pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.471535 4904 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-vpdqs\" (UniqueName: \"kubernetes.io/projected/b3d4e158-45e3-4448-ad27-36e4aa3cb002-kube-api-access-vpdqs\") pod \"cinder-volume-nfs-2-0\" (UID: \"b3d4e158-45e3-4448-ad27-36e4aa3cb002\") " pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.472616 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtjkf\" (UniqueName: \"kubernetes.io/projected/b2237f58-91be-4eae-9feb-94feacffd4a6-kube-api-access-wtjkf\") pod \"cinder-volume-nfs-0\" (UID: \"b2237f58-91be-4eae-9feb-94feacffd4a6\") " pod="openstack/cinder-volume-nfs-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.505400 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-nfs-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.569719 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:00:55 crc kubenswrapper[4904]: I1205 21:00:55.994699 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Dec 05 21:00:56 crc kubenswrapper[4904]: I1205 21:00:56.211742 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"cdca5ac3-4ef8-43e3-8244-438e43e029c4","Type":"ContainerStarted","Data":"feb7d31601d818ac89a0f8b055adf8f5602ff2ca2a4d38ac74c25c6ea296e769"} Dec 05 21:00:56 crc kubenswrapper[4904]: I1205 21:00:56.219876 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-0"] Dec 05 21:00:56 crc kubenswrapper[4904]: W1205 21:00:56.228607 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2237f58_91be_4eae_9feb_94feacffd4a6.slice/crio-54614e93f16a887e7e14c530fd5abfdffc5a0a06e3860cf7becd4621153dabe1 WatchSource:0}: Error finding container 54614e93f16a887e7e14c530fd5abfdffc5a0a06e3860cf7becd4621153dabe1: Status 404 returned error can't find the container with id 54614e93f16a887e7e14c530fd5abfdffc5a0a06e3860cf7becd4621153dabe1 Dec 05 21:00:56 crc kubenswrapper[4904]: I1205 21:00:56.306783 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-2-0"] Dec 05 21:00:56 crc kubenswrapper[4904]: W1205 21:00:56.311109 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3d4e158_45e3_4448_ad27_36e4aa3cb002.slice/crio-933735df6f1dd8018eee32a5024c191a4892d16566003a92fa730c54e3a888f7 WatchSource:0}: Error finding container 933735df6f1dd8018eee32a5024c191a4892d16566003a92fa730c54e3a888f7: Status 404 returned error can't find the container with id 933735df6f1dd8018eee32a5024c191a4892d16566003a92fa730c54e3a888f7 Dec 05 21:00:57 crc kubenswrapper[4904]: I1205 21:00:57.224099 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-0" event={"ID":"b2237f58-91be-4eae-9feb-94feacffd4a6","Type":"ContainerStarted","Data":"ec899b6eb6deaa1dbfe7ce49c6bea976e7feda6289e08ca2ef44bb8c48ec7659"} Dec 05 21:00:57 crc kubenswrapper[4904]: I1205 21:00:57.224606 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-0" event={"ID":"b2237f58-91be-4eae-9feb-94feacffd4a6","Type":"ContainerStarted","Data":"dd8aa26862901602c204be029ecf7577e6b517113fdb599333983082baea327f"} Dec 05 21:00:57 crc kubenswrapper[4904]: I1205 21:00:57.224623 4904 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/cinder-volume-nfs-0" event={"ID":"b2237f58-91be-4eae-9feb-94feacffd4a6","Type":"ContainerStarted","Data":"54614e93f16a887e7e14c530fd5abfdffc5a0a06e3860cf7becd4621153dabe1"} Dec 05 21:00:57 crc kubenswrapper[4904]: I1205 21:00:57.226138 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" event={"ID":"b3d4e158-45e3-4448-ad27-36e4aa3cb002","Type":"ContainerStarted","Data":"b84246ba7e98fe7e1b8ae90b023bb5c6a2dd7d8e583fb8dcfad0b74c6671eb95"} Dec 05 21:00:57 crc kubenswrapper[4904]: I1205 21:00:57.226175 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" event={"ID":"b3d4e158-45e3-4448-ad27-36e4aa3cb002","Type":"ContainerStarted","Data":"2f7e1bc009efeabcae18e8f519475153e43ba4ac014eeeca03f3451a91de24c8"} Dec 05 21:00:57 crc kubenswrapper[4904]: I1205 21:00:57.226190 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" event={"ID":"b3d4e158-45e3-4448-ad27-36e4aa3cb002","Type":"ContainerStarted","Data":"933735df6f1dd8018eee32a5024c191a4892d16566003a92fa730c54e3a888f7"} Dec 05 21:00:57 crc kubenswrapper[4904]: I1205 21:00:57.229636 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"cdca5ac3-4ef8-43e3-8244-438e43e029c4","Type":"ContainerStarted","Data":"cd84f88cd4bc5ecf2e7e65ef10494052ab90364d5c86e5b4c8df0bb617ccf027"} Dec 05 21:00:57 crc kubenswrapper[4904]: I1205 21:00:57.229671 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"cdca5ac3-4ef8-43e3-8244-438e43e029c4","Type":"ContainerStarted","Data":"936013cfb1ce88b2404a41d745c0b479c36f90397a9f09b5028219161d506ca8"} Dec 05 21:00:57 crc kubenswrapper[4904]: I1205 21:00:57.254796 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-nfs-0" podStartSLOduration=1.866482447 podStartE2EDuration="2.254763764s" podCreationTimestamp="2025-12-05 21:00:55 +0000 UTC" firstStartedPulling="2025-12-05 21:00:56.230958947 +0000 UTC m=+2955.042175056" lastFinishedPulling="2025-12-05 21:00:56.619240264 +0000 UTC m=+2955.430456373" observedRunningTime="2025-12-05 21:00:57.247234781 +0000 UTC m=+2956.058450890" watchObservedRunningTime="2025-12-05 21:00:57.254763764 +0000 UTC m=+2956.065979873" Dec 05 21:00:57 crc kubenswrapper[4904]: I1205 21:00:57.284313 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-nfs-2-0" podStartSLOduration=1.979189246 podStartE2EDuration="2.28428981s" podCreationTimestamp="2025-12-05 21:00:55 +0000 UTC" firstStartedPulling="2025-12-05 21:00:56.314114219 +0000 UTC m=+2955.125330328" lastFinishedPulling="2025-12-05 21:00:56.619214783 +0000 UTC m=+2955.430430892" observedRunningTime="2025-12-05 21:00:57.272243085 +0000 UTC m=+2956.083459214" watchObservedRunningTime="2025-12-05 21:00:57.28428981 +0000 UTC m=+2956.095505929" Dec 05 21:00:57 crc kubenswrapper[4904]: I1205 21:00:57.301263 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=2.130203456 podStartE2EDuration="2.301243056s" podCreationTimestamp="2025-12-05 21:00:55 +0000 UTC" firstStartedPulling="2025-12-05 21:00:56.00030741 +0000 UTC m=+2954.811523549" lastFinishedPulling="2025-12-05 21:00:56.17134704 +0000 UTC m=+2954.982563149" observedRunningTime="2025-12-05 21:00:57.296378576 +0000 UTC m=+2956.107594695" watchObservedRunningTime="2025-12-05 21:00:57.301243056 +0000 UTC 
m=+2956.112459175" Dec 05 21:01:00 crc kubenswrapper[4904]: I1205 21:01:00.146242 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29416141-5f2p7"] Dec 05 21:01:00 crc kubenswrapper[4904]: I1205 21:01:00.148030 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29416141-5f2p7" Dec 05 21:01:00 crc kubenswrapper[4904]: I1205 21:01:00.161577 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29416141-5f2p7"] Dec 05 21:01:00 crc kubenswrapper[4904]: I1205 21:01:00.264759 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c392ddc8-695b-4543-a7f4-05ad75ff272b-fernet-keys\") pod \"keystone-cron-29416141-5f2p7\" (UID: \"c392ddc8-695b-4543-a7f4-05ad75ff272b\") " pod="openstack/keystone-cron-29416141-5f2p7" Dec 05 21:01:00 crc kubenswrapper[4904]: I1205 21:01:00.264983 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcdw5\" (UniqueName: \"kubernetes.io/projected/c392ddc8-695b-4543-a7f4-05ad75ff272b-kube-api-access-bcdw5\") pod \"keystone-cron-29416141-5f2p7\" (UID: \"c392ddc8-695b-4543-a7f4-05ad75ff272b\") " pod="openstack/keystone-cron-29416141-5f2p7" Dec 05 21:01:00 crc kubenswrapper[4904]: I1205 21:01:00.265019 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c392ddc8-695b-4543-a7f4-05ad75ff272b-config-data\") pod \"keystone-cron-29416141-5f2p7\" (UID: \"c392ddc8-695b-4543-a7f4-05ad75ff272b\") " pod="openstack/keystone-cron-29416141-5f2p7" Dec 05 21:01:00 crc kubenswrapper[4904]: I1205 21:01:00.265240 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c392ddc8-695b-4543-a7f4-05ad75ff272b-combined-ca-bundle\") pod \"keystone-cron-29416141-5f2p7\" (UID: \"c392ddc8-695b-4543-a7f4-05ad75ff272b\") " pod="openstack/keystone-cron-29416141-5f2p7" Dec 05 21:01:00 crc kubenswrapper[4904]: I1205 21:01:00.367434 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c392ddc8-695b-4543-a7f4-05ad75ff272b-fernet-keys\") pod \"keystone-cron-29416141-5f2p7\" (UID: \"c392ddc8-695b-4543-a7f4-05ad75ff272b\") " pod="openstack/keystone-cron-29416141-5f2p7" Dec 05 21:01:00 crc kubenswrapper[4904]: I1205 21:01:00.367575 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcdw5\" (UniqueName: \"kubernetes.io/projected/c392ddc8-695b-4543-a7f4-05ad75ff272b-kube-api-access-bcdw5\") pod \"keystone-cron-29416141-5f2p7\" (UID: \"c392ddc8-695b-4543-a7f4-05ad75ff272b\") " pod="openstack/keystone-cron-29416141-5f2p7" Dec 05 21:01:00 crc kubenswrapper[4904]: I1205 21:01:00.367606 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c392ddc8-695b-4543-a7f4-05ad75ff272b-config-data\") pod \"keystone-cron-29416141-5f2p7\" (UID: \"c392ddc8-695b-4543-a7f4-05ad75ff272b\") " pod="openstack/keystone-cron-29416141-5f2p7" Dec 05 21:01:00 crc kubenswrapper[4904]: I1205 21:01:00.367708 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c392ddc8-695b-4543-a7f4-05ad75ff272b-combined-ca-bundle\") pod \"keystone-cron-29416141-5f2p7\" (UID: \"c392ddc8-695b-4543-a7f4-05ad75ff272b\") " pod="openstack/keystone-cron-29416141-5f2p7" Dec 05 21:01:00 crc kubenswrapper[4904]: I1205 21:01:00.382299 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c392ddc8-695b-4543-a7f4-05ad75ff272b-fernet-keys\") pod \"keystone-cron-29416141-5f2p7\" (UID: \"c392ddc8-695b-4543-a7f4-05ad75ff272b\") " pod="openstack/keystone-cron-29416141-5f2p7" Dec 05 21:01:00 crc kubenswrapper[4904]: I1205 21:01:00.382345 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c392ddc8-695b-4543-a7f4-05ad75ff272b-combined-ca-bundle\") pod \"keystone-cron-29416141-5f2p7\" (UID: \"c392ddc8-695b-4543-a7f4-05ad75ff272b\") " pod="openstack/keystone-cron-29416141-5f2p7" Dec 05 21:01:00 crc kubenswrapper[4904]: I1205 21:01:00.382748 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c392ddc8-695b-4543-a7f4-05ad75ff272b-config-data\") pod \"keystone-cron-29416141-5f2p7\" (UID: \"c392ddc8-695b-4543-a7f4-05ad75ff272b\") " pod="openstack/keystone-cron-29416141-5f2p7" Dec 05 21:01:00 crc kubenswrapper[4904]: I1205 21:01:00.394977 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcdw5\" (UniqueName: \"kubernetes.io/projected/c392ddc8-695b-4543-a7f4-05ad75ff272b-kube-api-access-bcdw5\") pod \"keystone-cron-29416141-5f2p7\" (UID: \"c392ddc8-695b-4543-a7f4-05ad75ff272b\") " pod="openstack/keystone-cron-29416141-5f2p7" Dec 05 21:01:00 crc kubenswrapper[4904]: I1205 21:01:00.404920 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Dec 05 21:01:00 crc kubenswrapper[4904]: I1205 21:01:00.482758 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29416141-5f2p7" Dec 05 21:01:00 crc kubenswrapper[4904]: I1205 21:01:00.509414 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-nfs-0" Dec 05 21:01:00 crc kubenswrapper[4904]: I1205 21:01:00.572101 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:01:01 crc kubenswrapper[4904]: I1205 21:01:01.033422 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29416141-5f2p7"] Dec 05 21:01:01 crc kubenswrapper[4904]: I1205 21:01:01.274280 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29416141-5f2p7" event={"ID":"c392ddc8-695b-4543-a7f4-05ad75ff272b","Type":"ContainerStarted","Data":"95980dec1e44557d784b18be4f3d9781c15d45d215e2005b2dcaa7b621618f2e"} Dec 05 21:01:01 crc kubenswrapper[4904]: I1205 21:01:01.274316 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29416141-5f2p7" event={"ID":"c392ddc8-695b-4543-a7f4-05ad75ff272b","Type":"ContainerStarted","Data":"14a0de2209af9142d7ca9a25b6ca1867eae3cfbe467b94f191fe9ca3c9be429b"} Dec 05 21:01:01 crc kubenswrapper[4904]: I1205 21:01:01.293343 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29416141-5f2p7" podStartSLOduration=1.293322804 podStartE2EDuration="1.293322804s" podCreationTimestamp="2025-12-05 21:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:01:01.288321549 +0000 UTC m=+2960.099537678" watchObservedRunningTime="2025-12-05 21:01:01.293322804 +0000 UTC m=+2960.104538933" Dec 05 21:01:05 crc kubenswrapper[4904]: I1205 21:01:05.366574 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29416141-5f2p7" event={"ID":"c392ddc8-695b-4543-a7f4-05ad75ff272b","Type":"ContainerDied","Data":"95980dec1e44557d784b18be4f3d9781c15d45d215e2005b2dcaa7b621618f2e"} Dec 05 21:01:05 crc kubenswrapper[4904]: I1205 21:01:05.366533 4904 generic.go:334] "Generic (PLEG): container finished" podID="c392ddc8-695b-4543-a7f4-05ad75ff272b" containerID="95980dec1e44557d784b18be4f3d9781c15d45d215e2005b2dcaa7b621618f2e" exitCode=0 Dec 05 21:01:05 crc kubenswrapper[4904]: I1205 21:01:05.645647 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Dec 05 21:01:05 crc kubenswrapper[4904]: I1205 21:01:05.774423 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-nfs-0" Dec 05 21:01:05 crc kubenswrapper[4904]: I1205 21:01:05.861204 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-nfs-2-0" Dec 05 21:01:06 crc kubenswrapper[4904]: I1205 21:01:06.961355 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29416141-5f2p7" Dec 05 21:01:07 crc kubenswrapper[4904]: I1205 21:01:07.153255 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c392ddc8-695b-4543-a7f4-05ad75ff272b-fernet-keys\") pod \"c392ddc8-695b-4543-a7f4-05ad75ff272b\" (UID: \"c392ddc8-695b-4543-a7f4-05ad75ff272b\") " Dec 05 21:01:07 crc kubenswrapper[4904]: I1205 21:01:07.153289 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c392ddc8-695b-4543-a7f4-05ad75ff272b-config-data\") pod \"c392ddc8-695b-4543-a7f4-05ad75ff272b\" (UID: \"c392ddc8-695b-4543-a7f4-05ad75ff272b\") " Dec 05 21:01:07 crc kubenswrapper[4904]: I1205 21:01:07.153419 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcdw5\" (UniqueName: \"kubernetes.io/projected/c392ddc8-695b-4543-a7f4-05ad75ff272b-kube-api-access-bcdw5\") pod \"c392ddc8-695b-4543-a7f4-05ad75ff272b\" (UID: \"c392ddc8-695b-4543-a7f4-05ad75ff272b\") " Dec 05 21:01:07 crc kubenswrapper[4904]: I1205 21:01:07.153456 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c392ddc8-695b-4543-a7f4-05ad75ff272b-combined-ca-bundle\") pod \"c392ddc8-695b-4543-a7f4-05ad75ff272b\" (UID: \"c392ddc8-695b-4543-a7f4-05ad75ff272b\") " Dec 05 21:01:07 crc kubenswrapper[4904]: I1205 21:01:07.180698 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c392ddc8-695b-4543-a7f4-05ad75ff272b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c392ddc8-695b-4543-a7f4-05ad75ff272b" (UID: "c392ddc8-695b-4543-a7f4-05ad75ff272b"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:07 crc kubenswrapper[4904]: I1205 21:01:07.186449 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c392ddc8-695b-4543-a7f4-05ad75ff272b-kube-api-access-bcdw5" (OuterVolumeSpecName: "kube-api-access-bcdw5") pod "c392ddc8-695b-4543-a7f4-05ad75ff272b" (UID: "c392ddc8-695b-4543-a7f4-05ad75ff272b"). InnerVolumeSpecName "kube-api-access-bcdw5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:01:07 crc kubenswrapper[4904]: I1205 21:01:07.230574 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c392ddc8-695b-4543-a7f4-05ad75ff272b-config-data" (OuterVolumeSpecName: "config-data") pod "c392ddc8-695b-4543-a7f4-05ad75ff272b" (UID: "c392ddc8-695b-4543-a7f4-05ad75ff272b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:07 crc kubenswrapper[4904]: I1205 21:01:07.238670 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c392ddc8-695b-4543-a7f4-05ad75ff272b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c392ddc8-695b-4543-a7f4-05ad75ff272b" (UID: "c392ddc8-695b-4543-a7f4-05ad75ff272b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:07 crc kubenswrapper[4904]: I1205 21:01:07.256865 4904 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c392ddc8-695b-4543-a7f4-05ad75ff272b-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:07 crc kubenswrapper[4904]: I1205 21:01:07.256901 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c392ddc8-695b-4543-a7f4-05ad75ff272b-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:07 crc kubenswrapper[4904]: I1205 21:01:07.256911 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcdw5\" (UniqueName: \"kubernetes.io/projected/c392ddc8-695b-4543-a7f4-05ad75ff272b-kube-api-access-bcdw5\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:07 crc kubenswrapper[4904]: I1205 21:01:07.256922 4904 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c392ddc8-695b-4543-a7f4-05ad75ff272b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:07 crc kubenswrapper[4904]: I1205 21:01:07.408670 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29416141-5f2p7" event={"ID":"c392ddc8-695b-4543-a7f4-05ad75ff272b","Type":"ContainerDied","Data":"14a0de2209af9142d7ca9a25b6ca1867eae3cfbe467b94f191fe9ca3c9be429b"} Dec 05 21:01:07 crc kubenswrapper[4904]: I1205 21:01:07.408747 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14a0de2209af9142d7ca9a25b6ca1867eae3cfbe467b94f191fe9ca3c9be429b" Dec 05 21:01:07 crc kubenswrapper[4904]: I1205 21:01:07.408863 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29416141-5f2p7" Dec 05 21:01:58 crc kubenswrapper[4904]: I1205 21:01:58.169357 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 05 21:01:58 crc kubenswrapper[4904]: I1205 21:01:58.176011 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="99273c11-759d-4f83-a065-bf008d6a110f" containerName="prometheus" containerID="cri-o://62b153cfe8e2648d9a8acf0e163629b0c1ab91fdeb7886ad25610a75e9f4efc8" gracePeriod=600 Dec 05 21:01:58 crc kubenswrapper[4904]: I1205 21:01:58.176396 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="99273c11-759d-4f83-a065-bf008d6a110f" containerName="thanos-sidecar" containerID="cri-o://57dbf94a2709259dac03b7ed9a3a2a2403a945d2f55b8f46f07fa8bc146febbf" gracePeriod=600 Dec 05 21:01:58 crc kubenswrapper[4904]: I1205 21:01:58.176557 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="99273c11-759d-4f83-a065-bf008d6a110f" containerName="config-reloader" containerID="cri-o://7cbc7bacd9571c9f748aea81fd8c39c95848ff4193b76c21641d0a7598e7c4e1" gracePeriod=600 Dec 05 21:01:58 crc kubenswrapper[4904]: E1205 21:01:58.470295 4904 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99273c11_759d_4f83_a065_bf008d6a110f.slice/crio-conmon-62b153cfe8e2648d9a8acf0e163629b0c1ab91fdeb7886ad25610a75e9f4efc8.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99273c11_759d_4f83_a065_bf008d6a110f.slice/crio-62b153cfe8e2648d9a8acf0e163629b0c1ab91fdeb7886ad25610a75e9f4efc8.scope\": RecentStats: unable to find data in memory cache]" Dec 05 21:01:59 crc kubenswrapper[4904]: I1205 21:01:59.033844 4904 generic.go:334] "Generic (PLEG): container finished" podID="99273c11-759d-4f83-a065-bf008d6a110f" containerID="57dbf94a2709259dac03b7ed9a3a2a2403a945d2f55b8f46f07fa8bc146febbf" exitCode=0 Dec 05 21:01:59 crc kubenswrapper[4904]: I1205 21:01:59.034118 4904 generic.go:334] "Generic (PLEG): container finished" podID="99273c11-759d-4f83-a065-bf008d6a110f" containerID="7cbc7bacd9571c9f748aea81fd8c39c95848ff4193b76c21641d0a7598e7c4e1" exitCode=0 Dec 05 21:01:59 crc kubenswrapper[4904]: I1205 21:01:59.034128 4904 generic.go:334] "Generic (PLEG): container finished" podID="99273c11-759d-4f83-a065-bf008d6a110f" containerID="62b153cfe8e2648d9a8acf0e163629b0c1ab91fdeb7886ad25610a75e9f4efc8" exitCode=0 Dec 05 21:01:59 crc kubenswrapper[4904]: I1205 21:01:59.033901 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"99273c11-759d-4f83-a065-bf008d6a110f","Type":"ContainerDied","Data":"57dbf94a2709259dac03b7ed9a3a2a2403a945d2f55b8f46f07fa8bc146febbf"} Dec 05 21:01:59 crc kubenswrapper[4904]: I1205 21:01:59.034165 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"99273c11-759d-4f83-a065-bf008d6a110f","Type":"ContainerDied","Data":"7cbc7bacd9571c9f748aea81fd8c39c95848ff4193b76c21641d0a7598e7c4e1"} Dec 05 21:01:59 crc kubenswrapper[4904]: I1205 21:01:59.034180 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"99273c11-759d-4f83-a065-bf008d6a110f","Type":"ContainerDied","Data":"62b153cfe8e2648d9a8acf0e163629b0c1ab91fdeb7886ad25610a75e9f4efc8"} Dec 05 21:01:59 crc kubenswrapper[4904]: I1205 21:01:59.161732 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 05 21:01:59 crc kubenswrapper[4904]: I1205 21:01:59.248355 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/99273c11-759d-4f83-a065-bf008d6a110f-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"99273c11-759d-4f83-a065-bf008d6a110f\" (UID: \"99273c11-759d-4f83-a065-bf008d6a110f\") " Dec 05 21:01:59 crc kubenswrapper[4904]: I1205 21:01:59.248555 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xdfl\" (UniqueName: \"kubernetes.io/projected/99273c11-759d-4f83-a065-bf008d6a110f-kube-api-access-9xdfl\") pod \"99273c11-759d-4f83-a065-bf008d6a110f\" (UID: \"99273c11-759d-4f83-a065-bf008d6a110f\") " Dec 05 21:01:59 crc kubenswrapper[4904]: I1205 21:01:59.248675 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/99273c11-759d-4f83-a065-bf008d6a110f-config\") pod \"99273c11-759d-4f83-a065-bf008d6a110f\" (UID: \"99273c11-759d-4f83-a065-bf008d6a110f\") " Dec 05 21:01:59 crc kubenswrapper[4904]: I1205 21:01:59.248932 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99273c11-759d-4f83-a065-bf008d6a110f-secret-combined-ca-bundle\") pod \"99273c11-759d-4f83-a065-bf008d6a110f\" (UID: \"99273c11-759d-4f83-a065-bf008d6a110f\") " Dec 05 21:01:59 crc kubenswrapper[4904]: I1205 21:01:59.248985 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/99273c11-759d-4f83-a065-bf008d6a110f-tls-assets\") pod \"99273c11-759d-4f83-a065-bf008d6a110f\" (UID: \"99273c11-759d-4f83-a065-bf008d6a110f\") " Dec 05 21:01:59 crc kubenswrapper[4904]: I1205 21:01:59.249053 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/99273c11-759d-4f83-a065-bf008d6a110f-prometheus-metric-storage-rulefiles-0\") pod \"99273c11-759d-4f83-a065-bf008d6a110f\" (UID: \"99273c11-759d-4f83-a065-bf008d6a110f\") " Dec 05 21:01:59 crc kubenswrapper[4904]: I1205 21:01:59.249218 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/99273c11-759d-4f83-a065-bf008d6a110f-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"99273c11-759d-4f83-a065-bf008d6a110f\" (UID: \"99273c11-759d-4f83-a065-bf008d6a110f\") " Dec 05 21:01:59 crc kubenswrapper[4904]: I1205 21:01:59.249302 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/99273c11-759d-4f83-a065-bf008d6a110f-config-out\") pod \"99273c11-759d-4f83-a065-bf008d6a110f\" (UID: \"99273c11-759d-4f83-a065-bf008d6a110f\") " Dec 05 21:01:59 crc kubenswrapper[4904]: I1205 21:01:59.249357 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/99273c11-759d-4f83-a065-bf008d6a110f-thanos-prometheus-http-client-file\") pod \"99273c11-759d-4f83-a065-bf008d6a110f\" (UID: \"99273c11-759d-4f83-a065-bf008d6a110f\") " Dec 05 21:01:59 crc 
kubenswrapper[4904]: I1205 21:01:59.249975 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99273c11-759d-4f83-a065-bf008d6a110f-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "99273c11-759d-4f83-a065-bf008d6a110f" (UID: "99273c11-759d-4f83-a065-bf008d6a110f"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:01:59 crc kubenswrapper[4904]: I1205 21:01:59.250832 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b90305fc-6b85-4b0e-958e-59ae1d530558\") pod \"99273c11-759d-4f83-a065-bf008d6a110f\" (UID: \"99273c11-759d-4f83-a065-bf008d6a110f\") " Dec 05 21:01:59 crc kubenswrapper[4904]: I1205 21:01:59.250868 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/99273c11-759d-4f83-a065-bf008d6a110f-web-config\") pod \"99273c11-759d-4f83-a065-bf008d6a110f\" (UID: \"99273c11-759d-4f83-a065-bf008d6a110f\") " Dec 05 21:01:59 crc kubenswrapper[4904]: I1205 21:01:59.251737 4904 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/99273c11-759d-4f83-a065-bf008d6a110f-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:59 crc kubenswrapper[4904]: I1205 21:01:59.256078 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99273c11-759d-4f83-a065-bf008d6a110f-kube-api-access-9xdfl" (OuterVolumeSpecName: "kube-api-access-9xdfl") pod "99273c11-759d-4f83-a065-bf008d6a110f" (UID: "99273c11-759d-4f83-a065-bf008d6a110f"). InnerVolumeSpecName "kube-api-access-9xdfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:01:59 crc kubenswrapper[4904]: I1205 21:01:59.256042 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99273c11-759d-4f83-a065-bf008d6a110f-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "99273c11-759d-4f83-a065-bf008d6a110f" (UID: "99273c11-759d-4f83-a065-bf008d6a110f"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:59 crc kubenswrapper[4904]: I1205 21:01:59.256817 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99273c11-759d-4f83-a065-bf008d6a110f-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "99273c11-759d-4f83-a065-bf008d6a110f" (UID: "99273c11-759d-4f83-a065-bf008d6a110f"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:59 crc kubenswrapper[4904]: I1205 21:01:59.257318 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99273c11-759d-4f83-a065-bf008d6a110f-config" (OuterVolumeSpecName: "config") pod "99273c11-759d-4f83-a065-bf008d6a110f" (UID: "99273c11-759d-4f83-a065-bf008d6a110f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:59 crc kubenswrapper[4904]: I1205 21:01:59.264044 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99273c11-759d-4f83-a065-bf008d6a110f-config-out" (OuterVolumeSpecName: "config-out") pod "99273c11-759d-4f83-a065-bf008d6a110f" (UID: "99273c11-759d-4f83-a065-bf008d6a110f"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:01:59 crc kubenswrapper[4904]: I1205 21:01:59.264104 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99273c11-759d-4f83-a065-bf008d6a110f-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "99273c11-759d-4f83-a065-bf008d6a110f" (UID: "99273c11-759d-4f83-a065-bf008d6a110f"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:59 crc kubenswrapper[4904]: I1205 21:01:59.272360 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99273c11-759d-4f83-a065-bf008d6a110f-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "99273c11-759d-4f83-a065-bf008d6a110f" (UID: "99273c11-759d-4f83-a065-bf008d6a110f"). InnerVolumeSpecName "secret-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:59 crc kubenswrapper[4904]: I1205 21:01:59.279272 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99273c11-759d-4f83-a065-bf008d6a110f-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "99273c11-759d-4f83-a065-bf008d6a110f" (UID: "99273c11-759d-4f83-a065-bf008d6a110f"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:01:59 crc kubenswrapper[4904]: I1205 21:01:59.288266 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b90305fc-6b85-4b0e-958e-59ae1d530558" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "99273c11-759d-4f83-a065-bf008d6a110f" (UID: "99273c11-759d-4f83-a065-bf008d6a110f"). InnerVolumeSpecName "pvc-b90305fc-6b85-4b0e-958e-59ae1d530558". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 05 21:01:59 crc kubenswrapper[4904]: I1205 21:01:59.340010 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99273c11-759d-4f83-a065-bf008d6a110f-web-config" (OuterVolumeSpecName: "web-config") pod "99273c11-759d-4f83-a065-bf008d6a110f" (UID: "99273c11-759d-4f83-a065-bf008d6a110f"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:59 crc kubenswrapper[4904]: I1205 21:01:59.353217 4904 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-b90305fc-6b85-4b0e-958e-59ae1d530558\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b90305fc-6b85-4b0e-958e-59ae1d530558\") on node \"crc\" " Dec 05 21:01:59 crc kubenswrapper[4904]: I1205 21:01:59.353245 4904 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/99273c11-759d-4f83-a065-bf008d6a110f-web-config\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:59 crc kubenswrapper[4904]: I1205 21:01:59.353256 4904 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/99273c11-759d-4f83-a065-bf008d6a110f-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:59 crc kubenswrapper[4904]: I1205 21:01:59.353268 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xdfl\" (UniqueName: \"kubernetes.io/projected/99273c11-759d-4f83-a065-bf008d6a110f-kube-api-access-9xdfl\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:59 crc kubenswrapper[4904]: I1205 21:01:59.353277 4904 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/99273c11-759d-4f83-a065-bf008d6a110f-config\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:59 crc kubenswrapper[4904]: I1205 21:01:59.353285 4904 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99273c11-759d-4f83-a065-bf008d6a110f-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:59 crc kubenswrapper[4904]: I1205 21:01:59.353294 4904 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/99273c11-759d-4f83-a065-bf008d6a110f-tls-assets\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:59 crc kubenswrapper[4904]: I1205 21:01:59.353303 4904 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/99273c11-759d-4f83-a065-bf008d6a110f-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:59 crc kubenswrapper[4904]: I1205 21:01:59.353313 4904 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/99273c11-759d-4f83-a065-bf008d6a110f-config-out\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:59 crc kubenswrapper[4904]: I1205 21:01:59.353322 4904 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/99273c11-759d-4f83-a065-bf008d6a110f-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:59 crc kubenswrapper[4904]: I1205 21:01:59.382017 4904 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Dec 05 21:01:59 crc kubenswrapper[4904]: I1205 21:01:59.382580 4904 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-b90305fc-6b85-4b0e-958e-59ae1d530558" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b90305fc-6b85-4b0e-958e-59ae1d530558") on node "crc" Dec 05 21:01:59 crc kubenswrapper[4904]: I1205 21:01:59.455088 4904 reconciler_common.go:293] "Volume detached for volume \"pvc-b90305fc-6b85-4b0e-958e-59ae1d530558\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b90305fc-6b85-4b0e-958e-59ae1d530558\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.044306 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"99273c11-759d-4f83-a065-bf008d6a110f","Type":"ContainerDied","Data":"994c214c3ff01658a0503405b872de1da98a41dd643b14998fe3391201add387"} Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.044389 4904 scope.go:117] "RemoveContainer" containerID="57dbf94a2709259dac03b7ed9a3a2a2403a945d2f55b8f46f07fa8bc146febbf" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.044527 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.069298 4904 scope.go:117] "RemoveContainer" containerID="7cbc7bacd9571c9f748aea81fd8c39c95848ff4193b76c21641d0a7598e7c4e1" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.073283 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.088829 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.105150 4904 scope.go:117] "RemoveContainer" containerID="62b153cfe8e2648d9a8acf0e163629b0c1ab91fdeb7886ad25610a75e9f4efc8" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.112983 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 05 21:02:00 crc kubenswrapper[4904]: E1205 21:02:00.113941 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99273c11-759d-4f83-a065-bf008d6a110f" containerName="config-reloader" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.113963 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="99273c11-759d-4f83-a065-bf008d6a110f" containerName="config-reloader" Dec 05 21:02:00 crc kubenswrapper[4904]: E1205 21:02:00.113978 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99273c11-759d-4f83-a065-bf008d6a110f" containerName="thanos-sidecar" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.113985 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="99273c11-759d-4f83-a065-bf008d6a110f" containerName="thanos-sidecar" Dec 05 21:02:00 crc kubenswrapper[4904]: E1205 21:02:00.114019 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99273c11-759d-4f83-a065-bf008d6a110f" containerName="prometheus" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.114025 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="99273c11-759d-4f83-a065-bf008d6a110f" containerName="prometheus" Dec 05 21:02:00 crc kubenswrapper[4904]: E1205 21:02:00.114034 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99273c11-759d-4f83-a065-bf008d6a110f" containerName="init-config-reloader" Dec 05 21:02:00 crc 
kubenswrapper[4904]: I1205 21:02:00.114039 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="99273c11-759d-4f83-a065-bf008d6a110f" containerName="init-config-reloader" Dec 05 21:02:00 crc kubenswrapper[4904]: E1205 21:02:00.114049 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c392ddc8-695b-4543-a7f4-05ad75ff272b" containerName="keystone-cron" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.114069 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="c392ddc8-695b-4543-a7f4-05ad75ff272b" containerName="keystone-cron" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.114249 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="99273c11-759d-4f83-a065-bf008d6a110f" containerName="config-reloader" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.114270 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="99273c11-759d-4f83-a065-bf008d6a110f" containerName="prometheus" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.114288 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="c392ddc8-695b-4543-a7f4-05ad75ff272b" containerName="keystone-cron" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.114299 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="99273c11-759d-4f83-a065-bf008d6a110f" containerName="thanos-sidecar" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.116351 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.119238 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.119289 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.119342 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.119388 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.121029 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-c2mnb" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.129094 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.135311 4904 scope.go:117] "RemoveContainer" containerID="e3cd937f13d5690f46fe2b55e7bc6ccedc15fe2bd1e6fbaec5c9f90263b1a790" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.136110 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.272287 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0\") " pod="openstack/prometheus-metric-storage-0" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.272351 4904 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0\") " pod="openstack/prometheus-metric-storage-0" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.272423 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0\") " pod="openstack/prometheus-metric-storage-0" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.272473 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b90305fc-6b85-4b0e-958e-59ae1d530558\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b90305fc-6b85-4b0e-958e-59ae1d530558\") pod \"prometheus-metric-storage-0\" (UID: \"3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0\") " pod="openstack/prometheus-metric-storage-0" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.272511 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0\") " pod="openstack/prometheus-metric-storage-0" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.272557 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0\") " pod="openstack/prometheus-metric-storage-0" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.272614 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0-config\") pod \"prometheus-metric-storage-0\" (UID: \"3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0\") " pod="openstack/prometheus-metric-storage-0" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.272641 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0\") " pod="openstack/prometheus-metric-storage-0" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.272671 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0\") " pod="openstack/prometheus-metric-storage-0" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.272720 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chcll\" (UniqueName: 
\"kubernetes.io/projected/3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0-kube-api-access-chcll\") pod \"prometheus-metric-storage-0\" (UID: \"3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0\") " pod="openstack/prometheus-metric-storage-0" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.272787 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0\") " pod="openstack/prometheus-metric-storage-0" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.375165 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0\") " pod="openstack/prometheus-metric-storage-0" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.375271 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b90305fc-6b85-4b0e-958e-59ae1d530558\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b90305fc-6b85-4b0e-958e-59ae1d530558\") pod \"prometheus-metric-storage-0\" (UID: \"3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0\") " pod="openstack/prometheus-metric-storage-0" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.375314 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0\") " pod="openstack/prometheus-metric-storage-0" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.375373 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0\") " pod="openstack/prometheus-metric-storage-0" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.375441 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0-config\") pod \"prometheus-metric-storage-0\" (UID: \"3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0\") " pod="openstack/prometheus-metric-storage-0" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.375466 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0\") " pod="openstack/prometheus-metric-storage-0" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.375499 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0\") " 
pod="openstack/prometheus-metric-storage-0" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.375545 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chcll\" (UniqueName: \"kubernetes.io/projected/3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0-kube-api-access-chcll\") pod \"prometheus-metric-storage-0\" (UID: \"3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0\") " pod="openstack/prometheus-metric-storage-0" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.375578 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0\") " pod="openstack/prometheus-metric-storage-0" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.375821 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0\") " pod="openstack/prometheus-metric-storage-0" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.375843 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0\") " pod="openstack/prometheus-metric-storage-0" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.382560 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0\") " pod="openstack/prometheus-metric-storage-0" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.383197 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0\") " pod="openstack/prometheus-metric-storage-0" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.386396 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0\") " pod="openstack/prometheus-metric-storage-0" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.387115 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0\") " pod="openstack/prometheus-metric-storage-0" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.387121 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0-tls-assets\") pod 
\"prometheus-metric-storage-0\" (UID: \"3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0\") " pod="openstack/prometheus-metric-storage-0" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.388015 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0-config\") pod \"prometheus-metric-storage-0\" (UID: \"3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0\") " pod="openstack/prometheus-metric-storage-0" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.389305 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0\") " pod="openstack/prometheus-metric-storage-0" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.393860 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0\") " pod="openstack/prometheus-metric-storage-0" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.402548 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0\") " pod="openstack/prometheus-metric-storage-0" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.407821 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chcll\" (UniqueName: \"kubernetes.io/projected/3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0-kube-api-access-chcll\") pod \"prometheus-metric-storage-0\" (UID: \"3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0\") " pod="openstack/prometheus-metric-storage-0" Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.426750 4904 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.426802 4904 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b90305fc-6b85-4b0e-958e-59ae1d530558\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b90305fc-6b85-4b0e-958e-59ae1d530558\") pod \"prometheus-metric-storage-0\" (UID: \"3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4c2522b7d526507cd9f6194376dadc5aee47822a6206b438630738242eaba537/globalmount\"" pod="openstack/prometheus-metric-storage-0"
Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.478954 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b90305fc-6b85-4b0e-958e-59ae1d530558\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b90305fc-6b85-4b0e-958e-59ae1d530558\") pod \"prometheus-metric-storage-0\" (UID: \"3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.488485 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Dec 05 21:02:00 crc kubenswrapper[4904]: I1205 21:02:00.999159 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Dec 05 21:02:01 crc kubenswrapper[4904]: I1205 21:02:01.056347 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0","Type":"ContainerStarted","Data":"7a26268cec580206fa138cdf5d911a18706c43c72fae1c6a245b926336e7654b"}
Dec 05 21:02:01 crc kubenswrapper[4904]: I1205 21:02:01.694794 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99273c11-759d-4f83-a065-bf008d6a110f" path="/var/lib/kubelet/pods/99273c11-759d-4f83-a065-bf008d6a110f/volumes"
Dec 05 21:02:05 crc kubenswrapper[4904]: I1205 21:02:05.096717 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0","Type":"ContainerStarted","Data":"3cf340281cc495d8d84027610f3a79dc5e2c5ccc4ec18836146409c6209ec72b"}
Dec 05 21:02:11 crc kubenswrapper[4904]: I1205 21:02:11.179824 4904 generic.go:334] "Generic (PLEG): container finished" podID="3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0" containerID="3cf340281cc495d8d84027610f3a79dc5e2c5ccc4ec18836146409c6209ec72b" exitCode=0
Dec 05 21:02:11 crc kubenswrapper[4904]: I1205 21:02:11.179871 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0","Type":"ContainerDied","Data":"3cf340281cc495d8d84027610f3a79dc5e2c5ccc4ec18836146409c6209ec72b"}
Dec 05 21:02:12 crc kubenswrapper[4904]: I1205 21:02:12.204823 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0","Type":"ContainerStarted","Data":"e67a8b01c8cf1a964c47e342571b6bc3b285d2373ef857dde15c5ea572719549"}
Dec 05 21:02:15 crc kubenswrapper[4904]: I1205 21:02:15.237551 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0","Type":"ContainerStarted","Data":"e696470dd3b8aa1dc166a8b816b6b9495ac24f08cc71a9e5b6d021d4871e4f82"}
Dec 05 21:02:15 crc kubenswrapper[4904]: I1205 21:02:15.237943 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0","Type":"ContainerStarted","Data":"0b5f0abfad9b73bd3f68d8909690fb8aeb15d00421d16989de2a88ccf058a71a"}
Dec 05 21:02:15 crc kubenswrapper[4904]: I1205 21:02:15.281973 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=15.281949734 podStartE2EDuration="15.281949734s" podCreationTimestamp="2025-12-05 21:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:02:15.275815839 +0000 UTC m=+3034.087031968" watchObservedRunningTime="2025-12-05 21:02:15.281949734 +0000 UTC m=+3034.093165843"
Dec 05 21:02:15 crc kubenswrapper[4904]: I1205 21:02:15.489292 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0"
Dec 05 21:02:15 crc kubenswrapper[4904]: I1205 21:02:15.490352 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0"
Dec 05 21:02:15 crc kubenswrapper[4904]: I1205 21:02:15.495268 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0"
Dec 05 21:02:16 crc kubenswrapper[4904]: I1205 21:02:16.251333 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0"
Dec 05 21:02:42 crc kubenswrapper[4904]: I1205 21:02:42.486776 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"]
Dec 05 21:02:42 crc kubenswrapper[4904]: I1205 21:02:42.490114 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Dec 05 21:02:42 crc kubenswrapper[4904]: I1205 21:02:42.495427 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key"
Dec 05 21:02:42 crc kubenswrapper[4904]: I1205 21:02:42.495526 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0"
Dec 05 21:02:42 crc kubenswrapper[4904]: I1205 21:02:42.495740 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-b9jhq"
Dec 05 21:02:42 crc kubenswrapper[4904]: I1205 21:02:42.495815 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0"
Dec 05 21:02:42 crc kubenswrapper[4904]: I1205 21:02:42.503601 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"]
Dec 05 21:02:42 crc kubenswrapper[4904]: I1205 21:02:42.568394 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-config-data\") pod \"tempest-tests-tempest\" (UID: \"45f624ec-9d5e-41f1-ba5b-e81c2b84c532\") " pod="openstack/tempest-tests-tempest"
Dec 05 21:02:42 crc kubenswrapper[4904]: I1205 21:02:42.568794 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"45f624ec-9d5e-41f1-ba5b-e81c2b84c532\") " pod="openstack/tempest-tests-tempest"
Dec 05 21:02:42 crc kubenswrapper[4904]: I1205 21:02:42.568892 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"45f624ec-9d5e-41f1-ba5b-e81c2b84c532\") " pod="openstack/tempest-tests-tempest"
Dec 05 21:02:42 crc kubenswrapper[4904]: I1205 21:02:42.671459 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"45f624ec-9d5e-41f1-ba5b-e81c2b84c532\") " pod="openstack/tempest-tests-tempest"
Dec 05 21:02:42 crc kubenswrapper[4904]: I1205 21:02:42.671572 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"45f624ec-9d5e-41f1-ba5b-e81c2b84c532\") " pod="openstack/tempest-tests-tempest"
Dec 05 21:02:42 crc kubenswrapper[4904]: I1205 21:02:42.671620 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"45f624ec-9d5e-41f1-ba5b-e81c2b84c532\") " pod="openstack/tempest-tests-tempest"
Dec 05 21:02:42 crc kubenswrapper[4904]: I1205 21:02:42.671652 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-config-data\") pod \"tempest-tests-tempest\" (UID: \"45f624ec-9d5e-41f1-ba5b-e81c2b84c532\") " pod="openstack/tempest-tests-tempest"
Dec 05 21:02:42 crc kubenswrapper[4904]: I1205 21:02:42.671677 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"45f624ec-9d5e-41f1-ba5b-e81c2b84c532\") " pod="openstack/tempest-tests-tempest"
Dec 05 21:02:42 crc kubenswrapper[4904]: I1205 21:02:42.671697 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2vpb\" (UniqueName: \"kubernetes.io/projected/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-kube-api-access-c2vpb\") pod \"tempest-tests-tempest\" (UID: \"45f624ec-9d5e-41f1-ba5b-e81c2b84c532\") " pod="openstack/tempest-tests-tempest"
Dec 05 21:02:42 crc kubenswrapper[4904]: I1205 21:02:42.671849 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"45f624ec-9d5e-41f1-ba5b-e81c2b84c532\") " pod="openstack/tempest-tests-tempest"
Dec 05 21:02:42 crc kubenswrapper[4904]: I1205 21:02:42.671975 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"45f624ec-9d5e-41f1-ba5b-e81c2b84c532\") " pod="openstack/tempest-tests-tempest"
Dec 05 21:02:42 crc kubenswrapper[4904]: I1205 21:02:42.672124 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"45f624ec-9d5e-41f1-ba5b-e81c2b84c532\") " pod="openstack/tempest-tests-tempest"
Dec 05 21:02:42 crc kubenswrapper[4904]: I1205 21:02:42.673167 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-config-data\") pod \"tempest-tests-tempest\" (UID: \"45f624ec-9d5e-41f1-ba5b-e81c2b84c532\") " pod="openstack/tempest-tests-tempest"
Dec 05 21:02:42 crc kubenswrapper[4904]: I1205 21:02:42.673537 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"45f624ec-9d5e-41f1-ba5b-e81c2b84c532\") " pod="openstack/tempest-tests-tempest"
Dec 05 21:02:42 crc kubenswrapper[4904]: I1205 21:02:42.680828 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"45f624ec-9d5e-41f1-ba5b-e81c2b84c532\") " pod="openstack/tempest-tests-tempest"
Dec 05 21:02:42 crc kubenswrapper[4904]: I1205 21:02:42.773736 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"45f624ec-9d5e-41f1-ba5b-e81c2b84c532\") " pod="openstack/tempest-tests-tempest"
Dec 05 21:02:42 crc kubenswrapper[4904]: I1205 21:02:42.773868 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"45f624ec-9d5e-41f1-ba5b-e81c2b84c532\") " pod="openstack/tempest-tests-tempest"
Dec 05 21:02:42 crc kubenswrapper[4904]: I1205 21:02:42.773922 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"45f624ec-9d5e-41f1-ba5b-e81c2b84c532\") " pod="openstack/tempest-tests-tempest"
Dec 05 21:02:42 crc kubenswrapper[4904]: I1205 21:02:42.773967 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"45f624ec-9d5e-41f1-ba5b-e81c2b84c532\") " pod="openstack/tempest-tests-tempest"
Dec 05 21:02:42 crc kubenswrapper[4904]: I1205 21:02:42.773992 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2vpb\" (UniqueName: \"kubernetes.io/projected/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-kube-api-access-c2vpb\") pod \"tempest-tests-tempest\" (UID: \"45f624ec-9d5e-41f1-ba5b-e81c2b84c532\") " pod="openstack/tempest-tests-tempest"
Dec 05 21:02:42 crc kubenswrapper[4904]: I1205 21:02:42.774074 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"45f624ec-9d5e-41f1-ba5b-e81c2b84c532\") " pod="openstack/tempest-tests-tempest"
Dec 05 21:02:42 crc kubenswrapper[4904]: I1205 21:02:42.774568 4904 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"45f624ec-9d5e-41f1-ba5b-e81c2b84c532\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/tempest-tests-tempest"
Dec 05 21:02:42 crc kubenswrapper[4904]: I1205 21:02:42.774589 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"45f624ec-9d5e-41f1-ba5b-e81c2b84c532\") " pod="openstack/tempest-tests-tempest"
Dec 05 21:02:42 crc kubenswrapper[4904]: I1205 21:02:42.775198 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"45f624ec-9d5e-41f1-ba5b-e81c2b84c532\") " pod="openstack/tempest-tests-tempest"
Dec 05 21:02:42 crc kubenswrapper[4904]: I1205 21:02:42.777575 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"45f624ec-9d5e-41f1-ba5b-e81c2b84c532\") " pod="openstack/tempest-tests-tempest"
Dec 05 21:02:42 crc kubenswrapper[4904]: I1205 21:02:42.779441 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"45f624ec-9d5e-41f1-ba5b-e81c2b84c532\") " pod="openstack/tempest-tests-tempest"
Dec 05 21:02:42 crc kubenswrapper[4904]: I1205 21:02:42.799241 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2vpb\" (UniqueName: \"kubernetes.io/projected/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-kube-api-access-c2vpb\") pod \"tempest-tests-tempest\" (UID: \"45f624ec-9d5e-41f1-ba5b-e81c2b84c532\") " pod="openstack/tempest-tests-tempest"
Dec 05 21:02:42 crc kubenswrapper[4904]: I1205 21:02:42.817152 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"45f624ec-9d5e-41f1-ba5b-e81c2b84c532\") " pod="openstack/tempest-tests-tempest"
Dec 05 21:02:42 crc kubenswrapper[4904]: I1205 21:02:42.830739 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Dec 05 21:02:43 crc kubenswrapper[4904]: I1205 21:02:43.453917 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"]
Dec 05 21:02:43 crc kubenswrapper[4904]: I1205 21:02:43.541251 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"45f624ec-9d5e-41f1-ba5b-e81c2b84c532","Type":"ContainerStarted","Data":"95d01893c247b7588d110f72f471658397e76f54aa022d4ef163fd026e76eb90"}
Dec 05 21:02:56 crc kubenswrapper[4904]: I1205 21:02:56.679679 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"45f624ec-9d5e-41f1-ba5b-e81c2b84c532","Type":"ContainerStarted","Data":"a13d80b221473979d8dc16d7bd38389b4b352cf98a7b74615c1fc547f1dda233"}
Dec 05 21:02:56 crc kubenswrapper[4904]: I1205 21:02:56.701840 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.344516889 podStartE2EDuration="15.701804824s" podCreationTimestamp="2025-12-05 21:02:41 +0000 UTC" firstStartedPulling="2025-12-05 21:02:43.466341843 +0000 UTC m=+3062.277557952" lastFinishedPulling="2025-12-05 21:02:54.823629768 +0000 UTC m=+3073.634845887" observedRunningTime="2025-12-05 21:02:56.698729911 +0000 UTC m=+3075.509946070" watchObservedRunningTime="2025-12-05 21:02:56.701804824 +0000 UTC m=+3075.513020953"
Dec 05 21:02:59 crc kubenswrapper[4904]: I1205 21:02:59.956304 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 21:02:59 crc kubenswrapper[4904]: I1205 21:02:59.956854 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 21:03:29 crc kubenswrapper[4904]: I1205 21:03:29.955761 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 21:03:29 crc kubenswrapper[4904]: I1205 21:03:29.956383 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 21:03:59 crc kubenswrapper[4904]: I1205 21:03:59.956169 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 21:03:59 crc kubenswrapper[4904]: I1205 21:03:59.956835 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 21:03:59 crc kubenswrapper[4904]: I1205 21:03:59.956905 4904 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h"
Dec 05 21:03:59 crc kubenswrapper[4904]: I1205 21:03:59.958025 4904 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"80eef27667d8a24d40a49bd4a7aa8dff94a82a2199c9c1742d158b7a2afd2a6e"} pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 05 21:03:59 crc kubenswrapper[4904]: I1205 21:03:59.958157 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" containerID="cri-o://80eef27667d8a24d40a49bd4a7aa8dff94a82a2199c9c1742d158b7a2afd2a6e" gracePeriod=600
Dec 05 21:04:00 crc kubenswrapper[4904]: E1205 21:04:00.089706 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a"
Dec 05 21:04:00 crc kubenswrapper[4904]: I1205 21:04:00.423793 4904 generic.go:334] "Generic (PLEG): container finished" podID="1cc24b64-e25f-4b55-9123-295388685e7a" containerID="80eef27667d8a24d40a49bd4a7aa8dff94a82a2199c9c1742d158b7a2afd2a6e" exitCode=0
Dec 05 21:04:00 crc kubenswrapper[4904]: I1205 21:04:00.423835 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" event={"ID":"1cc24b64-e25f-4b55-9123-295388685e7a","Type":"ContainerDied","Data":"80eef27667d8a24d40a49bd4a7aa8dff94a82a2199c9c1742d158b7a2afd2a6e"}
Dec 05 21:04:00 crc kubenswrapper[4904]: I1205 21:04:00.423884 4904 scope.go:117] "RemoveContainer" containerID="d929e21438f177c4fed1d386cdca4a13bf9f262bc1b1ea9e1ccc79cc3095e408"
Dec 05 21:04:00 crc kubenswrapper[4904]: I1205 21:04:00.424649 4904 scope.go:117] "RemoveContainer" containerID="80eef27667d8a24d40a49bd4a7aa8dff94a82a2199c9c1742d158b7a2afd2a6e"
Dec 05 21:04:00 crc kubenswrapper[4904]: E1205 21:04:00.424990 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a"
Dec 05 21:04:07 crc kubenswrapper[4904]: I1205 21:04:07.214418 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vgglk"]
Dec 05 21:04:07 crc kubenswrapper[4904]: I1205 21:04:07.217073 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vgglk"
Dec 05 21:04:07 crc kubenswrapper[4904]: I1205 21:04:07.232917 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vgglk"]
Dec 05 21:04:07 crc kubenswrapper[4904]: I1205 21:04:07.379770 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4551fbc-2f74-4f56-a15d-976c65b5c9e1-utilities\") pod \"redhat-marketplace-vgglk\" (UID: \"c4551fbc-2f74-4f56-a15d-976c65b5c9e1\") " pod="openshift-marketplace/redhat-marketplace-vgglk"
Dec 05 21:04:07 crc kubenswrapper[4904]: I1205 21:04:07.380014 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swzcp\" (UniqueName: \"kubernetes.io/projected/c4551fbc-2f74-4f56-a15d-976c65b5c9e1-kube-api-access-swzcp\") pod \"redhat-marketplace-vgglk\" (UID: \"c4551fbc-2f74-4f56-a15d-976c65b5c9e1\") " pod="openshift-marketplace/redhat-marketplace-vgglk"
Dec 05 21:04:07 crc kubenswrapper[4904]: I1205 21:04:07.380090 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4551fbc-2f74-4f56-a15d-976c65b5c9e1-catalog-content\") pod \"redhat-marketplace-vgglk\" (UID: \"c4551fbc-2f74-4f56-a15d-976c65b5c9e1\") " pod="openshift-marketplace/redhat-marketplace-vgglk"
Dec 05 21:04:07 crc kubenswrapper[4904]: I1205 21:04:07.481498 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4551fbc-2f74-4f56-a15d-976c65b5c9e1-utilities\") pod \"redhat-marketplace-vgglk\" (UID: \"c4551fbc-2f74-4f56-a15d-976c65b5c9e1\") " pod="openshift-marketplace/redhat-marketplace-vgglk"
Dec 05 21:04:07 crc kubenswrapper[4904]: I1205 21:04:07.481548 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swzcp\" (UniqueName: \"kubernetes.io/projected/c4551fbc-2f74-4f56-a15d-976c65b5c9e1-kube-api-access-swzcp\") pod \"redhat-marketplace-vgglk\" (UID: \"c4551fbc-2f74-4f56-a15d-976c65b5c9e1\") " pod="openshift-marketplace/redhat-marketplace-vgglk"
Dec 05 21:04:07 crc kubenswrapper[4904]: I1205 21:04:07.481598 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4551fbc-2f74-4f56-a15d-976c65b5c9e1-catalog-content\") pod \"redhat-marketplace-vgglk\" (UID: \"c4551fbc-2f74-4f56-a15d-976c65b5c9e1\") " pod="openshift-marketplace/redhat-marketplace-vgglk"
Dec 05 21:04:07 crc kubenswrapper[4904]: I1205 21:04:07.482223 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4551fbc-2f74-4f56-a15d-976c65b5c9e1-catalog-content\") pod \"redhat-marketplace-vgglk\" (UID: \"c4551fbc-2f74-4f56-a15d-976c65b5c9e1\") " pod="openshift-marketplace/redhat-marketplace-vgglk"
Dec 05 21:04:07 crc kubenswrapper[4904]: I1205 21:04:07.482255 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4551fbc-2f74-4f56-a15d-976c65b5c9e1-utilities\") pod \"redhat-marketplace-vgglk\" (UID: \"c4551fbc-2f74-4f56-a15d-976c65b5c9e1\") " pod="openshift-marketplace/redhat-marketplace-vgglk"
Dec 05 21:04:07 crc kubenswrapper[4904]: I1205 21:04:07.511626 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swzcp\" (UniqueName: \"kubernetes.io/projected/c4551fbc-2f74-4f56-a15d-976c65b5c9e1-kube-api-access-swzcp\") pod \"redhat-marketplace-vgglk\" (UID: \"c4551fbc-2f74-4f56-a15d-976c65b5c9e1\") " pod="openshift-marketplace/redhat-marketplace-vgglk"
Dec 05 21:04:07 crc kubenswrapper[4904]: I1205 21:04:07.535290 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vgglk"
Dec 05 21:04:08 crc kubenswrapper[4904]: I1205 21:04:08.148016 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vgglk"]
Dec 05 21:04:08 crc kubenswrapper[4904]: I1205 21:04:08.510081 4904 generic.go:334] "Generic (PLEG): container finished" podID="c4551fbc-2f74-4f56-a15d-976c65b5c9e1" containerID="ce992963b11e9dfe05de5f5f3e8742e785eeee5c7807cdcf396fdc5b33c4ff05" exitCode=0
Dec 05 21:04:08 crc kubenswrapper[4904]: I1205 21:04:08.510232 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vgglk" event={"ID":"c4551fbc-2f74-4f56-a15d-976c65b5c9e1","Type":"ContainerDied","Data":"ce992963b11e9dfe05de5f5f3e8742e785eeee5c7807cdcf396fdc5b33c4ff05"}
Dec 05 21:04:08 crc kubenswrapper[4904]: I1205 21:04:08.510689 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vgglk" event={"ID":"c4551fbc-2f74-4f56-a15d-976c65b5c9e1","Type":"ContainerStarted","Data":"a21a81b903292e5fdd3d8fb3b24dc9056e04e7bcd9ae072d453d5022a91b3c41"}
Dec 05 21:04:09 crc kubenswrapper[4904]: I1205 21:04:09.522803 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vgglk" event={"ID":"c4551fbc-2f74-4f56-a15d-976c65b5c9e1","Type":"ContainerStarted","Data":"4f53c2f4b52afd7e6ad23cd68e4e43c5c26e3b40e6356bea720681a9d6e5f675"}
Dec 05 21:04:10 crc kubenswrapper[4904]: I1205 21:04:10.582163 4904 generic.go:334] "Generic (PLEG): container finished" podID="c4551fbc-2f74-4f56-a15d-976c65b5c9e1" containerID="4f53c2f4b52afd7e6ad23cd68e4e43c5c26e3b40e6356bea720681a9d6e5f675" exitCode=0
Dec 05 21:04:10 crc kubenswrapper[4904]: I1205 21:04:10.582419 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vgglk" event={"ID":"c4551fbc-2f74-4f56-a15d-976c65b5c9e1","Type":"ContainerDied","Data":"4f53c2f4b52afd7e6ad23cd68e4e43c5c26e3b40e6356bea720681a9d6e5f675"}
Dec 05 21:04:11 crc kubenswrapper[4904]: I1205 21:04:11.591798 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vgglk" event={"ID":"c4551fbc-2f74-4f56-a15d-976c65b5c9e1","Type":"ContainerStarted","Data":"12f82a3d6c4453ad6ec77885f8dbba8238ce97a724b72148f03afce11c5f17fc"}
Dec 05 21:04:11 crc kubenswrapper[4904]: I1205 21:04:11.611428 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vgglk" podStartSLOduration=2.14515126 podStartE2EDuration="4.611407569s" podCreationTimestamp="2025-12-05 21:04:07 +0000 UTC" firstStartedPulling="2025-12-05 21:04:08.51184649 +0000 UTC m=+3147.323062599" lastFinishedPulling="2025-12-05 21:04:10.978102799 +0000 UTC m=+3149.789318908" observedRunningTime="2025-12-05 21:04:11.610757441 +0000 UTC m=+3150.421973570" watchObservedRunningTime="2025-12-05 21:04:11.611407569 +0000 UTC m=+3150.422623678"
Dec 05 21:04:14 crc kubenswrapper[4904]: I1205 21:04:14.682044 4904 scope.go:117] "RemoveContainer" containerID="80eef27667d8a24d40a49bd4a7aa8dff94a82a2199c9c1742d158b7a2afd2a6e"
Dec 05 21:04:14 crc kubenswrapper[4904]: E1205 21:04:14.682821 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a"
Dec 05 21:04:17 crc kubenswrapper[4904]: I1205 21:04:17.536142 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vgglk"
Dec 05 21:04:17 crc kubenswrapper[4904]: I1205 21:04:17.536738 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vgglk"
Dec 05 21:04:17 crc kubenswrapper[4904]: I1205 21:04:17.630162 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vgglk"
Dec 05 21:04:17 crc kubenswrapper[4904]: I1205 21:04:17.739977 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vgglk"
Dec 05 21:04:17 crc kubenswrapper[4904]: I1205 21:04:17.872040 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vgglk"]
Dec 05 21:04:19 crc kubenswrapper[4904]: I1205 21:04:19.668862 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vgglk" podUID="c4551fbc-2f74-4f56-a15d-976c65b5c9e1" containerName="registry-server" containerID="cri-o://12f82a3d6c4453ad6ec77885f8dbba8238ce97a724b72148f03afce11c5f17fc" gracePeriod=2
Dec 05 21:04:20 crc kubenswrapper[4904]: I1205 21:04:20.196154 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vgglk"
Dec 05 21:04:20 crc kubenswrapper[4904]: I1205 21:04:20.386753 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4551fbc-2f74-4f56-a15d-976c65b5c9e1-catalog-content\") pod \"c4551fbc-2f74-4f56-a15d-976c65b5c9e1\" (UID: \"c4551fbc-2f74-4f56-a15d-976c65b5c9e1\") "
Dec 05 21:04:20 crc kubenswrapper[4904]: I1205 21:04:20.387364 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swzcp\" (UniqueName: \"kubernetes.io/projected/c4551fbc-2f74-4f56-a15d-976c65b5c9e1-kube-api-access-swzcp\") pod \"c4551fbc-2f74-4f56-a15d-976c65b5c9e1\" (UID: \"c4551fbc-2f74-4f56-a15d-976c65b5c9e1\") "
Dec 05 21:04:20 crc kubenswrapper[4904]: I1205 21:04:20.387417 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4551fbc-2f74-4f56-a15d-976c65b5c9e1-utilities\") pod \"c4551fbc-2f74-4f56-a15d-976c65b5c9e1\" (UID: \"c4551fbc-2f74-4f56-a15d-976c65b5c9e1\") "
Dec 05 21:04:20 crc kubenswrapper[4904]: I1205 21:04:20.395681 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4551fbc-2f74-4f56-a15d-976c65b5c9e1-kube-api-access-swzcp" (OuterVolumeSpecName: "kube-api-access-swzcp") pod "c4551fbc-2f74-4f56-a15d-976c65b5c9e1" (UID: "c4551fbc-2f74-4f56-a15d-976c65b5c9e1"). InnerVolumeSpecName "kube-api-access-swzcp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 21:04:20 crc kubenswrapper[4904]: I1205 21:04:20.396146 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4551fbc-2f74-4f56-a15d-976c65b5c9e1-utilities" (OuterVolumeSpecName: "utilities") pod "c4551fbc-2f74-4f56-a15d-976c65b5c9e1" (UID: "c4551fbc-2f74-4f56-a15d-976c65b5c9e1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 21:04:20 crc kubenswrapper[4904]: I1205 21:04:20.415792 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4551fbc-2f74-4f56-a15d-976c65b5c9e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c4551fbc-2f74-4f56-a15d-976c65b5c9e1" (UID: "c4551fbc-2f74-4f56-a15d-976c65b5c9e1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 21:04:20 crc kubenswrapper[4904]: I1205 21:04:20.490780 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swzcp\" (UniqueName: \"kubernetes.io/projected/c4551fbc-2f74-4f56-a15d-976c65b5c9e1-kube-api-access-swzcp\") on node \"crc\" DevicePath \"\""
Dec 05 21:04:20 crc kubenswrapper[4904]: I1205 21:04:20.490817 4904 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4551fbc-2f74-4f56-a15d-976c65b5c9e1-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 21:04:20 crc kubenswrapper[4904]: I1205 21:04:20.490830 4904 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4551fbc-2f74-4f56-a15d-976c65b5c9e1-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 21:04:20 crc kubenswrapper[4904]: I1205 21:04:20.683080 4904 generic.go:334] "Generic (PLEG): container finished" podID="c4551fbc-2f74-4f56-a15d-976c65b5c9e1" containerID="12f82a3d6c4453ad6ec77885f8dbba8238ce97a724b72148f03afce11c5f17fc" exitCode=0
Dec 05 21:04:20 crc kubenswrapper[4904]: I1205 21:04:20.683125 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vgglk" event={"ID":"c4551fbc-2f74-4f56-a15d-976c65b5c9e1","Type":"ContainerDied","Data":"12f82a3d6c4453ad6ec77885f8dbba8238ce97a724b72148f03afce11c5f17fc"}
Dec 05 21:04:20 crc kubenswrapper[4904]: I1205 21:04:20.683147 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vgglk" event={"ID":"c4551fbc-2f74-4f56-a15d-976c65b5c9e1","Type":"ContainerDied","Data":"a21a81b903292e5fdd3d8fb3b24dc9056e04e7bcd9ae072d453d5022a91b3c41"}
Dec 05 21:04:20 crc kubenswrapper[4904]: I1205 21:04:20.683169 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vgglk"
Dec 05 21:04:20 crc kubenswrapper[4904]: I1205 21:04:20.683172 4904 scope.go:117] "RemoveContainer" containerID="12f82a3d6c4453ad6ec77885f8dbba8238ce97a724b72148f03afce11c5f17fc"
Dec 05 21:04:20 crc kubenswrapper[4904]: I1205 21:04:20.714817 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vgglk"]
Dec 05 21:04:20 crc kubenswrapper[4904]: I1205 21:04:20.716253 4904 scope.go:117] "RemoveContainer" containerID="4f53c2f4b52afd7e6ad23cd68e4e43c5c26e3b40e6356bea720681a9d6e5f675"
Dec 05 21:04:20 crc kubenswrapper[4904]: I1205 21:04:20.723957 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vgglk"]
Dec 05 21:04:20 crc kubenswrapper[4904]: I1205 21:04:20.756209 4904 scope.go:117] "RemoveContainer" containerID="ce992963b11e9dfe05de5f5f3e8742e785eeee5c7807cdcf396fdc5b33c4ff05"
Dec 05 21:04:20 crc kubenswrapper[4904]: I1205 21:04:20.800152 4904 scope.go:117] "RemoveContainer" containerID="12f82a3d6c4453ad6ec77885f8dbba8238ce97a724b72148f03afce11c5f17fc"
Dec 05 21:04:20 crc kubenswrapper[4904]: E1205 21:04:20.800584 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12f82a3d6c4453ad6ec77885f8dbba8238ce97a724b72148f03afce11c5f17fc\": container with ID starting with 12f82a3d6c4453ad6ec77885f8dbba8238ce97a724b72148f03afce11c5f17fc not found: ID does not exist" containerID="12f82a3d6c4453ad6ec77885f8dbba8238ce97a724b72148f03afce11c5f17fc"
Dec 05 21:04:20 crc kubenswrapper[4904]: I1205 21:04:20.800625 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12f82a3d6c4453ad6ec77885f8dbba8238ce97a724b72148f03afce11c5f17fc"} err="failed to get container status \"12f82a3d6c4453ad6ec77885f8dbba8238ce97a724b72148f03afce11c5f17fc\": rpc error: code = NotFound desc = could not find container \"12f82a3d6c4453ad6ec77885f8dbba8238ce97a724b72148f03afce11c5f17fc\": container with ID starting with 12f82a3d6c4453ad6ec77885f8dbba8238ce97a724b72148f03afce11c5f17fc not found: ID does not exist"
Dec 05 21:04:20 crc kubenswrapper[4904]: I1205 21:04:20.800651 4904 scope.go:117] "RemoveContainer" containerID="4f53c2f4b52afd7e6ad23cd68e4e43c5c26e3b40e6356bea720681a9d6e5f675"
Dec 05 21:04:20 crc kubenswrapper[4904]: E1205 21:04:20.801030 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f53c2f4b52afd7e6ad23cd68e4e43c5c26e3b40e6356bea720681a9d6e5f675\": container with ID starting with 4f53c2f4b52afd7e6ad23cd68e4e43c5c26e3b40e6356bea720681a9d6e5f675 not found: ID does not exist" containerID="4f53c2f4b52afd7e6ad23cd68e4e43c5c26e3b40e6356bea720681a9d6e5f675"
Dec 05 21:04:20 crc kubenswrapper[4904]: I1205 21:04:20.801081 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f53c2f4b52afd7e6ad23cd68e4e43c5c26e3b40e6356bea720681a9d6e5f675"} err="failed to get container status \"4f53c2f4b52afd7e6ad23cd68e4e43c5c26e3b40e6356bea720681a9d6e5f675\": rpc error: code = NotFound desc = could not find container \"4f53c2f4b52afd7e6ad23cd68e4e43c5c26e3b40e6356bea720681a9d6e5f675\": container with ID starting with 4f53c2f4b52afd7e6ad23cd68e4e43c5c26e3b40e6356bea720681a9d6e5f675 not found: ID does not exist"
Dec 05 21:04:20 crc kubenswrapper[4904]: I1205 21:04:20.801108 4904 scope.go:117] "RemoveContainer" containerID="ce992963b11e9dfe05de5f5f3e8742e785eeee5c7807cdcf396fdc5b33c4ff05"
Dec 05 21:04:20 crc kubenswrapper[4904]: E1205 21:04:20.801371 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce992963b11e9dfe05de5f5f3e8742e785eeee5c7807cdcf396fdc5b33c4ff05\": container with ID starting with ce992963b11e9dfe05de5f5f3e8742e785eeee5c7807cdcf396fdc5b33c4ff05 not found: ID does not exist" containerID="ce992963b11e9dfe05de5f5f3e8742e785eeee5c7807cdcf396fdc5b33c4ff05"
Dec 05 21:04:20 crc kubenswrapper[4904]: I1205 21:04:20.801400 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce992963b11e9dfe05de5f5f3e8742e785eeee5c7807cdcf396fdc5b33c4ff05"} err="failed to get container status \"ce992963b11e9dfe05de5f5f3e8742e785eeee5c7807cdcf396fdc5b33c4ff05\": rpc error: code = NotFound desc = could not find container \"ce992963b11e9dfe05de5f5f3e8742e785eeee5c7807cdcf396fdc5b33c4ff05\": container with ID starting with ce992963b11e9dfe05de5f5f3e8742e785eeee5c7807cdcf396fdc5b33c4ff05 not found: ID does not exist"
Dec 05 21:04:21 crc kubenswrapper[4904]: I1205 21:04:21.704873 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4551fbc-2f74-4f56-a15d-976c65b5c9e1" path="/var/lib/kubelet/pods/c4551fbc-2f74-4f56-a15d-976c65b5c9e1/volumes"
Dec 05 21:04:26 crc kubenswrapper[4904]: I1205 21:04:26.681870 4904 scope.go:117] "RemoveContainer" containerID="80eef27667d8a24d40a49bd4a7aa8dff94a82a2199c9c1742d158b7a2afd2a6e"
Dec 05 21:04:26 crc kubenswrapper[4904]: E1205 21:04:26.682475 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a"
Dec 05 21:04:38 crc kubenswrapper[4904]: I1205 21:04:38.681754 4904 scope.go:117] "RemoveContainer" containerID="80eef27667d8a24d40a49bd4a7aa8dff94a82a2199c9c1742d158b7a2afd2a6e"
Dec 05 21:04:38 crc kubenswrapper[4904]: E1205 21:04:38.682539 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a"
Dec 05 21:04:42 crc kubenswrapper[4904]: I1205 21:04:42.799504 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9m7sm"]
Dec 05 21:04:42 crc kubenswrapper[4904]: E1205 21:04:42.800791 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4551fbc-2f74-4f56-a15d-976c65b5c9e1" containerName="registry-server"
Dec 05 21:04:42 crc kubenswrapper[4904]: I1205 21:04:42.800808 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4551fbc-2f74-4f56-a15d-976c65b5c9e1" containerName="registry-server"
Dec 05 21:04:42 crc kubenswrapper[4904]: E1205 21:04:42.800833 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4551fbc-2f74-4f56-a15d-976c65b5c9e1" containerName="extract-utilities"
Dec 05 21:04:42 crc kubenswrapper[4904]: I1205 21:04:42.800839 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4551fbc-2f74-4f56-a15d-976c65b5c9e1" containerName="extract-utilities"
Dec 05 21:04:42 crc kubenswrapper[4904]: E1205 21:04:42.800865 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4551fbc-2f74-4f56-a15d-976c65b5c9e1" containerName="extract-content"
Dec 05 21:04:42 crc kubenswrapper[4904]: I1205 21:04:42.800871 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4551fbc-2f74-4f56-a15d-976c65b5c9e1" containerName="extract-content"
Dec 05 21:04:42 crc kubenswrapper[4904]: I1205 21:04:42.801178 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4551fbc-2f74-4f56-a15d-976c65b5c9e1" containerName="registry-server"
Dec 05 21:04:42 crc kubenswrapper[4904]: I1205 21:04:42.804155 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9m7sm"
Dec 05 21:04:42 crc kubenswrapper[4904]: I1205 21:04:42.818306 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9m7sm"]
Dec 05 21:04:42 crc kubenswrapper[4904]: I1205 21:04:42.853630 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5tb5\" (UniqueName: \"kubernetes.io/projected/7a7aed77-9e72-4db6-b85f-07feb617feb6-kube-api-access-f5tb5\") pod \"certified-operators-9m7sm\" (UID: \"7a7aed77-9e72-4db6-b85f-07feb617feb6\") " pod="openshift-marketplace/certified-operators-9m7sm"
Dec 05 21:04:42 crc kubenswrapper[4904]: I1205 21:04:42.853799 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a7aed77-9e72-4db6-b85f-07feb617feb6-catalog-content\") pod \"certified-operators-9m7sm\" (UID: \"7a7aed77-9e72-4db6-b85f-07feb617feb6\") " pod="openshift-marketplace/certified-operators-9m7sm"
Dec 05 21:04:42 crc kubenswrapper[4904]: I1205 21:04:42.854003 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a7aed77-9e72-4db6-b85f-07feb617feb6-utilities\") pod \"certified-operators-9m7sm\" (UID: \"7a7aed77-9e72-4db6-b85f-07feb617feb6\") " pod="openshift-marketplace/certified-operators-9m7sm"
Dec 05 21:04:42 crc kubenswrapper[4904]: I1205 21:04:42.956090 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a7aed77-9e72-4db6-b85f-07feb617feb6-utilities\") pod \"certified-operators-9m7sm\" (UID: \"7a7aed77-9e72-4db6-b85f-07feb617feb6\") " pod="openshift-marketplace/certified-operators-9m7sm"
Dec 05 21:04:42 crc kubenswrapper[4904]: I1205 21:04:42.956442 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5tb5\" (UniqueName: \"kubernetes.io/projected/7a7aed77-9e72-4db6-b85f-07feb617feb6-kube-api-access-f5tb5\") pod \"certified-operators-9m7sm\" (UID: \"7a7aed77-9e72-4db6-b85f-07feb617feb6\") " pod="openshift-marketplace/certified-operators-9m7sm"
Dec 05 21:04:42 crc kubenswrapper[4904]: I1205 21:04:42.956605 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a7aed77-9e72-4db6-b85f-07feb617feb6-catalog-content\") pod \"certified-operators-9m7sm\" (UID: \"7a7aed77-9e72-4db6-b85f-07feb617feb6\") " pod="openshift-marketplace/certified-operators-9m7sm"
Dec 05 21:04:42 crc kubenswrapper[4904]: I1205 21:04:42.957306 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a7aed77-9e72-4db6-b85f-07feb617feb6-catalog-content\") pod \"certified-operators-9m7sm\" (UID: \"7a7aed77-9e72-4db6-b85f-07feb617feb6\") " pod="openshift-marketplace/certified-operators-9m7sm"
Dec 05 21:04:42 crc kubenswrapper[4904]: I1205 21:04:42.957685 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a7aed77-9e72-4db6-b85f-07feb617feb6-utilities\") pod \"certified-operators-9m7sm\" (UID: \"7a7aed77-9e72-4db6-b85f-07feb617feb6\") " pod="openshift-marketplace/certified-operators-9m7sm"
Dec 05 21:04:42 crc kubenswrapper[4904]: I1205 21:04:42.979561 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5tb5\" (UniqueName: \"kubernetes.io/projected/7a7aed77-9e72-4db6-b85f-07feb617feb6-kube-api-access-f5tb5\") pod \"certified-operators-9m7sm\" (UID: \"7a7aed77-9e72-4db6-b85f-07feb617feb6\") " pod="openshift-marketplace/certified-operators-9m7sm"
Dec 05 21:04:43 crc kubenswrapper[4904]: I1205 21:04:43.138294 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9m7sm"
Dec 05 21:04:43 crc kubenswrapper[4904]: I1205 21:04:43.744404 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9m7sm"]
Dec 05 21:04:43 crc kubenswrapper[4904]: I1205 21:04:43.935524 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9m7sm" event={"ID":"7a7aed77-9e72-4db6-b85f-07feb617feb6","Type":"ContainerStarted","Data":"8edc4e3aebc1dd88fb6981f9a71e9e30ad12eb699efb2386367760ef7640b628"}
Dec 05 21:04:44 crc kubenswrapper[4904]: I1205 21:04:44.946899 4904 generic.go:334] "Generic (PLEG): container finished" podID="7a7aed77-9e72-4db6-b85f-07feb617feb6" containerID="b16d43e65d85aa9f4c5463c7f9508673c803b1d999436684311e66ab375aa499" exitCode=0
Dec 05 21:04:44 crc kubenswrapper[4904]: I1205 21:04:44.946965 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9m7sm" event={"ID":"7a7aed77-9e72-4db6-b85f-07feb617feb6","Type":"ContainerDied","Data":"b16d43e65d85aa9f4c5463c7f9508673c803b1d999436684311e66ab375aa499"}
Dec 05 21:04:45 crc kubenswrapper[4904]: I1205 21:04:45.960593 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9m7sm" event={"ID":"7a7aed77-9e72-4db6-b85f-07feb617feb6","Type":"ContainerStarted","Data":"ea53e8fae421eb1cbac2d7b188e2013a7cc44e55ee5c1d9164d7b55567c2a1fc"}
Dec 05 21:04:49 crc kubenswrapper[4904]: I1205 21:04:49.004472 4904 generic.go:334] "Generic (PLEG): container finished" podID="7a7aed77-9e72-4db6-b85f-07feb617feb6" containerID="ea53e8fae421eb1cbac2d7b188e2013a7cc44e55ee5c1d9164d7b55567c2a1fc" exitCode=0
Dec 05 21:04:49 crc kubenswrapper[4904]: I1205 21:04:49.004559 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9m7sm" event={"ID":"7a7aed77-9e72-4db6-b85f-07feb617feb6","Type":"ContainerDied","Data":"ea53e8fae421eb1cbac2d7b188e2013a7cc44e55ee5c1d9164d7b55567c2a1fc"}
Dec 05 21:04:50 crc kubenswrapper[4904]: I1205 21:04:50.018022 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9m7sm" event={"ID":"7a7aed77-9e72-4db6-b85f-07feb617feb6","Type":"ContainerStarted","Data":"f11ea9313996cf1a2c882a570b6e50c63f23bd2f8ac9a5da3a56abfb4ef8ad10"}
Dec 05 21:04:50 crc kubenswrapper[4904]: I1205 21:04:50.050838 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9m7sm" podStartSLOduration=3.592697554 podStartE2EDuration="8.050772062s" podCreationTimestamp="2025-12-05 21:04:42 +0000 UTC" firstStartedPulling="2025-12-05 21:04:44.94960878 +0000 UTC m=+3183.760824909" lastFinishedPulling="2025-12-05 21:04:49.407683308 +0000 UTC m=+3188.218899417" observedRunningTime="2025-12-05 21:04:50.039390015 +0000 UTC m=+3188.850606154" watchObservedRunningTime="2025-12-05 21:04:50.050772062 +0000 UTC m=+3188.861988181"
Dec 05 21:04:50 crc kubenswrapper[4904]: I1205 21:04:50.681718 4904 scope.go:117] "RemoveContainer" containerID="80eef27667d8a24d40a49bd4a7aa8dff94a82a2199c9c1742d158b7a2afd2a6e"
Dec 05 21:04:50 crc kubenswrapper[4904]: E1205 21:04:50.682371 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a"
Dec 05 21:04:53 crc kubenswrapper[4904]: I1205 21:04:53.139020 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9m7sm"
Dec 05 21:04:53 crc kubenswrapper[4904]: I1205 21:04:53.141379 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9m7sm"
Dec 05 21:04:53 crc kubenswrapper[4904]: I1205 21:04:53.198836 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9m7sm"
Dec 05 21:04:54 crc kubenswrapper[4904]: I1205 21:04:54.112935 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9m7sm"
Dec 05 21:04:54 crc kubenswrapper[4904]: I1205 21:04:54.171540 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9m7sm"]
Dec 05 21:04:56 crc kubenswrapper[4904]: I1205 21:04:56.071517 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9m7sm" podUID="7a7aed77-9e72-4db6-b85f-07feb617feb6" containerName="registry-server" containerID="cri-o://f11ea9313996cf1a2c882a570b6e50c63f23bd2f8ac9a5da3a56abfb4ef8ad10" gracePeriod=2
Dec 05 21:04:56 crc kubenswrapper[4904]: I1205 21:04:56.586256 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9m7sm"
Dec 05 21:04:56 crc kubenswrapper[4904]: I1205 21:04:56.607098 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a7aed77-9e72-4db6-b85f-07feb617feb6-utilities\") pod \"7a7aed77-9e72-4db6-b85f-07feb617feb6\" (UID: \"7a7aed77-9e72-4db6-b85f-07feb617feb6\") "
Dec 05 21:04:56 crc kubenswrapper[4904]: I1205 21:04:56.607293 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5tb5\" (UniqueName: \"kubernetes.io/projected/7a7aed77-9e72-4db6-b85f-07feb617feb6-kube-api-access-f5tb5\") pod \"7a7aed77-9e72-4db6-b85f-07feb617feb6\" (UID: \"7a7aed77-9e72-4db6-b85f-07feb617feb6\") "
Dec 05 21:04:56 crc kubenswrapper[4904]: I1205 21:04:56.607338 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a7aed77-9e72-4db6-b85f-07feb617feb6-catalog-content\") pod \"7a7aed77-9e72-4db6-b85f-07feb617feb6\" (UID: \"7a7aed77-9e72-4db6-b85f-07feb617feb6\") "
Dec 05 21:04:56 crc kubenswrapper[4904]: I1205 21:04:56.611008 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a7aed77-9e72-4db6-b85f-07feb617feb6-utilities" (OuterVolumeSpecName: "utilities") pod "7a7aed77-9e72-4db6-b85f-07feb617feb6" (UID: "7a7aed77-9e72-4db6-b85f-07feb617feb6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 21:04:56 crc kubenswrapper[4904]: I1205 21:04:56.632334 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a7aed77-9e72-4db6-b85f-07feb617feb6-kube-api-access-f5tb5" (OuterVolumeSpecName: "kube-api-access-f5tb5") pod "7a7aed77-9e72-4db6-b85f-07feb617feb6" (UID: "7a7aed77-9e72-4db6-b85f-07feb617feb6"). InnerVolumeSpecName "kube-api-access-f5tb5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 21:04:56 crc kubenswrapper[4904]: I1205 21:04:56.684873 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a7aed77-9e72-4db6-b85f-07feb617feb6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7a7aed77-9e72-4db6-b85f-07feb617feb6" (UID: "7a7aed77-9e72-4db6-b85f-07feb617feb6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 21:04:56 crc kubenswrapper[4904]: I1205 21:04:56.709798 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5tb5\" (UniqueName: \"kubernetes.io/projected/7a7aed77-9e72-4db6-b85f-07feb617feb6-kube-api-access-f5tb5\") on node \"crc\" DevicePath \"\""
Dec 05 21:04:56 crc kubenswrapper[4904]: I1205 21:04:56.709840 4904 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a7aed77-9e72-4db6-b85f-07feb617feb6-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 21:04:56 crc kubenswrapper[4904]: I1205 21:04:56.709859 4904 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a7aed77-9e72-4db6-b85f-07feb617feb6-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 21:04:57 crc kubenswrapper[4904]: I1205 21:04:57.086157 4904 generic.go:334] "Generic (PLEG): container finished" podID="7a7aed77-9e72-4db6-b85f-07feb617feb6" containerID="f11ea9313996cf1a2c882a570b6e50c63f23bd2f8ac9a5da3a56abfb4ef8ad10" exitCode=0
Dec 05 21:04:57 crc kubenswrapper[4904]: I1205 21:04:57.086294 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9m7sm" event={"ID":"7a7aed77-9e72-4db6-b85f-07feb617feb6","Type":"ContainerDied","Data":"f11ea9313996cf1a2c882a570b6e50c63f23bd2f8ac9a5da3a56abfb4ef8ad10"}
Dec 05 21:04:57 crc kubenswrapper[4904]: I1205 21:04:57.086325 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9m7sm"
Dec 05 21:04:57 crc kubenswrapper[4904]: I1205 21:04:57.086497 4904 scope.go:117] "RemoveContainer" containerID="f11ea9313996cf1a2c882a570b6e50c63f23bd2f8ac9a5da3a56abfb4ef8ad10"
Dec 05 21:04:57 crc kubenswrapper[4904]: I1205 21:04:57.086481 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9m7sm" event={"ID":"7a7aed77-9e72-4db6-b85f-07feb617feb6","Type":"ContainerDied","Data":"8edc4e3aebc1dd88fb6981f9a71e9e30ad12eb699efb2386367760ef7640b628"}
Dec 05 21:04:57 crc kubenswrapper[4904]: I1205 21:04:57.124531 4904 scope.go:117] "RemoveContainer" containerID="ea53e8fae421eb1cbac2d7b188e2013a7cc44e55ee5c1d9164d7b55567c2a1fc"
Dec 05 21:04:57 crc kubenswrapper[4904]: I1205 21:04:57.132652 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9m7sm"]
Dec 05 21:04:57 crc kubenswrapper[4904]: I1205 21:04:57.162310 4904 scope.go:117] "RemoveContainer" containerID="b16d43e65d85aa9f4c5463c7f9508673c803b1d999436684311e66ab375aa499"
Dec 05 21:04:57 crc kubenswrapper[4904]: I1205 21:04:57.165075 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9m7sm"]
Dec 05 21:04:57 crc kubenswrapper[4904]: I1205 21:04:57.220701 4904 scope.go:117] "RemoveContainer" containerID="f11ea9313996cf1a2c882a570b6e50c63f23bd2f8ac9a5da3a56abfb4ef8ad10"
Dec 05 21:04:57 crc kubenswrapper[4904]: E1205 21:04:57.222541 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f11ea9313996cf1a2c882a570b6e50c63f23bd2f8ac9a5da3a56abfb4ef8ad10\": container with ID starting with f11ea9313996cf1a2c882a570b6e50c63f23bd2f8ac9a5da3a56abfb4ef8ad10 not found: ID does not exist" containerID="f11ea9313996cf1a2c882a570b6e50c63f23bd2f8ac9a5da3a56abfb4ef8ad10"
Dec 05 21:04:57 crc kubenswrapper[4904]: I1205 21:04:57.222585 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f11ea9313996cf1a2c882a570b6e50c63f23bd2f8ac9a5da3a56abfb4ef8ad10"} err="failed to get container status \"f11ea9313996cf1a2c882a570b6e50c63f23bd2f8ac9a5da3a56abfb4ef8ad10\": rpc error: code = NotFound desc = could not find container \"f11ea9313996cf1a2c882a570b6e50c63f23bd2f8ac9a5da3a56abfb4ef8ad10\": container with ID starting with f11ea9313996cf1a2c882a570b6e50c63f23bd2f8ac9a5da3a56abfb4ef8ad10 not found: ID does not exist"
Dec 05 21:04:57 crc kubenswrapper[4904]: I1205 21:04:57.222609 4904 scope.go:117] "RemoveContainer" containerID="ea53e8fae421eb1cbac2d7b188e2013a7cc44e55ee5c1d9164d7b55567c2a1fc"
Dec 05 21:04:57 crc kubenswrapper[4904]: E1205 21:04:57.223016 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea53e8fae421eb1cbac2d7b188e2013a7cc44e55ee5c1d9164d7b55567c2a1fc\": container with ID starting with ea53e8fae421eb1cbac2d7b188e2013a7cc44e55ee5c1d9164d7b55567c2a1fc not found: ID does not exist" containerID="ea53e8fae421eb1cbac2d7b188e2013a7cc44e55ee5c1d9164d7b55567c2a1fc"
Dec 05 21:04:57 crc kubenswrapper[4904]: I1205 21:04:57.223109 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea53e8fae421eb1cbac2d7b188e2013a7cc44e55ee5c1d9164d7b55567c2a1fc"} err="failed to get container status \"ea53e8fae421eb1cbac2d7b188e2013a7cc44e55ee5c1d9164d7b55567c2a1fc\": rpc error: code = NotFound desc = could not find container \"ea53e8fae421eb1cbac2d7b188e2013a7cc44e55ee5c1d9164d7b55567c2a1fc\": container with ID starting with ea53e8fae421eb1cbac2d7b188e2013a7cc44e55ee5c1d9164d7b55567c2a1fc not found: ID does not exist"
Dec 05 21:04:57 crc kubenswrapper[4904]: I1205 21:04:57.223145 4904 scope.go:117] "RemoveContainer" containerID="b16d43e65d85aa9f4c5463c7f9508673c803b1d999436684311e66ab375aa499"
Dec 05 21:04:57 crc kubenswrapper[4904]: E1205 21:04:57.223461 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b16d43e65d85aa9f4c5463c7f9508673c803b1d999436684311e66ab375aa499\": container with ID starting with b16d43e65d85aa9f4c5463c7f9508673c803b1d999436684311e66ab375aa499 not found: ID does not exist" containerID="b16d43e65d85aa9f4c5463c7f9508673c803b1d999436684311e66ab375aa499"
Dec 05 21:04:57 crc kubenswrapper[4904]: I1205 21:04:57.223488 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b16d43e65d85aa9f4c5463c7f9508673c803b1d999436684311e66ab375aa499"} err="failed to get container status \"b16d43e65d85aa9f4c5463c7f9508673c803b1d999436684311e66ab375aa499\": rpc error: code = NotFound desc = could not find container \"b16d43e65d85aa9f4c5463c7f9508673c803b1d999436684311e66ab375aa499\": container with ID starting with b16d43e65d85aa9f4c5463c7f9508673c803b1d999436684311e66ab375aa499 not found: ID does not exist"
Dec 05 21:04:57 crc kubenswrapper[4904]: I1205 21:04:57.701988 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a7aed77-9e72-4db6-b85f-07feb617feb6" path="/var/lib/kubelet/pods/7a7aed77-9e72-4db6-b85f-07feb617feb6/volumes"
Dec 05 21:05:03 crc kubenswrapper[4904]: I1205 21:05:03.681932 4904 scope.go:117] "RemoveContainer" containerID="80eef27667d8a24d40a49bd4a7aa8dff94a82a2199c9c1742d158b7a2afd2a6e"
Dec 05 21:05:03 crc kubenswrapper[4904]: E1205 21:05:03.682672 4904 pod_workers.go:1301] "Error syncing pod, skipping"
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:05:18 crc kubenswrapper[4904]: I1205 21:05:18.682268 4904 scope.go:117] "RemoveContainer" containerID="80eef27667d8a24d40a49bd4a7aa8dff94a82a2199c9c1742d158b7a2afd2a6e" Dec 05 21:05:18 crc kubenswrapper[4904]: E1205 21:05:18.683109 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:05:29 crc kubenswrapper[4904]: I1205 21:05:29.681734 4904 scope.go:117] "RemoveContainer" containerID="80eef27667d8a24d40a49bd4a7aa8dff94a82a2199c9c1742d158b7a2afd2a6e" Dec 05 21:05:29 crc kubenswrapper[4904]: E1205 21:05:29.683826 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:05:41 crc kubenswrapper[4904]: I1205 21:05:41.688689 4904 scope.go:117] "RemoveContainer" containerID="80eef27667d8a24d40a49bd4a7aa8dff94a82a2199c9c1742d158b7a2afd2a6e" Dec 05 21:05:41 crc kubenswrapper[4904]: E1205 21:05:41.689581 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:05:54 crc kubenswrapper[4904]: I1205 21:05:54.682750 4904 scope.go:117] "RemoveContainer" containerID="80eef27667d8a24d40a49bd4a7aa8dff94a82a2199c9c1742d158b7a2afd2a6e" Dec 05 21:05:54 crc kubenswrapper[4904]: E1205 21:05:54.683867 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:06:09 crc kubenswrapper[4904]: I1205 21:06:09.681417 4904 scope.go:117] "RemoveContainer" containerID="80eef27667d8a24d40a49bd4a7aa8dff94a82a2199c9c1742d158b7a2afd2a6e" Dec 05 21:06:09 crc kubenswrapper[4904]: E1205 21:06:09.682279 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:06:22 crc kubenswrapper[4904]: I1205 21:06:22.681767 4904 scope.go:117] "RemoveContainer" containerID="80eef27667d8a24d40a49bd4a7aa8dff94a82a2199c9c1742d158b7a2afd2a6e" Dec 05 21:06:22 crc kubenswrapper[4904]: E1205 21:06:22.682582 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:06:36 crc kubenswrapper[4904]: I1205 21:06:36.681016 4904 scope.go:117] "RemoveContainer" containerID="80eef27667d8a24d40a49bd4a7aa8dff94a82a2199c9c1742d158b7a2afd2a6e" Dec 05 21:06:36 crc kubenswrapper[4904]: E1205 21:06:36.681837 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:06:50 crc kubenswrapper[4904]: I1205 21:06:50.680923 4904 scope.go:117] "RemoveContainer" containerID="80eef27667d8a24d40a49bd4a7aa8dff94a82a2199c9c1742d158b7a2afd2a6e" Dec 05 21:06:50 crc kubenswrapper[4904]: E1205 21:06:50.681614 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:07:03 crc kubenswrapper[4904]: I1205 21:07:03.682591 4904 scope.go:117] "RemoveContainer" containerID="80eef27667d8a24d40a49bd4a7aa8dff94a82a2199c9c1742d158b7a2afd2a6e" Dec 05 21:07:03 crc kubenswrapper[4904]: E1205 21:07:03.683534 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:07:14 crc kubenswrapper[4904]: I1205 21:07:14.682133 4904 scope.go:117] "RemoveContainer" containerID="80eef27667d8a24d40a49bd4a7aa8dff94a82a2199c9c1742d158b7a2afd2a6e" Dec 05 21:07:14 crc kubenswrapper[4904]: E1205 21:07:14.682736 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" 
podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:07:28 crc kubenswrapper[4904]: I1205 21:07:28.681863 4904 scope.go:117] "RemoveContainer" containerID="80eef27667d8a24d40a49bd4a7aa8dff94a82a2199c9c1742d158b7a2afd2a6e" Dec 05 21:07:28 crc kubenswrapper[4904]: E1205 21:07:28.682544 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:07:42 crc kubenswrapper[4904]: I1205 21:07:42.682146 4904 scope.go:117] "RemoveContainer" containerID="80eef27667d8a24d40a49bd4a7aa8dff94a82a2199c9c1742d158b7a2afd2a6e" Dec 05 21:07:42 crc kubenswrapper[4904]: E1205 21:07:42.683087 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:07:53 crc kubenswrapper[4904]: I1205 21:07:53.681205 4904 scope.go:117] "RemoveContainer" containerID="80eef27667d8a24d40a49bd4a7aa8dff94a82a2199c9c1742d158b7a2afd2a6e" Dec 05 21:07:53 crc kubenswrapper[4904]: E1205 21:07:53.681981 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:08:06 crc kubenswrapper[4904]: I1205 21:08:06.681050 4904 scope.go:117] "RemoveContainer" containerID="80eef27667d8a24d40a49bd4a7aa8dff94a82a2199c9c1742d158b7a2afd2a6e" Dec 05 21:08:06 crc kubenswrapper[4904]: E1205 21:08:06.681851 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:08:12 crc kubenswrapper[4904]: I1205 21:08:12.360445 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bw8sw"] Dec 05 21:08:12 crc kubenswrapper[4904]: E1205 21:08:12.361717 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a7aed77-9e72-4db6-b85f-07feb617feb6" containerName="registry-server" Dec 05 21:08:12 crc kubenswrapper[4904]: I1205 21:08:12.361738 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a7aed77-9e72-4db6-b85f-07feb617feb6" containerName="registry-server" Dec 05 21:08:12 crc kubenswrapper[4904]: E1205 21:08:12.361772 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a7aed77-9e72-4db6-b85f-07feb617feb6" containerName="extract-content" Dec 05 21:08:12 crc kubenswrapper[4904]: I1205 
21:08:12.361784 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a7aed77-9e72-4db6-b85f-07feb617feb6" containerName="extract-content" Dec 05 21:08:12 crc kubenswrapper[4904]: E1205 21:08:12.361823 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a7aed77-9e72-4db6-b85f-07feb617feb6" containerName="extract-utilities" Dec 05 21:08:12 crc kubenswrapper[4904]: I1205 21:08:12.361837 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a7aed77-9e72-4db6-b85f-07feb617feb6" containerName="extract-utilities" Dec 05 21:08:12 crc kubenswrapper[4904]: I1205 21:08:12.362257 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a7aed77-9e72-4db6-b85f-07feb617feb6" containerName="registry-server" Dec 05 21:08:12 crc kubenswrapper[4904]: I1205 21:08:12.364395 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bw8sw" Dec 05 21:08:12 crc kubenswrapper[4904]: I1205 21:08:12.373715 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bw8sw"] Dec 05 21:08:12 crc kubenswrapper[4904]: I1205 21:08:12.476130 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ab087e3-b5ae-4e70-a694-47cbacc5fbf0-utilities\") pod \"redhat-operators-bw8sw\" (UID: \"8ab087e3-b5ae-4e70-a694-47cbacc5fbf0\") " pod="openshift-marketplace/redhat-operators-bw8sw" Dec 05 21:08:12 crc kubenswrapper[4904]: I1205 21:08:12.476246 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wsjn\" (UniqueName: \"kubernetes.io/projected/8ab087e3-b5ae-4e70-a694-47cbacc5fbf0-kube-api-access-2wsjn\") pod \"redhat-operators-bw8sw\" (UID: \"8ab087e3-b5ae-4e70-a694-47cbacc5fbf0\") " pod="openshift-marketplace/redhat-operators-bw8sw" Dec 05 21:08:12 crc kubenswrapper[4904]: I1205 21:08:12.476289 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ab087e3-b5ae-4e70-a694-47cbacc5fbf0-catalog-content\") pod \"redhat-operators-bw8sw\" (UID: \"8ab087e3-b5ae-4e70-a694-47cbacc5fbf0\") " pod="openshift-marketplace/redhat-operators-bw8sw" Dec 05 21:08:12 crc kubenswrapper[4904]: I1205 21:08:12.577958 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ab087e3-b5ae-4e70-a694-47cbacc5fbf0-utilities\") pod \"redhat-operators-bw8sw\" (UID: \"8ab087e3-b5ae-4e70-a694-47cbacc5fbf0\") " pod="openshift-marketplace/redhat-operators-bw8sw" Dec 05 21:08:12 crc kubenswrapper[4904]: I1205 21:08:12.578175 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wsjn\" (UniqueName: \"kubernetes.io/projected/8ab087e3-b5ae-4e70-a694-47cbacc5fbf0-kube-api-access-2wsjn\") pod \"redhat-operators-bw8sw\" (UID: \"8ab087e3-b5ae-4e70-a694-47cbacc5fbf0\") " pod="openshift-marketplace/redhat-operators-bw8sw" Dec 05 21:08:12 crc kubenswrapper[4904]: I1205 21:08:12.578235 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ab087e3-b5ae-4e70-a694-47cbacc5fbf0-catalog-content\") pod \"redhat-operators-bw8sw\" (UID: \"8ab087e3-b5ae-4e70-a694-47cbacc5fbf0\") " pod="openshift-marketplace/redhat-operators-bw8sw" Dec 05 21:08:12 crc kubenswrapper[4904]: 
I1205 21:08:12.578792 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ab087e3-b5ae-4e70-a694-47cbacc5fbf0-utilities\") pod \"redhat-operators-bw8sw\" (UID: \"8ab087e3-b5ae-4e70-a694-47cbacc5fbf0\") " pod="openshift-marketplace/redhat-operators-bw8sw" Dec 05 21:08:12 crc kubenswrapper[4904]: I1205 21:08:12.578872 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ab087e3-b5ae-4e70-a694-47cbacc5fbf0-catalog-content\") pod \"redhat-operators-bw8sw\" (UID: \"8ab087e3-b5ae-4e70-a694-47cbacc5fbf0\") " pod="openshift-marketplace/redhat-operators-bw8sw" Dec 05 21:08:12 crc kubenswrapper[4904]: I1205 21:08:12.599905 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wsjn\" (UniqueName: \"kubernetes.io/projected/8ab087e3-b5ae-4e70-a694-47cbacc5fbf0-kube-api-access-2wsjn\") pod \"redhat-operators-bw8sw\" (UID: \"8ab087e3-b5ae-4e70-a694-47cbacc5fbf0\") " pod="openshift-marketplace/redhat-operators-bw8sw" Dec 05 21:08:12 crc kubenswrapper[4904]: I1205 21:08:12.687947 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bw8sw" Dec 05 21:08:13 crc kubenswrapper[4904]: I1205 21:08:13.205649 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bw8sw"] Dec 05 21:08:14 crc kubenswrapper[4904]: I1205 21:08:14.173257 4904 generic.go:334] "Generic (PLEG): container finished" podID="8ab087e3-b5ae-4e70-a694-47cbacc5fbf0" containerID="f0c919f827aa4e2b3120dbe2fd97ba3014511b1762f6cfab646343c33afcfbc1" exitCode=0 Dec 05 21:08:14 crc kubenswrapper[4904]: I1205 21:08:14.173347 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bw8sw" event={"ID":"8ab087e3-b5ae-4e70-a694-47cbacc5fbf0","Type":"ContainerDied","Data":"f0c919f827aa4e2b3120dbe2fd97ba3014511b1762f6cfab646343c33afcfbc1"} Dec 05 21:08:14 crc kubenswrapper[4904]: I1205 21:08:14.173439 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bw8sw" event={"ID":"8ab087e3-b5ae-4e70-a694-47cbacc5fbf0","Type":"ContainerStarted","Data":"6acd170d23ab9762b3ae58629c9a9b60961cc88ec086c77137b01417327d41a1"} Dec 05 21:08:14 crc kubenswrapper[4904]: I1205 21:08:14.178969 4904 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 21:08:15 crc kubenswrapper[4904]: I1205 21:08:15.187032 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bw8sw" event={"ID":"8ab087e3-b5ae-4e70-a694-47cbacc5fbf0","Type":"ContainerStarted","Data":"3bd7dbdf80115ed219bb8f0cd08195252ad2bcafd71433e7022dda30b5f6f68a"} Dec 05 21:08:18 crc kubenswrapper[4904]: I1205 21:08:18.218782 4904 generic.go:334] "Generic (PLEG): container finished" podID="8ab087e3-b5ae-4e70-a694-47cbacc5fbf0" containerID="3bd7dbdf80115ed219bb8f0cd08195252ad2bcafd71433e7022dda30b5f6f68a" exitCode=0 Dec 05 21:08:18 crc kubenswrapper[4904]: I1205 21:08:18.218873 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bw8sw" event={"ID":"8ab087e3-b5ae-4e70-a694-47cbacc5fbf0","Type":"ContainerDied","Data":"3bd7dbdf80115ed219bb8f0cd08195252ad2bcafd71433e7022dda30b5f6f68a"} Dec 05 21:08:19 crc kubenswrapper[4904]: I1205 21:08:19.234211 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-bw8sw" event={"ID":"8ab087e3-b5ae-4e70-a694-47cbacc5fbf0","Type":"ContainerStarted","Data":"2af01d1be90416e2db2c38fb2737e3aa8efae1ae44776cd46739ba152f59fe24"} Dec 05 21:08:19 crc kubenswrapper[4904]: I1205 21:08:19.270892 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bw8sw" podStartSLOduration=2.848708238 podStartE2EDuration="7.270827032s" podCreationTimestamp="2025-12-05 21:08:12 +0000 UTC" firstStartedPulling="2025-12-05 21:08:14.17816203 +0000 UTC m=+3392.989378139" lastFinishedPulling="2025-12-05 21:08:18.600280824 +0000 UTC m=+3397.411496933" observedRunningTime="2025-12-05 21:08:19.254718434 +0000 UTC m=+3398.065934563" watchObservedRunningTime="2025-12-05 21:08:19.270827032 +0000 UTC m=+3398.082043151" Dec 05 21:08:20 crc kubenswrapper[4904]: I1205 21:08:20.681375 4904 scope.go:117] "RemoveContainer" containerID="80eef27667d8a24d40a49bd4a7aa8dff94a82a2199c9c1742d158b7a2afd2a6e" Dec 05 21:08:20 crc kubenswrapper[4904]: E1205 21:08:20.681853 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:08:22 crc kubenswrapper[4904]: I1205 21:08:22.689385 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bw8sw" Dec 05 21:08:22 crc kubenswrapper[4904]: I1205 21:08:22.689725 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bw8sw" Dec 05 21:08:23 crc kubenswrapper[4904]: I1205 21:08:23.750914 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bw8sw" podUID="8ab087e3-b5ae-4e70-a694-47cbacc5fbf0" containerName="registry-server" probeResult="failure" output=< Dec 05 21:08:23 crc kubenswrapper[4904]: timeout: failed to connect service ":50051" within 1s Dec 05 21:08:23 crc kubenswrapper[4904]: > Dec 05 21:08:32 crc kubenswrapper[4904]: I1205 21:08:32.774226 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bw8sw" Dec 05 21:08:32 crc kubenswrapper[4904]: I1205 21:08:32.834149 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bw8sw" Dec 05 21:08:33 crc kubenswrapper[4904]: I1205 21:08:33.011271 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bw8sw"] Dec 05 21:08:34 crc kubenswrapper[4904]: I1205 21:08:34.388784 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bw8sw" podUID="8ab087e3-b5ae-4e70-a694-47cbacc5fbf0" containerName="registry-server" containerID="cri-o://2af01d1be90416e2db2c38fb2737e3aa8efae1ae44776cd46739ba152f59fe24" gracePeriod=2 Dec 05 21:08:34 crc kubenswrapper[4904]: I1205 21:08:34.681694 4904 scope.go:117] "RemoveContainer" containerID="80eef27667d8a24d40a49bd4a7aa8dff94a82a2199c9c1742d158b7a2afd2a6e" Dec 05 21:08:34 crc kubenswrapper[4904]: E1205 21:08:34.682358 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:08:34 crc kubenswrapper[4904]: I1205 21:08:34.838950 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bw8sw" Dec 05 21:08:34 crc kubenswrapper[4904]: I1205 21:08:34.993455 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wsjn\" (UniqueName: \"kubernetes.io/projected/8ab087e3-b5ae-4e70-a694-47cbacc5fbf0-kube-api-access-2wsjn\") pod \"8ab087e3-b5ae-4e70-a694-47cbacc5fbf0\" (UID: \"8ab087e3-b5ae-4e70-a694-47cbacc5fbf0\") " Dec 05 21:08:34 crc kubenswrapper[4904]: I1205 21:08:34.993555 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ab087e3-b5ae-4e70-a694-47cbacc5fbf0-utilities\") pod \"8ab087e3-b5ae-4e70-a694-47cbacc5fbf0\" (UID: \"8ab087e3-b5ae-4e70-a694-47cbacc5fbf0\") " Dec 05 21:08:34 crc kubenswrapper[4904]: I1205 21:08:34.993584 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ab087e3-b5ae-4e70-a694-47cbacc5fbf0-catalog-content\") pod \"8ab087e3-b5ae-4e70-a694-47cbacc5fbf0\" (UID: \"8ab087e3-b5ae-4e70-a694-47cbacc5fbf0\") " Dec 05 21:08:34 crc kubenswrapper[4904]: I1205 21:08:34.995899 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ab087e3-b5ae-4e70-a694-47cbacc5fbf0-utilities" (OuterVolumeSpecName: "utilities") pod "8ab087e3-b5ae-4e70-a694-47cbacc5fbf0" (UID: "8ab087e3-b5ae-4e70-a694-47cbacc5fbf0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:08:35 crc kubenswrapper[4904]: I1205 21:08:35.002251 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ab087e3-b5ae-4e70-a694-47cbacc5fbf0-kube-api-access-2wsjn" (OuterVolumeSpecName: "kube-api-access-2wsjn") pod "8ab087e3-b5ae-4e70-a694-47cbacc5fbf0" (UID: "8ab087e3-b5ae-4e70-a694-47cbacc5fbf0"). InnerVolumeSpecName "kube-api-access-2wsjn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:08:35 crc kubenswrapper[4904]: I1205 21:08:35.096397 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wsjn\" (UniqueName: \"kubernetes.io/projected/8ab087e3-b5ae-4e70-a694-47cbacc5fbf0-kube-api-access-2wsjn\") on node \"crc\" DevicePath \"\"" Dec 05 21:08:35 crc kubenswrapper[4904]: I1205 21:08:35.096444 4904 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ab087e3-b5ae-4e70-a694-47cbacc5fbf0-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 21:08:35 crc kubenswrapper[4904]: I1205 21:08:35.104489 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ab087e3-b5ae-4e70-a694-47cbacc5fbf0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8ab087e3-b5ae-4e70-a694-47cbacc5fbf0" (UID: "8ab087e3-b5ae-4e70-a694-47cbacc5fbf0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:08:35 crc kubenswrapper[4904]: I1205 21:08:35.199047 4904 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ab087e3-b5ae-4e70-a694-47cbacc5fbf0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 21:08:35 crc kubenswrapper[4904]: I1205 21:08:35.402190 4904 generic.go:334] "Generic (PLEG): container finished" podID="8ab087e3-b5ae-4e70-a694-47cbacc5fbf0" containerID="2af01d1be90416e2db2c38fb2737e3aa8efae1ae44776cd46739ba152f59fe24" exitCode=0 Dec 05 21:08:35 crc kubenswrapper[4904]: I1205 21:08:35.402258 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bw8sw" event={"ID":"8ab087e3-b5ae-4e70-a694-47cbacc5fbf0","Type":"ContainerDied","Data":"2af01d1be90416e2db2c38fb2737e3aa8efae1ae44776cd46739ba152f59fe24"} Dec 05 21:08:35 crc kubenswrapper[4904]: I1205 21:08:35.402388 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bw8sw" Dec 05 21:08:35 crc kubenswrapper[4904]: I1205 21:08:35.403391 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bw8sw" event={"ID":"8ab087e3-b5ae-4e70-a694-47cbacc5fbf0","Type":"ContainerDied","Data":"6acd170d23ab9762b3ae58629c9a9b60961cc88ec086c77137b01417327d41a1"} Dec 05 21:08:35 crc kubenswrapper[4904]: I1205 21:08:35.403450 4904 scope.go:117] "RemoveContainer" containerID="2af01d1be90416e2db2c38fb2737e3aa8efae1ae44776cd46739ba152f59fe24" Dec 05 21:08:35 crc kubenswrapper[4904]: I1205 21:08:35.427962 4904 scope.go:117] "RemoveContainer" containerID="3bd7dbdf80115ed219bb8f0cd08195252ad2bcafd71433e7022dda30b5f6f68a" Dec 05 21:08:35 crc kubenswrapper[4904]: I1205 21:08:35.451790 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bw8sw"] Dec 05 21:08:35 crc kubenswrapper[4904]: I1205 21:08:35.459251 4904 scope.go:117] "RemoveContainer" containerID="f0c919f827aa4e2b3120dbe2fd97ba3014511b1762f6cfab646343c33afcfbc1" Dec 05 21:08:35 crc kubenswrapper[4904]: I1205 21:08:35.464230 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bw8sw"] Dec 05 21:08:35 crc kubenswrapper[4904]: I1205 21:08:35.520522 4904 scope.go:117] "RemoveContainer" containerID="2af01d1be90416e2db2c38fb2737e3aa8efae1ae44776cd46739ba152f59fe24" Dec 05 21:08:35 crc kubenswrapper[4904]: E1205 21:08:35.521118 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2af01d1be90416e2db2c38fb2737e3aa8efae1ae44776cd46739ba152f59fe24\": container with ID starting with 2af01d1be90416e2db2c38fb2737e3aa8efae1ae44776cd46739ba152f59fe24 not found: ID does not exist" containerID="2af01d1be90416e2db2c38fb2737e3aa8efae1ae44776cd46739ba152f59fe24" Dec 05 21:08:35 crc kubenswrapper[4904]: I1205 21:08:35.521162 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2af01d1be90416e2db2c38fb2737e3aa8efae1ae44776cd46739ba152f59fe24"} err="failed to get container status \"2af01d1be90416e2db2c38fb2737e3aa8efae1ae44776cd46739ba152f59fe24\": rpc error: code = NotFound desc = could not find container \"2af01d1be90416e2db2c38fb2737e3aa8efae1ae44776cd46739ba152f59fe24\": container with ID starting with 2af01d1be90416e2db2c38fb2737e3aa8efae1ae44776cd46739ba152f59fe24 not found: ID does not exist" Dec 05 21:08:35 crc 
kubenswrapper[4904]: I1205 21:08:35.521191 4904 scope.go:117] "RemoveContainer" containerID="3bd7dbdf80115ed219bb8f0cd08195252ad2bcafd71433e7022dda30b5f6f68a" Dec 05 21:08:35 crc kubenswrapper[4904]: E1205 21:08:35.521672 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bd7dbdf80115ed219bb8f0cd08195252ad2bcafd71433e7022dda30b5f6f68a\": container with ID starting with 3bd7dbdf80115ed219bb8f0cd08195252ad2bcafd71433e7022dda30b5f6f68a not found: ID does not exist" containerID="3bd7dbdf80115ed219bb8f0cd08195252ad2bcafd71433e7022dda30b5f6f68a" Dec 05 21:08:35 crc kubenswrapper[4904]: I1205 21:08:35.521730 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bd7dbdf80115ed219bb8f0cd08195252ad2bcafd71433e7022dda30b5f6f68a"} err="failed to get container status \"3bd7dbdf80115ed219bb8f0cd08195252ad2bcafd71433e7022dda30b5f6f68a\": rpc error: code = NotFound desc = could not find container \"3bd7dbdf80115ed219bb8f0cd08195252ad2bcafd71433e7022dda30b5f6f68a\": container with ID starting with 3bd7dbdf80115ed219bb8f0cd08195252ad2bcafd71433e7022dda30b5f6f68a not found: ID does not exist" Dec 05 21:08:35 crc kubenswrapper[4904]: I1205 21:08:35.521775 4904 scope.go:117] "RemoveContainer" containerID="f0c919f827aa4e2b3120dbe2fd97ba3014511b1762f6cfab646343c33afcfbc1" Dec 05 21:08:35 crc kubenswrapper[4904]: E1205 21:08:35.522513 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0c919f827aa4e2b3120dbe2fd97ba3014511b1762f6cfab646343c33afcfbc1\": container with ID starting with f0c919f827aa4e2b3120dbe2fd97ba3014511b1762f6cfab646343c33afcfbc1 not found: ID does not exist" containerID="f0c919f827aa4e2b3120dbe2fd97ba3014511b1762f6cfab646343c33afcfbc1" Dec 05 21:08:35 crc kubenswrapper[4904]: I1205 21:08:35.522555 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0c919f827aa4e2b3120dbe2fd97ba3014511b1762f6cfab646343c33afcfbc1"} err="failed to get container status \"f0c919f827aa4e2b3120dbe2fd97ba3014511b1762f6cfab646343c33afcfbc1\": rpc error: code = NotFound desc = could not find container \"f0c919f827aa4e2b3120dbe2fd97ba3014511b1762f6cfab646343c33afcfbc1\": container with ID starting with f0c919f827aa4e2b3120dbe2fd97ba3014511b1762f6cfab646343c33afcfbc1 not found: ID does not exist" Dec 05 21:08:35 crc kubenswrapper[4904]: I1205 21:08:35.693759 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ab087e3-b5ae-4e70-a694-47cbacc5fbf0" path="/var/lib/kubelet/pods/8ab087e3-b5ae-4e70-a694-47cbacc5fbf0/volumes" Dec 05 21:08:49 crc kubenswrapper[4904]: I1205 21:08:49.682153 4904 scope.go:117] "RemoveContainer" containerID="80eef27667d8a24d40a49bd4a7aa8dff94a82a2199c9c1742d158b7a2afd2a6e" Dec 05 21:08:49 crc kubenswrapper[4904]: E1205 21:08:49.700942 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:09:04 crc kubenswrapper[4904]: I1205 21:09:04.682374 4904 scope.go:117] "RemoveContainer" containerID="80eef27667d8a24d40a49bd4a7aa8dff94a82a2199c9c1742d158b7a2afd2a6e" 
Dec 05 21:09:05 crc kubenswrapper[4904]: I1205 21:09:05.743284 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" event={"ID":"1cc24b64-e25f-4b55-9123-295388685e7a","Type":"ContainerStarted","Data":"ced88b4c08e1445d34b2049c866903df4f80cb41b9458bfb601c5667c60485a6"} Dec 05 21:10:53 crc kubenswrapper[4904]: I1205 21:10:53.946589 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xgb8b"] Dec 05 21:10:53 crc kubenswrapper[4904]: E1205 21:10:53.948342 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ab087e3-b5ae-4e70-a694-47cbacc5fbf0" containerName="extract-content" Dec 05 21:10:53 crc kubenswrapper[4904]: I1205 21:10:53.948361 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ab087e3-b5ae-4e70-a694-47cbacc5fbf0" containerName="extract-content" Dec 05 21:10:53 crc kubenswrapper[4904]: E1205 21:10:53.948402 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ab087e3-b5ae-4e70-a694-47cbacc5fbf0" containerName="extract-utilities" Dec 05 21:10:53 crc kubenswrapper[4904]: I1205 21:10:53.948410 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ab087e3-b5ae-4e70-a694-47cbacc5fbf0" containerName="extract-utilities" Dec 05 21:10:53 crc kubenswrapper[4904]: E1205 21:10:53.948469 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ab087e3-b5ae-4e70-a694-47cbacc5fbf0" containerName="registry-server" Dec 05 21:10:53 crc kubenswrapper[4904]: I1205 21:10:53.948478 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ab087e3-b5ae-4e70-a694-47cbacc5fbf0" containerName="registry-server" Dec 05 21:10:53 crc kubenswrapper[4904]: I1205 21:10:53.948699 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ab087e3-b5ae-4e70-a694-47cbacc5fbf0" containerName="registry-server" Dec 05 21:10:53 crc kubenswrapper[4904]: I1205 21:10:53.950992 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xgb8b" Dec 05 21:10:53 crc kubenswrapper[4904]: I1205 21:10:53.964126 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xgb8b"] Dec 05 21:10:54 crc kubenswrapper[4904]: I1205 21:10:54.101038 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd6a639d-ebdf-493f-b435-872daa83361e-catalog-content\") pod \"community-operators-xgb8b\" (UID: \"cd6a639d-ebdf-493f-b435-872daa83361e\") " pod="openshift-marketplace/community-operators-xgb8b" Dec 05 21:10:54 crc kubenswrapper[4904]: I1205 21:10:54.101131 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jffkz\" (UniqueName: \"kubernetes.io/projected/cd6a639d-ebdf-493f-b435-872daa83361e-kube-api-access-jffkz\") pod \"community-operators-xgb8b\" (UID: \"cd6a639d-ebdf-493f-b435-872daa83361e\") " pod="openshift-marketplace/community-operators-xgb8b" Dec 05 21:10:54 crc kubenswrapper[4904]: I1205 21:10:54.101159 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd6a639d-ebdf-493f-b435-872daa83361e-utilities\") pod \"community-operators-xgb8b\" (UID: \"cd6a639d-ebdf-493f-b435-872daa83361e\") " pod="openshift-marketplace/community-operators-xgb8b" Dec 05 21:10:54 crc kubenswrapper[4904]: I1205 21:10:54.202838 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd6a639d-ebdf-493f-b435-872daa83361e-catalog-content\") pod \"community-operators-xgb8b\" (UID: \"cd6a639d-ebdf-493f-b435-872daa83361e\") " pod="openshift-marketplace/community-operators-xgb8b" Dec 05 21:10:54 crc kubenswrapper[4904]: I1205 21:10:54.202906 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jffkz\" (UniqueName: \"kubernetes.io/projected/cd6a639d-ebdf-493f-b435-872daa83361e-kube-api-access-jffkz\") pod \"community-operators-xgb8b\" (UID: \"cd6a639d-ebdf-493f-b435-872daa83361e\") " pod="openshift-marketplace/community-operators-xgb8b" Dec 05 21:10:54 crc kubenswrapper[4904]: I1205 21:10:54.202932 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd6a639d-ebdf-493f-b435-872daa83361e-utilities\") pod \"community-operators-xgb8b\" (UID: \"cd6a639d-ebdf-493f-b435-872daa83361e\") " pod="openshift-marketplace/community-operators-xgb8b" Dec 05 21:10:54 crc kubenswrapper[4904]: I1205 21:10:54.203652 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd6a639d-ebdf-493f-b435-872daa83361e-catalog-content\") pod \"community-operators-xgb8b\" (UID: \"cd6a639d-ebdf-493f-b435-872daa83361e\") " pod="openshift-marketplace/community-operators-xgb8b" Dec 05 21:10:54 crc kubenswrapper[4904]: I1205 21:10:54.203668 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd6a639d-ebdf-493f-b435-872daa83361e-utilities\") pod \"community-operators-xgb8b\" (UID: \"cd6a639d-ebdf-493f-b435-872daa83361e\") " pod="openshift-marketplace/community-operators-xgb8b" Dec 05 21:10:54 crc kubenswrapper[4904]: I1205 21:10:54.230822 4904 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-jffkz\" (UniqueName: \"kubernetes.io/projected/cd6a639d-ebdf-493f-b435-872daa83361e-kube-api-access-jffkz\") pod \"community-operators-xgb8b\" (UID: \"cd6a639d-ebdf-493f-b435-872daa83361e\") " pod="openshift-marketplace/community-operators-xgb8b" Dec 05 21:10:54 crc kubenswrapper[4904]: I1205 21:10:54.288622 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xgb8b" Dec 05 21:10:54 crc kubenswrapper[4904]: I1205 21:10:54.866527 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xgb8b"] Dec 05 21:10:54 crc kubenswrapper[4904]: I1205 21:10:54.872326 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgb8b" event={"ID":"cd6a639d-ebdf-493f-b435-872daa83361e","Type":"ContainerStarted","Data":"fba58f339d77949e54609e10e57db9f1ca3ac3ddd02579bf606811ea8bbef901"} Dec 05 21:10:55 crc kubenswrapper[4904]: I1205 21:10:55.882695 4904 generic.go:334] "Generic (PLEG): container finished" podID="cd6a639d-ebdf-493f-b435-872daa83361e" containerID="bf735d4f90c714be654034da359226e468675fc4cf8a9ac02e36a8044777ab2e" exitCode=0 Dec 05 21:10:55 crc kubenswrapper[4904]: I1205 21:10:55.882817 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgb8b" event={"ID":"cd6a639d-ebdf-493f-b435-872daa83361e","Type":"ContainerDied","Data":"bf735d4f90c714be654034da359226e468675fc4cf8a9ac02e36a8044777ab2e"} Dec 05 21:10:56 crc kubenswrapper[4904]: I1205 21:10:56.903810 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgb8b" event={"ID":"cd6a639d-ebdf-493f-b435-872daa83361e","Type":"ContainerStarted","Data":"26c0c8cba8ef4523b68bea7f5250b3f2fd09117c8ecf4d754c1b2e28a9320051"} Dec 05 21:10:57 crc kubenswrapper[4904]: I1205 21:10:57.918740 4904 generic.go:334] "Generic (PLEG): container finished" podID="cd6a639d-ebdf-493f-b435-872daa83361e" containerID="26c0c8cba8ef4523b68bea7f5250b3f2fd09117c8ecf4d754c1b2e28a9320051" exitCode=0 Dec 05 21:10:57 crc kubenswrapper[4904]: I1205 21:10:57.918832 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgb8b" event={"ID":"cd6a639d-ebdf-493f-b435-872daa83361e","Type":"ContainerDied","Data":"26c0c8cba8ef4523b68bea7f5250b3f2fd09117c8ecf4d754c1b2e28a9320051"} Dec 05 21:10:58 crc kubenswrapper[4904]: I1205 21:10:58.930674 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgb8b" event={"ID":"cd6a639d-ebdf-493f-b435-872daa83361e","Type":"ContainerStarted","Data":"e6dda7f5e2e3edba7471b99bf92aac68f3db35be871a2df037b6890bd06fb5ff"} Dec 05 21:10:58 crc kubenswrapper[4904]: I1205 21:10:58.950162 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xgb8b" podStartSLOduration=3.448515349 podStartE2EDuration="5.950114458s" podCreationTimestamp="2025-12-05 21:10:53 +0000 UTC" firstStartedPulling="2025-12-05 21:10:55.884972778 +0000 UTC m=+3554.696188897" lastFinishedPulling="2025-12-05 21:10:58.386571897 +0000 UTC m=+3557.197788006" observedRunningTime="2025-12-05 21:10:58.949023417 +0000 UTC m=+3557.760239546" watchObservedRunningTime="2025-12-05 21:10:58.950114458 +0000 UTC m=+3557.761330567" Dec 05 21:11:04 crc kubenswrapper[4904]: I1205 21:11:04.290445 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/community-operators-xgb8b" Dec 05 21:11:04 crc kubenswrapper[4904]: I1205 21:11:04.291076 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xgb8b" Dec 05 21:11:04 crc kubenswrapper[4904]: I1205 21:11:04.383468 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xgb8b" Dec 05 21:11:05 crc kubenswrapper[4904]: I1205 21:11:05.058681 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xgb8b" Dec 05 21:11:05 crc kubenswrapper[4904]: I1205 21:11:05.131762 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xgb8b"] Dec 05 21:11:07 crc kubenswrapper[4904]: I1205 21:11:07.012806 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xgb8b" podUID="cd6a639d-ebdf-493f-b435-872daa83361e" containerName="registry-server" containerID="cri-o://e6dda7f5e2e3edba7471b99bf92aac68f3db35be871a2df037b6890bd06fb5ff" gracePeriod=2 Dec 05 21:11:07 crc kubenswrapper[4904]: I1205 21:11:07.542606 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xgb8b" Dec 05 21:11:07 crc kubenswrapper[4904]: I1205 21:11:07.709856 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd6a639d-ebdf-493f-b435-872daa83361e-utilities\") pod \"cd6a639d-ebdf-493f-b435-872daa83361e\" (UID: \"cd6a639d-ebdf-493f-b435-872daa83361e\") " Dec 05 21:11:07 crc kubenswrapper[4904]: I1205 21:11:07.709934 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jffkz\" (UniqueName: \"kubernetes.io/projected/cd6a639d-ebdf-493f-b435-872daa83361e-kube-api-access-jffkz\") pod \"cd6a639d-ebdf-493f-b435-872daa83361e\" (UID: \"cd6a639d-ebdf-493f-b435-872daa83361e\") " Dec 05 21:11:07 crc kubenswrapper[4904]: I1205 21:11:07.710038 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd6a639d-ebdf-493f-b435-872daa83361e-catalog-content\") pod \"cd6a639d-ebdf-493f-b435-872daa83361e\" (UID: \"cd6a639d-ebdf-493f-b435-872daa83361e\") " Dec 05 21:11:07 crc kubenswrapper[4904]: I1205 21:11:07.711631 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd6a639d-ebdf-493f-b435-872daa83361e-utilities" (OuterVolumeSpecName: "utilities") pod "cd6a639d-ebdf-493f-b435-872daa83361e" (UID: "cd6a639d-ebdf-493f-b435-872daa83361e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:11:07 crc kubenswrapper[4904]: I1205 21:11:07.720330 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd6a639d-ebdf-493f-b435-872daa83361e-kube-api-access-jffkz" (OuterVolumeSpecName: "kube-api-access-jffkz") pod "cd6a639d-ebdf-493f-b435-872daa83361e" (UID: "cd6a639d-ebdf-493f-b435-872daa83361e"). InnerVolumeSpecName "kube-api-access-jffkz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:11:07 crc kubenswrapper[4904]: I1205 21:11:07.761785 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd6a639d-ebdf-493f-b435-872daa83361e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cd6a639d-ebdf-493f-b435-872daa83361e" (UID: "cd6a639d-ebdf-493f-b435-872daa83361e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:11:07 crc kubenswrapper[4904]: I1205 21:11:07.813019 4904 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd6a639d-ebdf-493f-b435-872daa83361e-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 21:11:07 crc kubenswrapper[4904]: I1205 21:11:07.813076 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jffkz\" (UniqueName: \"kubernetes.io/projected/cd6a639d-ebdf-493f-b435-872daa83361e-kube-api-access-jffkz\") on node \"crc\" DevicePath \"\"" Dec 05 21:11:07 crc kubenswrapper[4904]: I1205 21:11:07.813087 4904 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd6a639d-ebdf-493f-b435-872daa83361e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 21:11:08 crc kubenswrapper[4904]: I1205 21:11:08.023328 4904 generic.go:334] "Generic (PLEG): container finished" podID="cd6a639d-ebdf-493f-b435-872daa83361e" containerID="e6dda7f5e2e3edba7471b99bf92aac68f3db35be871a2df037b6890bd06fb5ff" exitCode=0 Dec 05 21:11:08 crc kubenswrapper[4904]: I1205 21:11:08.023374 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgb8b" event={"ID":"cd6a639d-ebdf-493f-b435-872daa83361e","Type":"ContainerDied","Data":"e6dda7f5e2e3edba7471b99bf92aac68f3db35be871a2df037b6890bd06fb5ff"} Dec 05 21:11:08 crc kubenswrapper[4904]: I1205 21:11:08.023421 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgb8b" event={"ID":"cd6a639d-ebdf-493f-b435-872daa83361e","Type":"ContainerDied","Data":"fba58f339d77949e54609e10e57db9f1ca3ac3ddd02579bf606811ea8bbef901"} Dec 05 21:11:08 crc kubenswrapper[4904]: I1205 21:11:08.023463 4904 scope.go:117] "RemoveContainer" containerID="e6dda7f5e2e3edba7471b99bf92aac68f3db35be871a2df037b6890bd06fb5ff" Dec 05 21:11:08 crc kubenswrapper[4904]: I1205 21:11:08.023479 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xgb8b" Dec 05 21:11:08 crc kubenswrapper[4904]: I1205 21:11:08.053411 4904 scope.go:117] "RemoveContainer" containerID="26c0c8cba8ef4523b68bea7f5250b3f2fd09117c8ecf4d754c1b2e28a9320051" Dec 05 21:11:08 crc kubenswrapper[4904]: I1205 21:11:08.065050 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xgb8b"] Dec 05 21:11:08 crc kubenswrapper[4904]: I1205 21:11:08.075949 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xgb8b"] Dec 05 21:11:08 crc kubenswrapper[4904]: I1205 21:11:08.103655 4904 scope.go:117] "RemoveContainer" containerID="bf735d4f90c714be654034da359226e468675fc4cf8a9ac02e36a8044777ab2e" Dec 05 21:11:08 crc kubenswrapper[4904]: I1205 21:11:08.135482 4904 scope.go:117] "RemoveContainer" containerID="e6dda7f5e2e3edba7471b99bf92aac68f3db35be871a2df037b6890bd06fb5ff" Dec 05 21:11:08 crc kubenswrapper[4904]: E1205 21:11:08.136211 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6dda7f5e2e3edba7471b99bf92aac68f3db35be871a2df037b6890bd06fb5ff\": container with ID starting with e6dda7f5e2e3edba7471b99bf92aac68f3db35be871a2df037b6890bd06fb5ff not found: ID does not exist" containerID="e6dda7f5e2e3edba7471b99bf92aac68f3db35be871a2df037b6890bd06fb5ff" Dec 05 21:11:08 crc kubenswrapper[4904]: I1205 21:11:08.136250 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6dda7f5e2e3edba7471b99bf92aac68f3db35be871a2df037b6890bd06fb5ff"} err="failed to get container status \"e6dda7f5e2e3edba7471b99bf92aac68f3db35be871a2df037b6890bd06fb5ff\": rpc error: code = NotFound desc = could not find container \"e6dda7f5e2e3edba7471b99bf92aac68f3db35be871a2df037b6890bd06fb5ff\": container with ID starting with e6dda7f5e2e3edba7471b99bf92aac68f3db35be871a2df037b6890bd06fb5ff not found: ID does not exist" Dec 05 21:11:08 crc kubenswrapper[4904]: I1205 21:11:08.136276 4904 scope.go:117] "RemoveContainer" containerID="26c0c8cba8ef4523b68bea7f5250b3f2fd09117c8ecf4d754c1b2e28a9320051" Dec 05 21:11:08 crc kubenswrapper[4904]: E1205 21:11:08.136657 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26c0c8cba8ef4523b68bea7f5250b3f2fd09117c8ecf4d754c1b2e28a9320051\": container with ID starting with 26c0c8cba8ef4523b68bea7f5250b3f2fd09117c8ecf4d754c1b2e28a9320051 not found: ID does not exist" containerID="26c0c8cba8ef4523b68bea7f5250b3f2fd09117c8ecf4d754c1b2e28a9320051" Dec 05 21:11:08 crc kubenswrapper[4904]: I1205 21:11:08.136713 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26c0c8cba8ef4523b68bea7f5250b3f2fd09117c8ecf4d754c1b2e28a9320051"} err="failed to get container status \"26c0c8cba8ef4523b68bea7f5250b3f2fd09117c8ecf4d754c1b2e28a9320051\": rpc error: code = NotFound desc = could not find container \"26c0c8cba8ef4523b68bea7f5250b3f2fd09117c8ecf4d754c1b2e28a9320051\": container with ID starting with 26c0c8cba8ef4523b68bea7f5250b3f2fd09117c8ecf4d754c1b2e28a9320051 not found: ID does not exist" Dec 05 21:11:08 crc kubenswrapper[4904]: I1205 21:11:08.136739 4904 scope.go:117] "RemoveContainer" containerID="bf735d4f90c714be654034da359226e468675fc4cf8a9ac02e36a8044777ab2e" Dec 05 21:11:08 crc kubenswrapper[4904]: E1205 21:11:08.137495 4904 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"bf735d4f90c714be654034da359226e468675fc4cf8a9ac02e36a8044777ab2e\": container with ID starting with bf735d4f90c714be654034da359226e468675fc4cf8a9ac02e36a8044777ab2e not found: ID does not exist" containerID="bf735d4f90c714be654034da359226e468675fc4cf8a9ac02e36a8044777ab2e" Dec 05 21:11:08 crc kubenswrapper[4904]: I1205 21:11:08.137521 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf735d4f90c714be654034da359226e468675fc4cf8a9ac02e36a8044777ab2e"} err="failed to get container status \"bf735d4f90c714be654034da359226e468675fc4cf8a9ac02e36a8044777ab2e\": rpc error: code = NotFound desc = could not find container \"bf735d4f90c714be654034da359226e468675fc4cf8a9ac02e36a8044777ab2e\": container with ID starting with bf735d4f90c714be654034da359226e468675fc4cf8a9ac02e36a8044777ab2e not found: ID does not exist" Dec 05 21:11:09 crc kubenswrapper[4904]: I1205 21:11:09.695628 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd6a639d-ebdf-493f-b435-872daa83361e" path="/var/lib/kubelet/pods/cd6a639d-ebdf-493f-b435-872daa83361e/volumes" Dec 05 21:11:29 crc kubenswrapper[4904]: I1205 21:11:29.955711 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:11:29 crc kubenswrapper[4904]: I1205 21:11:29.956422 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:11:59 crc kubenswrapper[4904]: I1205 21:11:59.955339 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:11:59 crc kubenswrapper[4904]: I1205 21:11:59.955966 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:12:29 crc kubenswrapper[4904]: I1205 21:12:29.955304 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:12:29 crc kubenswrapper[4904]: I1205 21:12:29.955956 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:12:29 crc kubenswrapper[4904]: I1205 21:12:29.956012 4904 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" Dec 05 21:12:29 crc kubenswrapper[4904]: I1205 21:12:29.956819 4904 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ced88b4c08e1445d34b2049c866903df4f80cb41b9458bfb601c5667c60485a6"} pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 21:12:29 crc kubenswrapper[4904]: I1205 21:12:29.956901 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" containerID="cri-o://ced88b4c08e1445d34b2049c866903df4f80cb41b9458bfb601c5667c60485a6" gracePeriod=600 Dec 05 21:12:30 crc kubenswrapper[4904]: I1205 21:12:30.911586 4904 generic.go:334] "Generic (PLEG): container finished" podID="1cc24b64-e25f-4b55-9123-295388685e7a" containerID="ced88b4c08e1445d34b2049c866903df4f80cb41b9458bfb601c5667c60485a6" exitCode=0 Dec 05 21:12:30 crc kubenswrapper[4904]: I1205 21:12:30.911671 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" event={"ID":"1cc24b64-e25f-4b55-9123-295388685e7a","Type":"ContainerDied","Data":"ced88b4c08e1445d34b2049c866903df4f80cb41b9458bfb601c5667c60485a6"} Dec 05 21:12:30 crc kubenswrapper[4904]: I1205 21:12:30.912245 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" event={"ID":"1cc24b64-e25f-4b55-9123-295388685e7a","Type":"ContainerStarted","Data":"4cb7053f25cc304645f900ef847d7694f582dfca6af71129c7a4baea97eb993a"} Dec 05 21:12:30 crc kubenswrapper[4904]: I1205 21:12:30.912269 4904 scope.go:117] "RemoveContainer" containerID="80eef27667d8a24d40a49bd4a7aa8dff94a82a2199c9c1742d158b7a2afd2a6e" Dec 05 21:14:42 crc kubenswrapper[4904]: I1205 21:14:42.595317 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xxhvt"] Dec 05 21:14:42 crc kubenswrapper[4904]: E1205 21:14:42.596385 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd6a639d-ebdf-493f-b435-872daa83361e" containerName="extract-utilities" Dec 05 21:14:42 crc kubenswrapper[4904]: I1205 21:14:42.596405 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd6a639d-ebdf-493f-b435-872daa83361e" containerName="extract-utilities" Dec 05 21:14:42 crc kubenswrapper[4904]: E1205 21:14:42.596429 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd6a639d-ebdf-493f-b435-872daa83361e" containerName="registry-server" Dec 05 21:14:42 crc kubenswrapper[4904]: I1205 21:14:42.596438 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd6a639d-ebdf-493f-b435-872daa83361e" containerName="registry-server" Dec 05 21:14:42 crc kubenswrapper[4904]: E1205 21:14:42.596456 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd6a639d-ebdf-493f-b435-872daa83361e" containerName="extract-content" Dec 05 21:14:42 crc kubenswrapper[4904]: I1205 21:14:42.596464 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd6a639d-ebdf-493f-b435-872daa83361e" containerName="extract-content" Dec 05 21:14:42 crc kubenswrapper[4904]: I1205 21:14:42.596720 4904 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="cd6a639d-ebdf-493f-b435-872daa83361e" containerName="registry-server" Dec 05 21:14:42 crc kubenswrapper[4904]: I1205 21:14:42.598619 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xxhvt" Dec 05 21:14:42 crc kubenswrapper[4904]: I1205 21:14:42.626907 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xxhvt"] Dec 05 21:14:42 crc kubenswrapper[4904]: I1205 21:14:42.750553 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cbfb312-804c-4dc7-ad04-804fdd6a9f14-utilities\") pod \"certified-operators-xxhvt\" (UID: \"3cbfb312-804c-4dc7-ad04-804fdd6a9f14\") " pod="openshift-marketplace/certified-operators-xxhvt" Dec 05 21:14:42 crc kubenswrapper[4904]: I1205 21:14:42.750684 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsnhw\" (UniqueName: \"kubernetes.io/projected/3cbfb312-804c-4dc7-ad04-804fdd6a9f14-kube-api-access-jsnhw\") pod \"certified-operators-xxhvt\" (UID: \"3cbfb312-804c-4dc7-ad04-804fdd6a9f14\") " pod="openshift-marketplace/certified-operators-xxhvt" Dec 05 21:14:42 crc kubenswrapper[4904]: I1205 21:14:42.750769 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cbfb312-804c-4dc7-ad04-804fdd6a9f14-catalog-content\") pod \"certified-operators-xxhvt\" (UID: \"3cbfb312-804c-4dc7-ad04-804fdd6a9f14\") " pod="openshift-marketplace/certified-operators-xxhvt" Dec 05 21:14:42 crc kubenswrapper[4904]: I1205 21:14:42.852561 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cbfb312-804c-4dc7-ad04-804fdd6a9f14-utilities\") pod \"certified-operators-xxhvt\" (UID: \"3cbfb312-804c-4dc7-ad04-804fdd6a9f14\") " pod="openshift-marketplace/certified-operators-xxhvt" Dec 05 21:14:42 crc kubenswrapper[4904]: I1205 21:14:42.852663 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsnhw\" (UniqueName: \"kubernetes.io/projected/3cbfb312-804c-4dc7-ad04-804fdd6a9f14-kube-api-access-jsnhw\") pod \"certified-operators-xxhvt\" (UID: \"3cbfb312-804c-4dc7-ad04-804fdd6a9f14\") " pod="openshift-marketplace/certified-operators-xxhvt" Dec 05 21:14:42 crc kubenswrapper[4904]: I1205 21:14:42.852706 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cbfb312-804c-4dc7-ad04-804fdd6a9f14-catalog-content\") pod \"certified-operators-xxhvt\" (UID: \"3cbfb312-804c-4dc7-ad04-804fdd6a9f14\") " pod="openshift-marketplace/certified-operators-xxhvt" Dec 05 21:14:42 crc kubenswrapper[4904]: I1205 21:14:42.853717 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cbfb312-804c-4dc7-ad04-804fdd6a9f14-catalog-content\") pod \"certified-operators-xxhvt\" (UID: \"3cbfb312-804c-4dc7-ad04-804fdd6a9f14\") " pod="openshift-marketplace/certified-operators-xxhvt" Dec 05 21:14:42 crc kubenswrapper[4904]: I1205 21:14:42.854299 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cbfb312-804c-4dc7-ad04-804fdd6a9f14-utilities\") pod \"certified-operators-xxhvt\" (UID: 
\"3cbfb312-804c-4dc7-ad04-804fdd6a9f14\") " pod="openshift-marketplace/certified-operators-xxhvt" Dec 05 21:14:42 crc kubenswrapper[4904]: I1205 21:14:42.872011 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsnhw\" (UniqueName: \"kubernetes.io/projected/3cbfb312-804c-4dc7-ad04-804fdd6a9f14-kube-api-access-jsnhw\") pod \"certified-operators-xxhvt\" (UID: \"3cbfb312-804c-4dc7-ad04-804fdd6a9f14\") " pod="openshift-marketplace/certified-operators-xxhvt" Dec 05 21:14:42 crc kubenswrapper[4904]: I1205 21:14:42.923796 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xxhvt" Dec 05 21:14:43 crc kubenswrapper[4904]: I1205 21:14:43.551881 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xxhvt"] Dec 05 21:14:44 crc kubenswrapper[4904]: I1205 21:14:44.494905 4904 generic.go:334] "Generic (PLEG): container finished" podID="3cbfb312-804c-4dc7-ad04-804fdd6a9f14" containerID="cae39b25766b453047fa94402ab022a2c7efbf2de60a460bd98ccff7f1153110" exitCode=0 Dec 05 21:14:44 crc kubenswrapper[4904]: I1205 21:14:44.495094 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxhvt" event={"ID":"3cbfb312-804c-4dc7-ad04-804fdd6a9f14","Type":"ContainerDied","Data":"cae39b25766b453047fa94402ab022a2c7efbf2de60a460bd98ccff7f1153110"} Dec 05 21:14:44 crc kubenswrapper[4904]: I1205 21:14:44.495405 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxhvt" event={"ID":"3cbfb312-804c-4dc7-ad04-804fdd6a9f14","Type":"ContainerStarted","Data":"1528ea67996d68f1d4058d4b162f54ab4050f0fb7ab16e4b607ed809c00922f3"} Dec 05 21:14:44 crc kubenswrapper[4904]: I1205 21:14:44.504708 4904 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 21:14:46 crc kubenswrapper[4904]: I1205 21:14:46.518309 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxhvt" event={"ID":"3cbfb312-804c-4dc7-ad04-804fdd6a9f14","Type":"ContainerStarted","Data":"e4ed89c0d13ae66377de5e54e23d06e12f6d6e8290873e9eded0cabb8bc3d118"} Dec 05 21:14:48 crc kubenswrapper[4904]: I1205 21:14:48.539352 4904 generic.go:334] "Generic (PLEG): container finished" podID="3cbfb312-804c-4dc7-ad04-804fdd6a9f14" containerID="e4ed89c0d13ae66377de5e54e23d06e12f6d6e8290873e9eded0cabb8bc3d118" exitCode=0 Dec 05 21:14:48 crc kubenswrapper[4904]: I1205 21:14:48.539481 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxhvt" event={"ID":"3cbfb312-804c-4dc7-ad04-804fdd6a9f14","Type":"ContainerDied","Data":"e4ed89c0d13ae66377de5e54e23d06e12f6d6e8290873e9eded0cabb8bc3d118"} Dec 05 21:14:49 crc kubenswrapper[4904]: I1205 21:14:49.553934 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxhvt" event={"ID":"3cbfb312-804c-4dc7-ad04-804fdd6a9f14","Type":"ContainerStarted","Data":"648a9b7dfb876de78ad9b144e0aa6ef74db9c52c69e82905facd10c8b47e52f2"} Dec 05 21:14:49 crc kubenswrapper[4904]: I1205 21:14:49.578815 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xxhvt" podStartSLOduration=2.986738538 podStartE2EDuration="7.578767288s" podCreationTimestamp="2025-12-05 21:14:42 +0000 UTC" firstStartedPulling="2025-12-05 21:14:44.504387394 +0000 UTC m=+3783.315603513" 
lastFinishedPulling="2025-12-05 21:14:49.096416154 +0000 UTC m=+3787.907632263" observedRunningTime="2025-12-05 21:14:49.575351785 +0000 UTC m=+3788.386567904" watchObservedRunningTime="2025-12-05 21:14:49.578767288 +0000 UTC m=+3788.389983397" Dec 05 21:14:52 crc kubenswrapper[4904]: I1205 21:14:52.924664 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xxhvt" Dec 05 21:14:52 crc kubenswrapper[4904]: I1205 21:14:52.925259 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xxhvt" Dec 05 21:14:52 crc kubenswrapper[4904]: I1205 21:14:52.975372 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xxhvt" Dec 05 21:14:59 crc kubenswrapper[4904]: I1205 21:14:59.955640 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:14:59 crc kubenswrapper[4904]: I1205 21:14:59.956149 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:15:00 crc kubenswrapper[4904]: I1205 21:15:00.162322 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416155-8tvgm"] Dec 05 21:15:00 crc kubenswrapper[4904]: I1205 21:15:00.164018 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-8tvgm" Dec 05 21:15:00 crc kubenswrapper[4904]: I1205 21:15:00.166534 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 21:15:00 crc kubenswrapper[4904]: I1205 21:15:00.166887 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 21:15:00 crc kubenswrapper[4904]: I1205 21:15:00.174020 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416155-8tvgm"] Dec 05 21:15:00 crc kubenswrapper[4904]: I1205 21:15:00.268455 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lgmx\" (UniqueName: \"kubernetes.io/projected/2840d019-94a2-4759-b8d5-e8a244032a25-kube-api-access-7lgmx\") pod \"collect-profiles-29416155-8tvgm\" (UID: \"2840d019-94a2-4759-b8d5-e8a244032a25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-8tvgm" Dec 05 21:15:00 crc kubenswrapper[4904]: I1205 21:15:00.268726 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2840d019-94a2-4759-b8d5-e8a244032a25-secret-volume\") pod \"collect-profiles-29416155-8tvgm\" (UID: \"2840d019-94a2-4759-b8d5-e8a244032a25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-8tvgm" Dec 05 21:15:00 crc kubenswrapper[4904]: I1205 21:15:00.268913 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2840d019-94a2-4759-b8d5-e8a244032a25-config-volume\") pod \"collect-profiles-29416155-8tvgm\" (UID: \"2840d019-94a2-4759-b8d5-e8a244032a25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-8tvgm" Dec 05 21:15:00 crc kubenswrapper[4904]: I1205 21:15:00.370935 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lgmx\" (UniqueName: \"kubernetes.io/projected/2840d019-94a2-4759-b8d5-e8a244032a25-kube-api-access-7lgmx\") pod \"collect-profiles-29416155-8tvgm\" (UID: \"2840d019-94a2-4759-b8d5-e8a244032a25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-8tvgm" Dec 05 21:15:00 crc kubenswrapper[4904]: I1205 21:15:00.371032 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2840d019-94a2-4759-b8d5-e8a244032a25-secret-volume\") pod \"collect-profiles-29416155-8tvgm\" (UID: \"2840d019-94a2-4759-b8d5-e8a244032a25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-8tvgm" Dec 05 21:15:00 crc kubenswrapper[4904]: I1205 21:15:00.371131 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2840d019-94a2-4759-b8d5-e8a244032a25-config-volume\") pod \"collect-profiles-29416155-8tvgm\" (UID: \"2840d019-94a2-4759-b8d5-e8a244032a25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-8tvgm" Dec 05 21:15:00 crc kubenswrapper[4904]: I1205 21:15:00.372086 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2840d019-94a2-4759-b8d5-e8a244032a25-config-volume\") pod 
\"collect-profiles-29416155-8tvgm\" (UID: \"2840d019-94a2-4759-b8d5-e8a244032a25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-8tvgm" Dec 05 21:15:00 crc kubenswrapper[4904]: I1205 21:15:00.377996 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2840d019-94a2-4759-b8d5-e8a244032a25-secret-volume\") pod \"collect-profiles-29416155-8tvgm\" (UID: \"2840d019-94a2-4759-b8d5-e8a244032a25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-8tvgm" Dec 05 21:15:00 crc kubenswrapper[4904]: I1205 21:15:00.388196 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lgmx\" (UniqueName: \"kubernetes.io/projected/2840d019-94a2-4759-b8d5-e8a244032a25-kube-api-access-7lgmx\") pod \"collect-profiles-29416155-8tvgm\" (UID: \"2840d019-94a2-4759-b8d5-e8a244032a25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-8tvgm" Dec 05 21:15:00 crc kubenswrapper[4904]: I1205 21:15:00.534427 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-8tvgm" Dec 05 21:15:01 crc kubenswrapper[4904]: I1205 21:15:01.008135 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416155-8tvgm"] Dec 05 21:15:01 crc kubenswrapper[4904]: I1205 21:15:01.668828 4904 generic.go:334] "Generic (PLEG): container finished" podID="2840d019-94a2-4759-b8d5-e8a244032a25" containerID="e7ea9c13048f61335723b52c60180c2780577777665c73afa0781e9a297a144d" exitCode=0 Dec 05 21:15:01 crc kubenswrapper[4904]: I1205 21:15:01.668915 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-8tvgm" event={"ID":"2840d019-94a2-4759-b8d5-e8a244032a25","Type":"ContainerDied","Data":"e7ea9c13048f61335723b52c60180c2780577777665c73afa0781e9a297a144d"} Dec 05 21:15:01 crc kubenswrapper[4904]: I1205 21:15:01.669373 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-8tvgm" event={"ID":"2840d019-94a2-4759-b8d5-e8a244032a25","Type":"ContainerStarted","Data":"bfbd6205d11f81ebfdf8fc22b3639927ad40c0740e466457e86796f538c0145d"} Dec 05 21:15:02 crc kubenswrapper[4904]: I1205 21:15:02.984683 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xxhvt" Dec 05 21:15:03 crc kubenswrapper[4904]: I1205 21:15:03.045217 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xxhvt"] Dec 05 21:15:03 crc kubenswrapper[4904]: I1205 21:15:03.081112 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-8tvgm" Dec 05 21:15:03 crc kubenswrapper[4904]: I1205 21:15:03.239503 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2840d019-94a2-4759-b8d5-e8a244032a25-secret-volume\") pod \"2840d019-94a2-4759-b8d5-e8a244032a25\" (UID: \"2840d019-94a2-4759-b8d5-e8a244032a25\") " Dec 05 21:15:03 crc kubenswrapper[4904]: I1205 21:15:03.239965 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lgmx\" (UniqueName: \"kubernetes.io/projected/2840d019-94a2-4759-b8d5-e8a244032a25-kube-api-access-7lgmx\") pod \"2840d019-94a2-4759-b8d5-e8a244032a25\" (UID: \"2840d019-94a2-4759-b8d5-e8a244032a25\") " Dec 05 21:15:03 crc kubenswrapper[4904]: I1205 21:15:03.240137 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2840d019-94a2-4759-b8d5-e8a244032a25-config-volume\") pod \"2840d019-94a2-4759-b8d5-e8a244032a25\" (UID: \"2840d019-94a2-4759-b8d5-e8a244032a25\") " Dec 05 21:15:03 crc kubenswrapper[4904]: I1205 21:15:03.241326 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2840d019-94a2-4759-b8d5-e8a244032a25-config-volume" (OuterVolumeSpecName: "config-volume") pod "2840d019-94a2-4759-b8d5-e8a244032a25" (UID: "2840d019-94a2-4759-b8d5-e8a244032a25"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:15:03 crc kubenswrapper[4904]: I1205 21:15:03.247456 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2840d019-94a2-4759-b8d5-e8a244032a25-kube-api-access-7lgmx" (OuterVolumeSpecName: "kube-api-access-7lgmx") pod "2840d019-94a2-4759-b8d5-e8a244032a25" (UID: "2840d019-94a2-4759-b8d5-e8a244032a25"). InnerVolumeSpecName "kube-api-access-7lgmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:15:03 crc kubenswrapper[4904]: I1205 21:15:03.250996 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2840d019-94a2-4759-b8d5-e8a244032a25-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2840d019-94a2-4759-b8d5-e8a244032a25" (UID: "2840d019-94a2-4759-b8d5-e8a244032a25"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:15:03 crc kubenswrapper[4904]: I1205 21:15:03.344047 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lgmx\" (UniqueName: \"kubernetes.io/projected/2840d019-94a2-4759-b8d5-e8a244032a25-kube-api-access-7lgmx\") on node \"crc\" DevicePath \"\"" Dec 05 21:15:03 crc kubenswrapper[4904]: I1205 21:15:03.344106 4904 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2840d019-94a2-4759-b8d5-e8a244032a25-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 21:15:03 crc kubenswrapper[4904]: I1205 21:15:03.344117 4904 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2840d019-94a2-4759-b8d5-e8a244032a25-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 21:15:03 crc kubenswrapper[4904]: E1205 21:15:03.422553 4904 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.166:49160->38.102.83.166:45757: write tcp 38.102.83.166:49160->38.102.83.166:45757: write: broken pipe Dec 05 21:15:03 crc kubenswrapper[4904]: I1205 21:15:03.725031 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xxhvt" podUID="3cbfb312-804c-4dc7-ad04-804fdd6a9f14" containerName="registry-server" containerID="cri-o://648a9b7dfb876de78ad9b144e0aa6ef74db9c52c69e82905facd10c8b47e52f2" gracePeriod=2 Dec 05 21:15:03 crc kubenswrapper[4904]: I1205 21:15:03.725470 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-8tvgm" Dec 05 21:15:03 crc kubenswrapper[4904]: I1205 21:15:03.725628 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-8tvgm" event={"ID":"2840d019-94a2-4759-b8d5-e8a244032a25","Type":"ContainerDied","Data":"bfbd6205d11f81ebfdf8fc22b3639927ad40c0740e466457e86796f538c0145d"} Dec 05 21:15:03 crc kubenswrapper[4904]: I1205 21:15:03.725668 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfbd6205d11f81ebfdf8fc22b3639927ad40c0740e466457e86796f538c0145d" Dec 05 21:15:04 crc kubenswrapper[4904]: I1205 21:15:04.158960 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416110-8hdb5"] Dec 05 21:15:04 crc kubenswrapper[4904]: I1205 21:15:04.172120 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416110-8hdb5"] Dec 05 21:15:04 crc kubenswrapper[4904]: I1205 21:15:04.218892 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xxhvt" Dec 05 21:15:04 crc kubenswrapper[4904]: I1205 21:15:04.370161 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cbfb312-804c-4dc7-ad04-804fdd6a9f14-catalog-content\") pod \"3cbfb312-804c-4dc7-ad04-804fdd6a9f14\" (UID: \"3cbfb312-804c-4dc7-ad04-804fdd6a9f14\") " Dec 05 21:15:04 crc kubenswrapper[4904]: I1205 21:15:04.370372 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cbfb312-804c-4dc7-ad04-804fdd6a9f14-utilities\") pod \"3cbfb312-804c-4dc7-ad04-804fdd6a9f14\" (UID: \"3cbfb312-804c-4dc7-ad04-804fdd6a9f14\") " Dec 05 21:15:04 crc kubenswrapper[4904]: I1205 21:15:04.370520 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsnhw\" (UniqueName: \"kubernetes.io/projected/3cbfb312-804c-4dc7-ad04-804fdd6a9f14-kube-api-access-jsnhw\") pod \"3cbfb312-804c-4dc7-ad04-804fdd6a9f14\" (UID: \"3cbfb312-804c-4dc7-ad04-804fdd6a9f14\") " Dec 05 21:15:04 crc kubenswrapper[4904]: I1205 21:15:04.371830 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cbfb312-804c-4dc7-ad04-804fdd6a9f14-utilities" (OuterVolumeSpecName: "utilities") pod "3cbfb312-804c-4dc7-ad04-804fdd6a9f14" (UID: "3cbfb312-804c-4dc7-ad04-804fdd6a9f14"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:15:04 crc kubenswrapper[4904]: I1205 21:15:04.376695 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cbfb312-804c-4dc7-ad04-804fdd6a9f14-kube-api-access-jsnhw" (OuterVolumeSpecName: "kube-api-access-jsnhw") pod "3cbfb312-804c-4dc7-ad04-804fdd6a9f14" (UID: "3cbfb312-804c-4dc7-ad04-804fdd6a9f14"). InnerVolumeSpecName "kube-api-access-jsnhw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:15:04 crc kubenswrapper[4904]: I1205 21:15:04.421263 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cbfb312-804c-4dc7-ad04-804fdd6a9f14-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3cbfb312-804c-4dc7-ad04-804fdd6a9f14" (UID: "3cbfb312-804c-4dc7-ad04-804fdd6a9f14"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:15:04 crc kubenswrapper[4904]: I1205 21:15:04.473165 4904 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cbfb312-804c-4dc7-ad04-804fdd6a9f14-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 21:15:04 crc kubenswrapper[4904]: I1205 21:15:04.473486 4904 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cbfb312-804c-4dc7-ad04-804fdd6a9f14-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 21:15:04 crc kubenswrapper[4904]: I1205 21:15:04.473626 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsnhw\" (UniqueName: \"kubernetes.io/projected/3cbfb312-804c-4dc7-ad04-804fdd6a9f14-kube-api-access-jsnhw\") on node \"crc\" DevicePath \"\"" Dec 05 21:15:04 crc kubenswrapper[4904]: I1205 21:15:04.742421 4904 generic.go:334] "Generic (PLEG): container finished" podID="3cbfb312-804c-4dc7-ad04-804fdd6a9f14" containerID="648a9b7dfb876de78ad9b144e0aa6ef74db9c52c69e82905facd10c8b47e52f2" exitCode=0 Dec 05 21:15:04 crc kubenswrapper[4904]: I1205 21:15:04.742479 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xxhvt" Dec 05 21:15:04 crc kubenswrapper[4904]: I1205 21:15:04.742479 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxhvt" event={"ID":"3cbfb312-804c-4dc7-ad04-804fdd6a9f14","Type":"ContainerDied","Data":"648a9b7dfb876de78ad9b144e0aa6ef74db9c52c69e82905facd10c8b47e52f2"} Dec 05 21:15:04 crc kubenswrapper[4904]: I1205 21:15:04.742908 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxhvt" event={"ID":"3cbfb312-804c-4dc7-ad04-804fdd6a9f14","Type":"ContainerDied","Data":"1528ea67996d68f1d4058d4b162f54ab4050f0fb7ab16e4b607ed809c00922f3"} Dec 05 21:15:04 crc kubenswrapper[4904]: I1205 21:15:04.742980 4904 scope.go:117] "RemoveContainer" containerID="648a9b7dfb876de78ad9b144e0aa6ef74db9c52c69e82905facd10c8b47e52f2" Dec 05 21:15:04 crc kubenswrapper[4904]: I1205 21:15:04.798625 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xxhvt"] Dec 05 21:15:04 crc kubenswrapper[4904]: I1205 21:15:04.805919 4904 scope.go:117] "RemoveContainer" containerID="e4ed89c0d13ae66377de5e54e23d06e12f6d6e8290873e9eded0cabb8bc3d118" Dec 05 21:15:04 crc kubenswrapper[4904]: I1205 21:15:04.808366 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xxhvt"] Dec 05 21:15:04 crc kubenswrapper[4904]: I1205 21:15:04.827886 4904 scope.go:117] "RemoveContainer" containerID="cae39b25766b453047fa94402ab022a2c7efbf2de60a460bd98ccff7f1153110" Dec 05 21:15:04 crc kubenswrapper[4904]: I1205 21:15:04.878592 4904 scope.go:117] "RemoveContainer" containerID="648a9b7dfb876de78ad9b144e0aa6ef74db9c52c69e82905facd10c8b47e52f2" Dec 05 21:15:04 crc kubenswrapper[4904]: E1205 21:15:04.879178 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"648a9b7dfb876de78ad9b144e0aa6ef74db9c52c69e82905facd10c8b47e52f2\": container with ID starting with 648a9b7dfb876de78ad9b144e0aa6ef74db9c52c69e82905facd10c8b47e52f2 not found: ID does not exist" containerID="648a9b7dfb876de78ad9b144e0aa6ef74db9c52c69e82905facd10c8b47e52f2" Dec 05 21:15:04 crc kubenswrapper[4904]: I1205 21:15:04.879225 
4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"648a9b7dfb876de78ad9b144e0aa6ef74db9c52c69e82905facd10c8b47e52f2"} err="failed to get container status \"648a9b7dfb876de78ad9b144e0aa6ef74db9c52c69e82905facd10c8b47e52f2\": rpc error: code = NotFound desc = could not find container \"648a9b7dfb876de78ad9b144e0aa6ef74db9c52c69e82905facd10c8b47e52f2\": container with ID starting with 648a9b7dfb876de78ad9b144e0aa6ef74db9c52c69e82905facd10c8b47e52f2 not found: ID does not exist" Dec 05 21:15:04 crc kubenswrapper[4904]: I1205 21:15:04.879257 4904 scope.go:117] "RemoveContainer" containerID="e4ed89c0d13ae66377de5e54e23d06e12f6d6e8290873e9eded0cabb8bc3d118" Dec 05 21:15:04 crc kubenswrapper[4904]: E1205 21:15:04.879639 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4ed89c0d13ae66377de5e54e23d06e12f6d6e8290873e9eded0cabb8bc3d118\": container with ID starting with e4ed89c0d13ae66377de5e54e23d06e12f6d6e8290873e9eded0cabb8bc3d118 not found: ID does not exist" containerID="e4ed89c0d13ae66377de5e54e23d06e12f6d6e8290873e9eded0cabb8bc3d118" Dec 05 21:15:04 crc kubenswrapper[4904]: I1205 21:15:04.879674 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4ed89c0d13ae66377de5e54e23d06e12f6d6e8290873e9eded0cabb8bc3d118"} err="failed to get container status \"e4ed89c0d13ae66377de5e54e23d06e12f6d6e8290873e9eded0cabb8bc3d118\": rpc error: code = NotFound desc = could not find container \"e4ed89c0d13ae66377de5e54e23d06e12f6d6e8290873e9eded0cabb8bc3d118\": container with ID starting with e4ed89c0d13ae66377de5e54e23d06e12f6d6e8290873e9eded0cabb8bc3d118 not found: ID does not exist" Dec 05 21:15:04 crc kubenswrapper[4904]: I1205 21:15:04.879709 4904 scope.go:117] "RemoveContainer" containerID="cae39b25766b453047fa94402ab022a2c7efbf2de60a460bd98ccff7f1153110" Dec 05 21:15:04 crc kubenswrapper[4904]: E1205 21:15:04.880030 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cae39b25766b453047fa94402ab022a2c7efbf2de60a460bd98ccff7f1153110\": container with ID starting with cae39b25766b453047fa94402ab022a2c7efbf2de60a460bd98ccff7f1153110 not found: ID does not exist" containerID="cae39b25766b453047fa94402ab022a2c7efbf2de60a460bd98ccff7f1153110" Dec 05 21:15:04 crc kubenswrapper[4904]: I1205 21:15:04.880097 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cae39b25766b453047fa94402ab022a2c7efbf2de60a460bd98ccff7f1153110"} err="failed to get container status \"cae39b25766b453047fa94402ab022a2c7efbf2de60a460bd98ccff7f1153110\": rpc error: code = NotFound desc = could not find container \"cae39b25766b453047fa94402ab022a2c7efbf2de60a460bd98ccff7f1153110\": container with ID starting with cae39b25766b453047fa94402ab022a2c7efbf2de60a460bd98ccff7f1153110 not found: ID does not exist" Dec 05 21:15:05 crc kubenswrapper[4904]: I1205 21:15:05.711556 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cbfb312-804c-4dc7-ad04-804fdd6a9f14" path="/var/lib/kubelet/pods/3cbfb312-804c-4dc7-ad04-804fdd6a9f14/volumes" Dec 05 21:15:05 crc kubenswrapper[4904]: I1205 21:15:05.712986 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1593c4e-1330-4e15-a120-f132191a52ab" path="/var/lib/kubelet/pods/f1593c4e-1330-4e15-a120-f132191a52ab/volumes" Dec 05 21:15:20 crc kubenswrapper[4904]: I1205 
21:15:20.630345 4904 scope.go:117] "RemoveContainer" containerID="2757697cb82e3ebb38bbfaca5779caec129016b4d520b956c2a36179d5e0f826" Dec 05 21:15:29 crc kubenswrapper[4904]: I1205 21:15:29.955949 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:15:29 crc kubenswrapper[4904]: I1205 21:15:29.956693 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:15:59 crc kubenswrapper[4904]: I1205 21:15:59.955497 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:15:59 crc kubenswrapper[4904]: I1205 21:15:59.955804 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:15:59 crc kubenswrapper[4904]: I1205 21:15:59.955851 4904 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" Dec 05 21:15:59 crc kubenswrapper[4904]: I1205 21:15:59.956688 4904 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4cb7053f25cc304645f900ef847d7694f582dfca6af71129c7a4baea97eb993a"} pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 21:15:59 crc kubenswrapper[4904]: I1205 21:15:59.956747 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" containerID="cri-o://4cb7053f25cc304645f900ef847d7694f582dfca6af71129c7a4baea97eb993a" gracePeriod=600 Dec 05 21:16:00 crc kubenswrapper[4904]: E1205 21:16:00.089554 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:16:00 crc kubenswrapper[4904]: I1205 21:16:00.339075 4904 generic.go:334] "Generic (PLEG): container finished" podID="1cc24b64-e25f-4b55-9123-295388685e7a" containerID="4cb7053f25cc304645f900ef847d7694f582dfca6af71129c7a4baea97eb993a" exitCode=0 Dec 05 21:16:00 crc kubenswrapper[4904]: I1205 21:16:00.339103 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" event={"ID":"1cc24b64-e25f-4b55-9123-295388685e7a","Type":"ContainerDied","Data":"4cb7053f25cc304645f900ef847d7694f582dfca6af71129c7a4baea97eb993a"} Dec 05 21:16:00 crc kubenswrapper[4904]: I1205 21:16:00.339416 4904 scope.go:117] "RemoveContainer" containerID="ced88b4c08e1445d34b2049c866903df4f80cb41b9458bfb601c5667c60485a6" Dec 05 21:16:00 crc kubenswrapper[4904]: I1205 21:16:00.340349 4904 scope.go:117] "RemoveContainer" containerID="4cb7053f25cc304645f900ef847d7694f582dfca6af71129c7a4baea97eb993a" Dec 05 21:16:00 crc kubenswrapper[4904]: E1205 21:16:00.340717 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:16:13 crc kubenswrapper[4904]: I1205 21:16:13.682715 4904 scope.go:117] "RemoveContainer" containerID="4cb7053f25cc304645f900ef847d7694f582dfca6af71129c7a4baea97eb993a" Dec 05 21:16:13 crc kubenswrapper[4904]: E1205 21:16:13.683772 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:16:26 crc kubenswrapper[4904]: I1205 21:16:26.682585 4904 scope.go:117] "RemoveContainer" containerID="4cb7053f25cc304645f900ef847d7694f582dfca6af71129c7a4baea97eb993a" Dec 05 21:16:26 crc kubenswrapper[4904]: E1205 21:16:26.683520 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:16:37 crc kubenswrapper[4904]: I1205 21:16:37.681990 4904 scope.go:117] "RemoveContainer" containerID="4cb7053f25cc304645f900ef847d7694f582dfca6af71129c7a4baea97eb993a" Dec 05 21:16:37 crc kubenswrapper[4904]: E1205 21:16:37.683033 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:16:50 crc kubenswrapper[4904]: I1205 21:16:50.681622 4904 scope.go:117] "RemoveContainer" containerID="4cb7053f25cc304645f900ef847d7694f582dfca6af71129c7a4baea97eb993a" Dec 05 21:16:50 crc kubenswrapper[4904]: E1205 21:16:50.682562 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:17:02 crc kubenswrapper[4904]: I1205 21:17:02.681212 4904 scope.go:117] "RemoveContainer" containerID="4cb7053f25cc304645f900ef847d7694f582dfca6af71129c7a4baea97eb993a" Dec 05 21:17:02 crc kubenswrapper[4904]: E1205 21:17:02.682236 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:17:15 crc kubenswrapper[4904]: I1205 21:17:15.681993 4904 scope.go:117] "RemoveContainer" containerID="4cb7053f25cc304645f900ef847d7694f582dfca6af71129c7a4baea97eb993a" Dec 05 21:17:15 crc kubenswrapper[4904]: E1205 21:17:15.682995 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:17:29 crc kubenswrapper[4904]: I1205 21:17:29.682496 4904 scope.go:117] "RemoveContainer" containerID="4cb7053f25cc304645f900ef847d7694f582dfca6af71129c7a4baea97eb993a" Dec 05 21:17:29 crc kubenswrapper[4904]: E1205 21:17:29.685042 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:17:42 crc kubenswrapper[4904]: I1205 21:17:42.681867 4904 scope.go:117] "RemoveContainer" containerID="4cb7053f25cc304645f900ef847d7694f582dfca6af71129c7a4baea97eb993a" Dec 05 21:17:42 crc kubenswrapper[4904]: E1205 21:17:42.682656 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:17:55 crc kubenswrapper[4904]: I1205 21:17:55.681681 4904 scope.go:117] "RemoveContainer" containerID="4cb7053f25cc304645f900ef847d7694f582dfca6af71129c7a4baea97eb993a" Dec 05 21:17:55 crc kubenswrapper[4904]: E1205 21:17:55.682559 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" 
podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:18:09 crc kubenswrapper[4904]: I1205 21:18:09.681905 4904 scope.go:117] "RemoveContainer" containerID="4cb7053f25cc304645f900ef847d7694f582dfca6af71129c7a4baea97eb993a" Dec 05 21:18:09 crc kubenswrapper[4904]: E1205 21:18:09.682688 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:18:20 crc kubenswrapper[4904]: I1205 21:18:20.682405 4904 scope.go:117] "RemoveContainer" containerID="4cb7053f25cc304645f900ef847d7694f582dfca6af71129c7a4baea97eb993a" Dec 05 21:18:20 crc kubenswrapper[4904]: E1205 21:18:20.683281 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:18:31 crc kubenswrapper[4904]: I1205 21:18:31.691248 4904 scope.go:117] "RemoveContainer" containerID="4cb7053f25cc304645f900ef847d7694f582dfca6af71129c7a4baea97eb993a" Dec 05 21:18:31 crc kubenswrapper[4904]: E1205 21:18:31.692415 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:18:45 crc kubenswrapper[4904]: I1205 21:18:45.681117 4904 scope.go:117] "RemoveContainer" containerID="4cb7053f25cc304645f900ef847d7694f582dfca6af71129c7a4baea97eb993a" Dec 05 21:18:45 crc kubenswrapper[4904]: E1205 21:18:45.681912 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:18:56 crc kubenswrapper[4904]: I1205 21:18:56.681740 4904 scope.go:117] "RemoveContainer" containerID="4cb7053f25cc304645f900ef847d7694f582dfca6af71129c7a4baea97eb993a" Dec 05 21:18:56 crc kubenswrapper[4904]: E1205 21:18:56.682660 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:19:10 crc kubenswrapper[4904]: I1205 21:19:10.682210 4904 scope.go:117] "RemoveContainer" 
containerID="4cb7053f25cc304645f900ef847d7694f582dfca6af71129c7a4baea97eb993a" Dec 05 21:19:10 crc kubenswrapper[4904]: E1205 21:19:10.683268 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:19:21 crc kubenswrapper[4904]: I1205 21:19:21.698581 4904 scope.go:117] "RemoveContainer" containerID="4cb7053f25cc304645f900ef847d7694f582dfca6af71129c7a4baea97eb993a" Dec 05 21:19:21 crc kubenswrapper[4904]: E1205 21:19:21.699848 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:19:35 crc kubenswrapper[4904]: I1205 21:19:35.682985 4904 scope.go:117] "RemoveContainer" containerID="4cb7053f25cc304645f900ef847d7694f582dfca6af71129c7a4baea97eb993a" Dec 05 21:19:35 crc kubenswrapper[4904]: E1205 21:19:35.683903 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:19:50 crc kubenswrapper[4904]: I1205 21:19:50.681989 4904 scope.go:117] "RemoveContainer" containerID="4cb7053f25cc304645f900ef847d7694f582dfca6af71129c7a4baea97eb993a" Dec 05 21:19:50 crc kubenswrapper[4904]: E1205 21:19:50.685211 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:20:04 crc kubenswrapper[4904]: I1205 21:20:04.681491 4904 scope.go:117] "RemoveContainer" containerID="4cb7053f25cc304645f900ef847d7694f582dfca6af71129c7a4baea97eb993a" Dec 05 21:20:04 crc kubenswrapper[4904]: E1205 21:20:04.682285 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:20:16 crc kubenswrapper[4904]: I1205 21:20:16.682479 4904 scope.go:117] "RemoveContainer" containerID="4cb7053f25cc304645f900ef847d7694f582dfca6af71129c7a4baea97eb993a" Dec 05 21:20:16 crc kubenswrapper[4904]: E1205 21:20:16.685451 4904 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:20:30 crc kubenswrapper[4904]: I1205 21:20:30.681723 4904 scope.go:117] "RemoveContainer" containerID="4cb7053f25cc304645f900ef847d7694f582dfca6af71129c7a4baea97eb993a" Dec 05 21:20:30 crc kubenswrapper[4904]: E1205 21:20:30.682532 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:20:41 crc kubenswrapper[4904]: I1205 21:20:41.682003 4904 scope.go:117] "RemoveContainer" containerID="4cb7053f25cc304645f900ef847d7694f582dfca6af71129c7a4baea97eb993a" Dec 05 21:20:41 crc kubenswrapper[4904]: E1205 21:20:41.683506 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:20:52 crc kubenswrapper[4904]: I1205 21:20:52.681216 4904 scope.go:117] "RemoveContainer" containerID="4cb7053f25cc304645f900ef847d7694f582dfca6af71129c7a4baea97eb993a" Dec 05 21:20:52 crc kubenswrapper[4904]: E1205 21:20:52.682117 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:21:05 crc kubenswrapper[4904]: I1205 21:21:05.681199 4904 scope.go:117] "RemoveContainer" containerID="4cb7053f25cc304645f900ef847d7694f582dfca6af71129c7a4baea97eb993a" Dec 05 21:21:06 crc kubenswrapper[4904]: I1205 21:21:06.727203 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" event={"ID":"1cc24b64-e25f-4b55-9123-295388685e7a","Type":"ContainerStarted","Data":"d62c227a92a254f712e9ea222dc26382d4779b1f84435e9ea09a651adb331e30"} Dec 05 21:21:42 crc kubenswrapper[4904]: I1205 21:21:42.225007 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cvs4t"] Dec 05 21:21:42 crc kubenswrapper[4904]: E1205 21:21:42.226026 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cbfb312-804c-4dc7-ad04-804fdd6a9f14" containerName="extract-content" Dec 05 21:21:42 crc kubenswrapper[4904]: I1205 21:21:42.226041 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cbfb312-804c-4dc7-ad04-804fdd6a9f14" containerName="extract-content" Dec 05 21:21:42 crc kubenswrapper[4904]: E1205 21:21:42.228690 4904 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2840d019-94a2-4759-b8d5-e8a244032a25" containerName="collect-profiles" Dec 05 21:21:42 crc kubenswrapper[4904]: I1205 21:21:42.228720 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="2840d019-94a2-4759-b8d5-e8a244032a25" containerName="collect-profiles" Dec 05 21:21:42 crc kubenswrapper[4904]: E1205 21:21:42.228756 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cbfb312-804c-4dc7-ad04-804fdd6a9f14" containerName="registry-server" Dec 05 21:21:42 crc kubenswrapper[4904]: I1205 21:21:42.228772 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cbfb312-804c-4dc7-ad04-804fdd6a9f14" containerName="registry-server" Dec 05 21:21:42 crc kubenswrapper[4904]: E1205 21:21:42.228790 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cbfb312-804c-4dc7-ad04-804fdd6a9f14" containerName="extract-utilities" Dec 05 21:21:42 crc kubenswrapper[4904]: I1205 21:21:42.228798 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cbfb312-804c-4dc7-ad04-804fdd6a9f14" containerName="extract-utilities" Dec 05 21:21:42 crc kubenswrapper[4904]: I1205 21:21:42.229219 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cbfb312-804c-4dc7-ad04-804fdd6a9f14" containerName="registry-server" Dec 05 21:21:42 crc kubenswrapper[4904]: I1205 21:21:42.229244 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="2840d019-94a2-4759-b8d5-e8a244032a25" containerName="collect-profiles" Dec 05 21:21:42 crc kubenswrapper[4904]: I1205 21:21:42.231172 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cvs4t" Dec 05 21:21:42 crc kubenswrapper[4904]: I1205 21:21:42.254378 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cvs4t"] Dec 05 21:21:42 crc kubenswrapper[4904]: I1205 21:21:42.316889 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10b0f4cc-50bd-4003-8359-c1be784152ea-utilities\") pod \"community-operators-cvs4t\" (UID: \"10b0f4cc-50bd-4003-8359-c1be784152ea\") " pod="openshift-marketplace/community-operators-cvs4t" Dec 05 21:21:42 crc kubenswrapper[4904]: I1205 21:21:42.316989 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lflw5\" (UniqueName: \"kubernetes.io/projected/10b0f4cc-50bd-4003-8359-c1be784152ea-kube-api-access-lflw5\") pod \"community-operators-cvs4t\" (UID: \"10b0f4cc-50bd-4003-8359-c1be784152ea\") " pod="openshift-marketplace/community-operators-cvs4t" Dec 05 21:21:42 crc kubenswrapper[4904]: I1205 21:21:42.317053 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10b0f4cc-50bd-4003-8359-c1be784152ea-catalog-content\") pod \"community-operators-cvs4t\" (UID: \"10b0f4cc-50bd-4003-8359-c1be784152ea\") " pod="openshift-marketplace/community-operators-cvs4t" Dec 05 21:21:42 crc kubenswrapper[4904]: I1205 21:21:42.419259 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lflw5\" (UniqueName: \"kubernetes.io/projected/10b0f4cc-50bd-4003-8359-c1be784152ea-kube-api-access-lflw5\") pod \"community-operators-cvs4t\" (UID: \"10b0f4cc-50bd-4003-8359-c1be784152ea\") " 
pod="openshift-marketplace/community-operators-cvs4t" Dec 05 21:21:42 crc kubenswrapper[4904]: I1205 21:21:42.419655 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10b0f4cc-50bd-4003-8359-c1be784152ea-catalog-content\") pod \"community-operators-cvs4t\" (UID: \"10b0f4cc-50bd-4003-8359-c1be784152ea\") " pod="openshift-marketplace/community-operators-cvs4t" Dec 05 21:21:42 crc kubenswrapper[4904]: I1205 21:21:42.419955 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10b0f4cc-50bd-4003-8359-c1be784152ea-utilities\") pod \"community-operators-cvs4t\" (UID: \"10b0f4cc-50bd-4003-8359-c1be784152ea\") " pod="openshift-marketplace/community-operators-cvs4t" Dec 05 21:21:42 crc kubenswrapper[4904]: I1205 21:21:42.420256 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10b0f4cc-50bd-4003-8359-c1be784152ea-catalog-content\") pod \"community-operators-cvs4t\" (UID: \"10b0f4cc-50bd-4003-8359-c1be784152ea\") " pod="openshift-marketplace/community-operators-cvs4t" Dec 05 21:21:42 crc kubenswrapper[4904]: I1205 21:21:42.420454 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10b0f4cc-50bd-4003-8359-c1be784152ea-utilities\") pod \"community-operators-cvs4t\" (UID: \"10b0f4cc-50bd-4003-8359-c1be784152ea\") " pod="openshift-marketplace/community-operators-cvs4t" Dec 05 21:21:42 crc kubenswrapper[4904]: I1205 21:21:42.441947 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lflw5\" (UniqueName: \"kubernetes.io/projected/10b0f4cc-50bd-4003-8359-c1be784152ea-kube-api-access-lflw5\") pod \"community-operators-cvs4t\" (UID: \"10b0f4cc-50bd-4003-8359-c1be784152ea\") " pod="openshift-marketplace/community-operators-cvs4t" Dec 05 21:21:42 crc kubenswrapper[4904]: I1205 21:21:42.550854 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cvs4t" Dec 05 21:21:43 crc kubenswrapper[4904]: I1205 21:21:43.115348 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cvs4t"] Dec 05 21:21:43 crc kubenswrapper[4904]: I1205 21:21:43.130010 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cvs4t" event={"ID":"10b0f4cc-50bd-4003-8359-c1be784152ea","Type":"ContainerStarted","Data":"80d94d6cb1d818e3a3ca7d61988599d829e4c2d341d72ef1dd38a8e2dcdbf5ae"} Dec 05 21:21:44 crc kubenswrapper[4904]: I1205 21:21:44.148271 4904 generic.go:334] "Generic (PLEG): container finished" podID="10b0f4cc-50bd-4003-8359-c1be784152ea" containerID="320188066e5ae545c59b3b9ca79ca0fdc05755c0cf72d079cc68176915c03bf5" exitCode=0 Dec 05 21:21:44 crc kubenswrapper[4904]: I1205 21:21:44.148460 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cvs4t" event={"ID":"10b0f4cc-50bd-4003-8359-c1be784152ea","Type":"ContainerDied","Data":"320188066e5ae545c59b3b9ca79ca0fdc05755c0cf72d079cc68176915c03bf5"} Dec 05 21:21:44 crc kubenswrapper[4904]: I1205 21:21:44.152804 4904 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 21:21:44 crc kubenswrapper[4904]: I1205 21:21:44.423337 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fxcpk"] Dec 05 21:21:44 crc kubenswrapper[4904]: I1205 21:21:44.426245 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fxcpk" Dec 05 21:21:44 crc kubenswrapper[4904]: I1205 21:21:44.477335 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fxcpk"] Dec 05 21:21:44 crc kubenswrapper[4904]: I1205 21:21:44.489266 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01fb03e7-8227-4818-91ab-a520350b5912-catalog-content\") pod \"redhat-marketplace-fxcpk\" (UID: \"01fb03e7-8227-4818-91ab-a520350b5912\") " pod="openshift-marketplace/redhat-marketplace-fxcpk" Dec 05 21:21:44 crc kubenswrapper[4904]: I1205 21:21:44.489487 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spr52\" (UniqueName: \"kubernetes.io/projected/01fb03e7-8227-4818-91ab-a520350b5912-kube-api-access-spr52\") pod \"redhat-marketplace-fxcpk\" (UID: \"01fb03e7-8227-4818-91ab-a520350b5912\") " pod="openshift-marketplace/redhat-marketplace-fxcpk" Dec 05 21:21:44 crc kubenswrapper[4904]: I1205 21:21:44.490368 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01fb03e7-8227-4818-91ab-a520350b5912-utilities\") pod \"redhat-marketplace-fxcpk\" (UID: \"01fb03e7-8227-4818-91ab-a520350b5912\") " pod="openshift-marketplace/redhat-marketplace-fxcpk" Dec 05 21:21:44 crc kubenswrapper[4904]: I1205 21:21:44.592614 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01fb03e7-8227-4818-91ab-a520350b5912-utilities\") pod \"redhat-marketplace-fxcpk\" (UID: \"01fb03e7-8227-4818-91ab-a520350b5912\") " pod="openshift-marketplace/redhat-marketplace-fxcpk" Dec 05 21:21:44 crc kubenswrapper[4904]: I1205 21:21:44.592893 4904 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01fb03e7-8227-4818-91ab-a520350b5912-catalog-content\") pod \"redhat-marketplace-fxcpk\" (UID: \"01fb03e7-8227-4818-91ab-a520350b5912\") " pod="openshift-marketplace/redhat-marketplace-fxcpk" Dec 05 21:21:44 crc kubenswrapper[4904]: I1205 21:21:44.593232 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01fb03e7-8227-4818-91ab-a520350b5912-utilities\") pod \"redhat-marketplace-fxcpk\" (UID: \"01fb03e7-8227-4818-91ab-a520350b5912\") " pod="openshift-marketplace/redhat-marketplace-fxcpk" Dec 05 21:21:44 crc kubenswrapper[4904]: I1205 21:21:44.593371 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01fb03e7-8227-4818-91ab-a520350b5912-catalog-content\") pod \"redhat-marketplace-fxcpk\" (UID: \"01fb03e7-8227-4818-91ab-a520350b5912\") " pod="openshift-marketplace/redhat-marketplace-fxcpk" Dec 05 21:21:44 crc kubenswrapper[4904]: I1205 21:21:44.593047 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spr52\" (UniqueName: \"kubernetes.io/projected/01fb03e7-8227-4818-91ab-a520350b5912-kube-api-access-spr52\") pod \"redhat-marketplace-fxcpk\" (UID: \"01fb03e7-8227-4818-91ab-a520350b5912\") " pod="openshift-marketplace/redhat-marketplace-fxcpk" Dec 05 21:21:44 crc kubenswrapper[4904]: I1205 21:21:44.615787 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spr52\" (UniqueName: \"kubernetes.io/projected/01fb03e7-8227-4818-91ab-a520350b5912-kube-api-access-spr52\") pod \"redhat-marketplace-fxcpk\" (UID: \"01fb03e7-8227-4818-91ab-a520350b5912\") " pod="openshift-marketplace/redhat-marketplace-fxcpk" Dec 05 21:21:44 crc kubenswrapper[4904]: I1205 21:21:44.634190 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-phjwj"] Dec 05 21:21:44 crc kubenswrapper[4904]: I1205 21:21:44.636823 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-phjwj" Dec 05 21:21:44 crc kubenswrapper[4904]: I1205 21:21:44.669309 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-phjwj"] Dec 05 21:21:44 crc kubenswrapper[4904]: I1205 21:21:44.695347 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc67c0ee-2e6a-4796-8fdd-7b4546722bc0-catalog-content\") pod \"redhat-operators-phjwj\" (UID: \"fc67c0ee-2e6a-4796-8fdd-7b4546722bc0\") " pod="openshift-marketplace/redhat-operators-phjwj" Dec 05 21:21:44 crc kubenswrapper[4904]: I1205 21:21:44.695664 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc67c0ee-2e6a-4796-8fdd-7b4546722bc0-utilities\") pod \"redhat-operators-phjwj\" (UID: \"fc67c0ee-2e6a-4796-8fdd-7b4546722bc0\") " pod="openshift-marketplace/redhat-operators-phjwj" Dec 05 21:21:44 crc kubenswrapper[4904]: I1205 21:21:44.695693 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6pz8\" (UniqueName: \"kubernetes.io/projected/fc67c0ee-2e6a-4796-8fdd-7b4546722bc0-kube-api-access-v6pz8\") pod \"redhat-operators-phjwj\" (UID: \"fc67c0ee-2e6a-4796-8fdd-7b4546722bc0\") " pod="openshift-marketplace/redhat-operators-phjwj" Dec 05 21:21:44 crc kubenswrapper[4904]: I1205 21:21:44.760654 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fxcpk" Dec 05 21:21:44 crc kubenswrapper[4904]: I1205 21:21:44.797610 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc67c0ee-2e6a-4796-8fdd-7b4546722bc0-catalog-content\") pod \"redhat-operators-phjwj\" (UID: \"fc67c0ee-2e6a-4796-8fdd-7b4546722bc0\") " pod="openshift-marketplace/redhat-operators-phjwj" Dec 05 21:21:44 crc kubenswrapper[4904]: I1205 21:21:44.797687 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc67c0ee-2e6a-4796-8fdd-7b4546722bc0-utilities\") pod \"redhat-operators-phjwj\" (UID: \"fc67c0ee-2e6a-4796-8fdd-7b4546722bc0\") " pod="openshift-marketplace/redhat-operators-phjwj" Dec 05 21:21:44 crc kubenswrapper[4904]: I1205 21:21:44.797719 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6pz8\" (UniqueName: \"kubernetes.io/projected/fc67c0ee-2e6a-4796-8fdd-7b4546722bc0-kube-api-access-v6pz8\") pod \"redhat-operators-phjwj\" (UID: \"fc67c0ee-2e6a-4796-8fdd-7b4546722bc0\") " pod="openshift-marketplace/redhat-operators-phjwj" Dec 05 21:21:44 crc kubenswrapper[4904]: I1205 21:21:44.798385 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc67c0ee-2e6a-4796-8fdd-7b4546722bc0-catalog-content\") pod \"redhat-operators-phjwj\" (UID: \"fc67c0ee-2e6a-4796-8fdd-7b4546722bc0\") " pod="openshift-marketplace/redhat-operators-phjwj" Dec 05 21:21:44 crc kubenswrapper[4904]: I1205 21:21:44.798499 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc67c0ee-2e6a-4796-8fdd-7b4546722bc0-utilities\") pod \"redhat-operators-phjwj\" (UID: \"fc67c0ee-2e6a-4796-8fdd-7b4546722bc0\") " 
pod="openshift-marketplace/redhat-operators-phjwj" Dec 05 21:21:44 crc kubenswrapper[4904]: I1205 21:21:44.816324 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6pz8\" (UniqueName: \"kubernetes.io/projected/fc67c0ee-2e6a-4796-8fdd-7b4546722bc0-kube-api-access-v6pz8\") pod \"redhat-operators-phjwj\" (UID: \"fc67c0ee-2e6a-4796-8fdd-7b4546722bc0\") " pod="openshift-marketplace/redhat-operators-phjwj" Dec 05 21:21:44 crc kubenswrapper[4904]: I1205 21:21:44.997885 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-phjwj" Dec 05 21:21:45 crc kubenswrapper[4904]: I1205 21:21:45.160090 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cvs4t" event={"ID":"10b0f4cc-50bd-4003-8359-c1be784152ea","Type":"ContainerStarted","Data":"2882e260a26914d86e0c4f728657e89f2802bec7c3da76d902d15e898a59e422"} Dec 05 21:21:45 crc kubenswrapper[4904]: I1205 21:21:45.285197 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fxcpk"] Dec 05 21:21:45 crc kubenswrapper[4904]: I1205 21:21:45.507029 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-phjwj"] Dec 05 21:21:45 crc kubenswrapper[4904]: W1205 21:21:45.544630 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc67c0ee_2e6a_4796_8fdd_7b4546722bc0.slice/crio-fa52fbd11f68fdfd5ae869c35adffefb7620b4cb38236dbd88c7958186c6be2f WatchSource:0}: Error finding container fa52fbd11f68fdfd5ae869c35adffefb7620b4cb38236dbd88c7958186c6be2f: Status 404 returned error can't find the container with id fa52fbd11f68fdfd5ae869c35adffefb7620b4cb38236dbd88c7958186c6be2f Dec 05 21:21:46 crc kubenswrapper[4904]: I1205 21:21:46.173109 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phjwj" event={"ID":"fc67c0ee-2e6a-4796-8fdd-7b4546722bc0","Type":"ContainerStarted","Data":"fa52fbd11f68fdfd5ae869c35adffefb7620b4cb38236dbd88c7958186c6be2f"} Dec 05 21:21:46 crc kubenswrapper[4904]: I1205 21:21:46.175176 4904 generic.go:334] "Generic (PLEG): container finished" podID="01fb03e7-8227-4818-91ab-a520350b5912" containerID="bd614b3df2dc9be21903e3a09cf5498d8f4875bcd5d5469f67576aa2e43d0892" exitCode=0 Dec 05 21:21:46 crc kubenswrapper[4904]: I1205 21:21:46.175259 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fxcpk" event={"ID":"01fb03e7-8227-4818-91ab-a520350b5912","Type":"ContainerDied","Data":"bd614b3df2dc9be21903e3a09cf5498d8f4875bcd5d5469f67576aa2e43d0892"} Dec 05 21:21:46 crc kubenswrapper[4904]: I1205 21:21:46.175400 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fxcpk" event={"ID":"01fb03e7-8227-4818-91ab-a520350b5912","Type":"ContainerStarted","Data":"a5aa61ceb3ac0669605dc9bc66c85dbbb2f85bb89215320eeb6aae48f636f1cf"} Dec 05 21:21:47 crc kubenswrapper[4904]: I1205 21:21:47.186695 4904 generic.go:334] "Generic (PLEG): container finished" podID="10b0f4cc-50bd-4003-8359-c1be784152ea" containerID="2882e260a26914d86e0c4f728657e89f2802bec7c3da76d902d15e898a59e422" exitCode=0 Dec 05 21:21:47 crc kubenswrapper[4904]: I1205 21:21:47.186763 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cvs4t" 
event={"ID":"10b0f4cc-50bd-4003-8359-c1be784152ea","Type":"ContainerDied","Data":"2882e260a26914d86e0c4f728657e89f2802bec7c3da76d902d15e898a59e422"} Dec 05 21:21:47 crc kubenswrapper[4904]: I1205 21:21:47.189304 4904 generic.go:334] "Generic (PLEG): container finished" podID="fc67c0ee-2e6a-4796-8fdd-7b4546722bc0" containerID="656c4829a90f0a8f247a32cf24113b8efc691941e58ea9fb2855e5cb14d7b955" exitCode=0 Dec 05 21:21:47 crc kubenswrapper[4904]: I1205 21:21:47.189362 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phjwj" event={"ID":"fc67c0ee-2e6a-4796-8fdd-7b4546722bc0","Type":"ContainerDied","Data":"656c4829a90f0a8f247a32cf24113b8efc691941e58ea9fb2855e5cb14d7b955"} Dec 05 21:21:47 crc kubenswrapper[4904]: I1205 21:21:47.200943 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fxcpk" event={"ID":"01fb03e7-8227-4818-91ab-a520350b5912","Type":"ContainerStarted","Data":"8b7ea32a850dbd9d24f41b03ffb2c6cf35a5d535d1659b9615432be4f6dd0eb4"} Dec 05 21:21:48 crc kubenswrapper[4904]: I1205 21:21:48.213946 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phjwj" event={"ID":"fc67c0ee-2e6a-4796-8fdd-7b4546722bc0","Type":"ContainerStarted","Data":"d456d881efbe9233c8f107429e84627bb9dcd4e3ef7b514398c84a9bdb32ea8a"} Dec 05 21:21:48 crc kubenswrapper[4904]: I1205 21:21:48.216829 4904 generic.go:334] "Generic (PLEG): container finished" podID="01fb03e7-8227-4818-91ab-a520350b5912" containerID="8b7ea32a850dbd9d24f41b03ffb2c6cf35a5d535d1659b9615432be4f6dd0eb4" exitCode=0 Dec 05 21:21:48 crc kubenswrapper[4904]: I1205 21:21:48.216883 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fxcpk" event={"ID":"01fb03e7-8227-4818-91ab-a520350b5912","Type":"ContainerDied","Data":"8b7ea32a850dbd9d24f41b03ffb2c6cf35a5d535d1659b9615432be4f6dd0eb4"} Dec 05 21:21:48 crc kubenswrapper[4904]: I1205 21:21:48.221190 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cvs4t" event={"ID":"10b0f4cc-50bd-4003-8359-c1be784152ea","Type":"ContainerStarted","Data":"76a7af83be2ed0adac9688d2ea3fee4eabbcd9517538b908835b00cc2e07daee"} Dec 05 21:21:48 crc kubenswrapper[4904]: I1205 21:21:48.289361 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cvs4t" podStartSLOduration=2.864051191 podStartE2EDuration="6.289344728s" podCreationTimestamp="2025-12-05 21:21:42 +0000 UTC" firstStartedPulling="2025-12-05 21:21:44.152123337 +0000 UTC m=+4202.963339486" lastFinishedPulling="2025-12-05 21:21:47.577416904 +0000 UTC m=+4206.388633023" observedRunningTime="2025-12-05 21:21:48.275283835 +0000 UTC m=+4207.086499954" watchObservedRunningTime="2025-12-05 21:21:48.289344728 +0000 UTC m=+4207.100560837" Dec 05 21:21:51 crc kubenswrapper[4904]: I1205 21:21:51.250647 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fxcpk" event={"ID":"01fb03e7-8227-4818-91ab-a520350b5912","Type":"ContainerStarted","Data":"b3f47171927a711c4a7cbed101546a614941d917ee880869ab0e6faa8432c3b2"} Dec 05 21:21:51 crc kubenswrapper[4904]: I1205 21:21:51.271981 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fxcpk" podStartSLOduration=3.735736234 podStartE2EDuration="7.271959107s" podCreationTimestamp="2025-12-05 21:21:44 +0000 UTC" 
firstStartedPulling="2025-12-05 21:21:46.179310388 +0000 UTC m=+4204.990526537" lastFinishedPulling="2025-12-05 21:21:49.715533301 +0000 UTC m=+4208.526749410" observedRunningTime="2025-12-05 21:21:51.266411096 +0000 UTC m=+4210.077627225" watchObservedRunningTime="2025-12-05 21:21:51.271959107 +0000 UTC m=+4210.083175216" Dec 05 21:21:52 crc kubenswrapper[4904]: I1205 21:21:52.551706 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cvs4t" Dec 05 21:21:52 crc kubenswrapper[4904]: I1205 21:21:52.554663 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cvs4t" Dec 05 21:21:53 crc kubenswrapper[4904]: I1205 21:21:53.268607 4904 generic.go:334] "Generic (PLEG): container finished" podID="fc67c0ee-2e6a-4796-8fdd-7b4546722bc0" containerID="d456d881efbe9233c8f107429e84627bb9dcd4e3ef7b514398c84a9bdb32ea8a" exitCode=0 Dec 05 21:21:53 crc kubenswrapper[4904]: I1205 21:21:53.269789 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phjwj" event={"ID":"fc67c0ee-2e6a-4796-8fdd-7b4546722bc0","Type":"ContainerDied","Data":"d456d881efbe9233c8f107429e84627bb9dcd4e3ef7b514398c84a9bdb32ea8a"} Dec 05 21:21:53 crc kubenswrapper[4904]: I1205 21:21:53.610766 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-cvs4t" podUID="10b0f4cc-50bd-4003-8359-c1be784152ea" containerName="registry-server" probeResult="failure" output=< Dec 05 21:21:53 crc kubenswrapper[4904]: timeout: failed to connect service ":50051" within 1s Dec 05 21:21:53 crc kubenswrapper[4904]: > Dec 05 21:21:54 crc kubenswrapper[4904]: I1205 21:21:54.282623 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phjwj" event={"ID":"fc67c0ee-2e6a-4796-8fdd-7b4546722bc0","Type":"ContainerStarted","Data":"8541b5de6bb06a75233fb4979d4886b30e218e2619eaa96f2a200f3b14ec6450"} Dec 05 21:21:54 crc kubenswrapper[4904]: I1205 21:21:54.309083 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-phjwj" podStartSLOduration=3.84554136 podStartE2EDuration="10.309044102s" podCreationTimestamp="2025-12-05 21:21:44 +0000 UTC" firstStartedPulling="2025-12-05 21:21:47.190654592 +0000 UTC m=+4206.001870711" lastFinishedPulling="2025-12-05 21:21:53.654157314 +0000 UTC m=+4212.465373453" observedRunningTime="2025-12-05 21:21:54.30713164 +0000 UTC m=+4213.118347749" watchObservedRunningTime="2025-12-05 21:21:54.309044102 +0000 UTC m=+4213.120260211" Dec 05 21:21:54 crc kubenswrapper[4904]: I1205 21:21:54.761220 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fxcpk" Dec 05 21:21:54 crc kubenswrapper[4904]: I1205 21:21:54.761329 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fxcpk" Dec 05 21:21:54 crc kubenswrapper[4904]: I1205 21:21:54.825548 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fxcpk" Dec 05 21:21:54 crc kubenswrapper[4904]: I1205 21:21:54.998586 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-phjwj" Dec 05 21:21:54 crc kubenswrapper[4904]: I1205 21:21:54.998647 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-phjwj" Dec 05 21:21:55 crc kubenswrapper[4904]: I1205 21:21:55.336176 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fxcpk" Dec 05 21:21:56 crc kubenswrapper[4904]: I1205 21:21:56.052755 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-phjwj" podUID="fc67c0ee-2e6a-4796-8fdd-7b4546722bc0" containerName="registry-server" probeResult="failure" output=< Dec 05 21:21:56 crc kubenswrapper[4904]: timeout: failed to connect service ":50051" within 1s Dec 05 21:21:56 crc kubenswrapper[4904]: > Dec 05 21:21:56 crc kubenswrapper[4904]: I1205 21:21:56.212034 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fxcpk"] Dec 05 21:21:57 crc kubenswrapper[4904]: I1205 21:21:57.312424 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fxcpk" podUID="01fb03e7-8227-4818-91ab-a520350b5912" containerName="registry-server" containerID="cri-o://b3f47171927a711c4a7cbed101546a614941d917ee880869ab0e6faa8432c3b2" gracePeriod=2 Dec 05 21:21:57 crc kubenswrapper[4904]: I1205 21:21:57.889656 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fxcpk" Dec 05 21:21:57 crc kubenswrapper[4904]: I1205 21:21:57.976947 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01fb03e7-8227-4818-91ab-a520350b5912-utilities\") pod \"01fb03e7-8227-4818-91ab-a520350b5912\" (UID: \"01fb03e7-8227-4818-91ab-a520350b5912\") " Dec 05 21:21:57 crc kubenswrapper[4904]: I1205 21:21:57.977226 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01fb03e7-8227-4818-91ab-a520350b5912-catalog-content\") pod \"01fb03e7-8227-4818-91ab-a520350b5912\" (UID: \"01fb03e7-8227-4818-91ab-a520350b5912\") " Dec 05 21:21:57 crc kubenswrapper[4904]: I1205 21:21:57.977259 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spr52\" (UniqueName: \"kubernetes.io/projected/01fb03e7-8227-4818-91ab-a520350b5912-kube-api-access-spr52\") pod \"01fb03e7-8227-4818-91ab-a520350b5912\" (UID: \"01fb03e7-8227-4818-91ab-a520350b5912\") " Dec 05 21:21:57 crc kubenswrapper[4904]: I1205 21:21:57.978338 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01fb03e7-8227-4818-91ab-a520350b5912-utilities" (OuterVolumeSpecName: "utilities") pod "01fb03e7-8227-4818-91ab-a520350b5912" (UID: "01fb03e7-8227-4818-91ab-a520350b5912"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:21:57 crc kubenswrapper[4904]: I1205 21:21:57.985550 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01fb03e7-8227-4818-91ab-a520350b5912-kube-api-access-spr52" (OuterVolumeSpecName: "kube-api-access-spr52") pod "01fb03e7-8227-4818-91ab-a520350b5912" (UID: "01fb03e7-8227-4818-91ab-a520350b5912"). InnerVolumeSpecName "kube-api-access-spr52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:21:58 crc kubenswrapper[4904]: I1205 21:21:58.002925 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01fb03e7-8227-4818-91ab-a520350b5912-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "01fb03e7-8227-4818-91ab-a520350b5912" (UID: "01fb03e7-8227-4818-91ab-a520350b5912"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:21:58 crc kubenswrapper[4904]: I1205 21:21:58.079956 4904 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01fb03e7-8227-4818-91ab-a520350b5912-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 21:21:58 crc kubenswrapper[4904]: I1205 21:21:58.079996 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spr52\" (UniqueName: \"kubernetes.io/projected/01fb03e7-8227-4818-91ab-a520350b5912-kube-api-access-spr52\") on node \"crc\" DevicePath \"\"" Dec 05 21:21:58 crc kubenswrapper[4904]: I1205 21:21:58.080007 4904 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01fb03e7-8227-4818-91ab-a520350b5912-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 21:21:58 crc kubenswrapper[4904]: I1205 21:21:58.323516 4904 generic.go:334] "Generic (PLEG): container finished" podID="01fb03e7-8227-4818-91ab-a520350b5912" containerID="b3f47171927a711c4a7cbed101546a614941d917ee880869ab0e6faa8432c3b2" exitCode=0 Dec 05 21:21:58 crc kubenswrapper[4904]: I1205 21:21:58.323571 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fxcpk" event={"ID":"01fb03e7-8227-4818-91ab-a520350b5912","Type":"ContainerDied","Data":"b3f47171927a711c4a7cbed101546a614941d917ee880869ab0e6faa8432c3b2"} Dec 05 21:21:58 crc kubenswrapper[4904]: I1205 21:21:58.323607 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fxcpk" event={"ID":"01fb03e7-8227-4818-91ab-a520350b5912","Type":"ContainerDied","Data":"a5aa61ceb3ac0669605dc9bc66c85dbbb2f85bb89215320eeb6aae48f636f1cf"} Dec 05 21:21:58 crc kubenswrapper[4904]: I1205 21:21:58.323632 4904 scope.go:117] "RemoveContainer" containerID="b3f47171927a711c4a7cbed101546a614941d917ee880869ab0e6faa8432c3b2" Dec 05 21:21:58 crc kubenswrapper[4904]: I1205 21:21:58.323724 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fxcpk" Dec 05 21:21:58 crc kubenswrapper[4904]: I1205 21:21:58.349356 4904 scope.go:117] "RemoveContainer" containerID="8b7ea32a850dbd9d24f41b03ffb2c6cf35a5d535d1659b9615432be4f6dd0eb4" Dec 05 21:21:58 crc kubenswrapper[4904]: I1205 21:21:58.369029 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fxcpk"] Dec 05 21:21:58 crc kubenswrapper[4904]: I1205 21:21:58.378849 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fxcpk"] Dec 05 21:21:58 crc kubenswrapper[4904]: I1205 21:21:58.396267 4904 scope.go:117] "RemoveContainer" containerID="bd614b3df2dc9be21903e3a09cf5498d8f4875bcd5d5469f67576aa2e43d0892" Dec 05 21:21:58 crc kubenswrapper[4904]: I1205 21:21:58.453903 4904 scope.go:117] "RemoveContainer" containerID="b3f47171927a711c4a7cbed101546a614941d917ee880869ab0e6faa8432c3b2" Dec 05 21:21:58 crc kubenswrapper[4904]: E1205 21:21:58.458546 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3f47171927a711c4a7cbed101546a614941d917ee880869ab0e6faa8432c3b2\": container with ID starting with b3f47171927a711c4a7cbed101546a614941d917ee880869ab0e6faa8432c3b2 not found: ID does not exist" containerID="b3f47171927a711c4a7cbed101546a614941d917ee880869ab0e6faa8432c3b2" Dec 05 21:21:58 crc kubenswrapper[4904]: I1205 21:21:58.458588 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3f47171927a711c4a7cbed101546a614941d917ee880869ab0e6faa8432c3b2"} err="failed to get container status \"b3f47171927a711c4a7cbed101546a614941d917ee880869ab0e6faa8432c3b2\": rpc error: code = NotFound desc = could not find container \"b3f47171927a711c4a7cbed101546a614941d917ee880869ab0e6faa8432c3b2\": container with ID starting with b3f47171927a711c4a7cbed101546a614941d917ee880869ab0e6faa8432c3b2 not found: ID does not exist" Dec 05 21:21:58 crc kubenswrapper[4904]: I1205 21:21:58.458621 4904 scope.go:117] "RemoveContainer" containerID="8b7ea32a850dbd9d24f41b03ffb2c6cf35a5d535d1659b9615432be4f6dd0eb4" Dec 05 21:21:58 crc kubenswrapper[4904]: E1205 21:21:58.459055 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b7ea32a850dbd9d24f41b03ffb2c6cf35a5d535d1659b9615432be4f6dd0eb4\": container with ID starting with 8b7ea32a850dbd9d24f41b03ffb2c6cf35a5d535d1659b9615432be4f6dd0eb4 not found: ID does not exist" containerID="8b7ea32a850dbd9d24f41b03ffb2c6cf35a5d535d1659b9615432be4f6dd0eb4" Dec 05 21:21:58 crc kubenswrapper[4904]: I1205 21:21:58.459120 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b7ea32a850dbd9d24f41b03ffb2c6cf35a5d535d1659b9615432be4f6dd0eb4"} err="failed to get container status \"8b7ea32a850dbd9d24f41b03ffb2c6cf35a5d535d1659b9615432be4f6dd0eb4\": rpc error: code = NotFound desc = could not find container \"8b7ea32a850dbd9d24f41b03ffb2c6cf35a5d535d1659b9615432be4f6dd0eb4\": container with ID starting with 8b7ea32a850dbd9d24f41b03ffb2c6cf35a5d535d1659b9615432be4f6dd0eb4 not found: ID does not exist" Dec 05 21:21:58 crc kubenswrapper[4904]: I1205 21:21:58.459153 4904 scope.go:117] "RemoveContainer" containerID="bd614b3df2dc9be21903e3a09cf5498d8f4875bcd5d5469f67576aa2e43d0892" Dec 05 21:21:58 crc kubenswrapper[4904]: E1205 21:21:58.459744 4904 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"bd614b3df2dc9be21903e3a09cf5498d8f4875bcd5d5469f67576aa2e43d0892\": container with ID starting with bd614b3df2dc9be21903e3a09cf5498d8f4875bcd5d5469f67576aa2e43d0892 not found: ID does not exist" containerID="bd614b3df2dc9be21903e3a09cf5498d8f4875bcd5d5469f67576aa2e43d0892" Dec 05 21:21:58 crc kubenswrapper[4904]: I1205 21:21:58.459786 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd614b3df2dc9be21903e3a09cf5498d8f4875bcd5d5469f67576aa2e43d0892"} err="failed to get container status \"bd614b3df2dc9be21903e3a09cf5498d8f4875bcd5d5469f67576aa2e43d0892\": rpc error: code = NotFound desc = could not find container \"bd614b3df2dc9be21903e3a09cf5498d8f4875bcd5d5469f67576aa2e43d0892\": container with ID starting with bd614b3df2dc9be21903e3a09cf5498d8f4875bcd5d5469f67576aa2e43d0892 not found: ID does not exist" Dec 05 21:21:59 crc kubenswrapper[4904]: I1205 21:21:59.694279 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01fb03e7-8227-4818-91ab-a520350b5912" path="/var/lib/kubelet/pods/01fb03e7-8227-4818-91ab-a520350b5912/volumes" Dec 05 21:22:03 crc kubenswrapper[4904]: I1205 21:22:03.065763 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cvs4t" Dec 05 21:22:03 crc kubenswrapper[4904]: I1205 21:22:03.137392 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cvs4t" Dec 05 21:22:03 crc kubenswrapper[4904]: I1205 21:22:03.307703 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cvs4t"] Dec 05 21:22:04 crc kubenswrapper[4904]: I1205 21:22:04.379988 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cvs4t" podUID="10b0f4cc-50bd-4003-8359-c1be784152ea" containerName="registry-server" containerID="cri-o://76a7af83be2ed0adac9688d2ea3fee4eabbcd9517538b908835b00cc2e07daee" gracePeriod=2 Dec 05 21:22:04 crc kubenswrapper[4904]: I1205 21:22:04.870578 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cvs4t" Dec 05 21:22:05 crc kubenswrapper[4904]: I1205 21:22:05.039388 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10b0f4cc-50bd-4003-8359-c1be784152ea-utilities\") pod \"10b0f4cc-50bd-4003-8359-c1be784152ea\" (UID: \"10b0f4cc-50bd-4003-8359-c1be784152ea\") " Dec 05 21:22:05 crc kubenswrapper[4904]: I1205 21:22:05.039534 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lflw5\" (UniqueName: \"kubernetes.io/projected/10b0f4cc-50bd-4003-8359-c1be784152ea-kube-api-access-lflw5\") pod \"10b0f4cc-50bd-4003-8359-c1be784152ea\" (UID: \"10b0f4cc-50bd-4003-8359-c1be784152ea\") " Dec 05 21:22:05 crc kubenswrapper[4904]: I1205 21:22:05.039585 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10b0f4cc-50bd-4003-8359-c1be784152ea-catalog-content\") pod \"10b0f4cc-50bd-4003-8359-c1be784152ea\" (UID: \"10b0f4cc-50bd-4003-8359-c1be784152ea\") " Dec 05 21:22:05 crc kubenswrapper[4904]: I1205 21:22:05.040170 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10b0f4cc-50bd-4003-8359-c1be784152ea-utilities" (OuterVolumeSpecName: "utilities") pod "10b0f4cc-50bd-4003-8359-c1be784152ea" (UID: "10b0f4cc-50bd-4003-8359-c1be784152ea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:22:05 crc kubenswrapper[4904]: I1205 21:22:05.052677 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10b0f4cc-50bd-4003-8359-c1be784152ea-kube-api-access-lflw5" (OuterVolumeSpecName: "kube-api-access-lflw5") pod "10b0f4cc-50bd-4003-8359-c1be784152ea" (UID: "10b0f4cc-50bd-4003-8359-c1be784152ea"). InnerVolumeSpecName "kube-api-access-lflw5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:22:05 crc kubenswrapper[4904]: I1205 21:22:05.054257 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-phjwj" Dec 05 21:22:05 crc kubenswrapper[4904]: I1205 21:22:05.089656 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10b0f4cc-50bd-4003-8359-c1be784152ea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "10b0f4cc-50bd-4003-8359-c1be784152ea" (UID: "10b0f4cc-50bd-4003-8359-c1be784152ea"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:22:05 crc kubenswrapper[4904]: I1205 21:22:05.105608 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-phjwj" Dec 05 21:22:05 crc kubenswrapper[4904]: I1205 21:22:05.141598 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lflw5\" (UniqueName: \"kubernetes.io/projected/10b0f4cc-50bd-4003-8359-c1be784152ea-kube-api-access-lflw5\") on node \"crc\" DevicePath \"\"" Dec 05 21:22:05 crc kubenswrapper[4904]: I1205 21:22:05.141634 4904 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10b0f4cc-50bd-4003-8359-c1be784152ea-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 21:22:05 crc kubenswrapper[4904]: I1205 21:22:05.141643 4904 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10b0f4cc-50bd-4003-8359-c1be784152ea-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 21:22:05 crc kubenswrapper[4904]: I1205 21:22:05.394260 4904 generic.go:334] "Generic (PLEG): container finished" podID="10b0f4cc-50bd-4003-8359-c1be784152ea" containerID="76a7af83be2ed0adac9688d2ea3fee4eabbcd9517538b908835b00cc2e07daee" exitCode=0 Dec 05 21:22:05 crc kubenswrapper[4904]: I1205 21:22:05.394349 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cvs4t" event={"ID":"10b0f4cc-50bd-4003-8359-c1be784152ea","Type":"ContainerDied","Data":"76a7af83be2ed0adac9688d2ea3fee4eabbcd9517538b908835b00cc2e07daee"} Dec 05 21:22:05 crc kubenswrapper[4904]: I1205 21:22:05.394419 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cvs4t" event={"ID":"10b0f4cc-50bd-4003-8359-c1be784152ea","Type":"ContainerDied","Data":"80d94d6cb1d818e3a3ca7d61988599d829e4c2d341d72ef1dd38a8e2dcdbf5ae"} Dec 05 21:22:05 crc kubenswrapper[4904]: I1205 21:22:05.394447 4904 scope.go:117] "RemoveContainer" containerID="76a7af83be2ed0adac9688d2ea3fee4eabbcd9517538b908835b00cc2e07daee" Dec 05 21:22:05 crc kubenswrapper[4904]: I1205 21:22:05.395421 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cvs4t" Dec 05 21:22:05 crc kubenswrapper[4904]: I1205 21:22:05.417827 4904 scope.go:117] "RemoveContainer" containerID="2882e260a26914d86e0c4f728657e89f2802bec7c3da76d902d15e898a59e422" Dec 05 21:22:05 crc kubenswrapper[4904]: I1205 21:22:05.446447 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cvs4t"] Dec 05 21:22:05 crc kubenswrapper[4904]: I1205 21:22:05.457311 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cvs4t"] Dec 05 21:22:05 crc kubenswrapper[4904]: I1205 21:22:05.467893 4904 scope.go:117] "RemoveContainer" containerID="320188066e5ae545c59b3b9ca79ca0fdc05755c0cf72d079cc68176915c03bf5" Dec 05 21:22:05 crc kubenswrapper[4904]: I1205 21:22:05.543559 4904 scope.go:117] "RemoveContainer" containerID="76a7af83be2ed0adac9688d2ea3fee4eabbcd9517538b908835b00cc2e07daee" Dec 05 21:22:05 crc kubenswrapper[4904]: E1205 21:22:05.544755 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76a7af83be2ed0adac9688d2ea3fee4eabbcd9517538b908835b00cc2e07daee\": container with ID starting with 76a7af83be2ed0adac9688d2ea3fee4eabbcd9517538b908835b00cc2e07daee not found: ID does not exist" containerID="76a7af83be2ed0adac9688d2ea3fee4eabbcd9517538b908835b00cc2e07daee" Dec 05 21:22:05 crc kubenswrapper[4904]: I1205 21:22:05.544833 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76a7af83be2ed0adac9688d2ea3fee4eabbcd9517538b908835b00cc2e07daee"} err="failed to get container status \"76a7af83be2ed0adac9688d2ea3fee4eabbcd9517538b908835b00cc2e07daee\": rpc error: code = NotFound desc = could not find container \"76a7af83be2ed0adac9688d2ea3fee4eabbcd9517538b908835b00cc2e07daee\": container with ID starting with 76a7af83be2ed0adac9688d2ea3fee4eabbcd9517538b908835b00cc2e07daee not found: ID does not exist" Dec 05 21:22:05 crc kubenswrapper[4904]: I1205 21:22:05.544869 4904 scope.go:117] "RemoveContainer" containerID="2882e260a26914d86e0c4f728657e89f2802bec7c3da76d902d15e898a59e422" Dec 05 21:22:05 crc kubenswrapper[4904]: E1205 21:22:05.546752 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2882e260a26914d86e0c4f728657e89f2802bec7c3da76d902d15e898a59e422\": container with ID starting with 2882e260a26914d86e0c4f728657e89f2802bec7c3da76d902d15e898a59e422 not found: ID does not exist" containerID="2882e260a26914d86e0c4f728657e89f2802bec7c3da76d902d15e898a59e422" Dec 05 21:22:05 crc kubenswrapper[4904]: I1205 21:22:05.546790 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2882e260a26914d86e0c4f728657e89f2802bec7c3da76d902d15e898a59e422"} err="failed to get container status \"2882e260a26914d86e0c4f728657e89f2802bec7c3da76d902d15e898a59e422\": rpc error: code = NotFound desc = could not find container \"2882e260a26914d86e0c4f728657e89f2802bec7c3da76d902d15e898a59e422\": container with ID starting with 2882e260a26914d86e0c4f728657e89f2802bec7c3da76d902d15e898a59e422 not found: ID does not exist" Dec 05 21:22:05 crc kubenswrapper[4904]: I1205 21:22:05.546838 4904 scope.go:117] "RemoveContainer" containerID="320188066e5ae545c59b3b9ca79ca0fdc05755c0cf72d079cc68176915c03bf5" Dec 05 21:22:05 crc kubenswrapper[4904]: E1205 21:22:05.547419 4904 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"320188066e5ae545c59b3b9ca79ca0fdc05755c0cf72d079cc68176915c03bf5\": container with ID starting with 320188066e5ae545c59b3b9ca79ca0fdc05755c0cf72d079cc68176915c03bf5 not found: ID does not exist" containerID="320188066e5ae545c59b3b9ca79ca0fdc05755c0cf72d079cc68176915c03bf5" Dec 05 21:22:05 crc kubenswrapper[4904]: I1205 21:22:05.547448 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"320188066e5ae545c59b3b9ca79ca0fdc05755c0cf72d079cc68176915c03bf5"} err="failed to get container status \"320188066e5ae545c59b3b9ca79ca0fdc05755c0cf72d079cc68176915c03bf5\": rpc error: code = NotFound desc = could not find container \"320188066e5ae545c59b3b9ca79ca0fdc05755c0cf72d079cc68176915c03bf5\": container with ID starting with 320188066e5ae545c59b3b9ca79ca0fdc05755c0cf72d079cc68176915c03bf5 not found: ID does not exist" Dec 05 21:22:05 crc kubenswrapper[4904]: I1205 21:22:05.707835 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10b0f4cc-50bd-4003-8359-c1be784152ea" path="/var/lib/kubelet/pods/10b0f4cc-50bd-4003-8359-c1be784152ea/volumes" Dec 05 21:22:07 crc kubenswrapper[4904]: I1205 21:22:07.308548 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-phjwj"] Dec 05 21:22:07 crc kubenswrapper[4904]: I1205 21:22:07.309147 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-phjwj" podUID="fc67c0ee-2e6a-4796-8fdd-7b4546722bc0" containerName="registry-server" containerID="cri-o://8541b5de6bb06a75233fb4979d4886b30e218e2619eaa96f2a200f3b14ec6450" gracePeriod=2 Dec 05 21:22:07 crc kubenswrapper[4904]: I1205 21:22:07.844326 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-phjwj" Dec 05 21:22:08 crc kubenswrapper[4904]: I1205 21:22:08.003351 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6pz8\" (UniqueName: \"kubernetes.io/projected/fc67c0ee-2e6a-4796-8fdd-7b4546722bc0-kube-api-access-v6pz8\") pod \"fc67c0ee-2e6a-4796-8fdd-7b4546722bc0\" (UID: \"fc67c0ee-2e6a-4796-8fdd-7b4546722bc0\") " Dec 05 21:22:08 crc kubenswrapper[4904]: I1205 21:22:08.003558 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc67c0ee-2e6a-4796-8fdd-7b4546722bc0-catalog-content\") pod \"fc67c0ee-2e6a-4796-8fdd-7b4546722bc0\" (UID: \"fc67c0ee-2e6a-4796-8fdd-7b4546722bc0\") " Dec 05 21:22:08 crc kubenswrapper[4904]: I1205 21:22:08.003600 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc67c0ee-2e6a-4796-8fdd-7b4546722bc0-utilities\") pod \"fc67c0ee-2e6a-4796-8fdd-7b4546722bc0\" (UID: \"fc67c0ee-2e6a-4796-8fdd-7b4546722bc0\") " Dec 05 21:22:08 crc kubenswrapper[4904]: I1205 21:22:08.004265 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc67c0ee-2e6a-4796-8fdd-7b4546722bc0-utilities" (OuterVolumeSpecName: "utilities") pod "fc67c0ee-2e6a-4796-8fdd-7b4546722bc0" (UID: "fc67c0ee-2e6a-4796-8fdd-7b4546722bc0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:22:08 crc kubenswrapper[4904]: I1205 21:22:08.011462 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc67c0ee-2e6a-4796-8fdd-7b4546722bc0-kube-api-access-v6pz8" (OuterVolumeSpecName: "kube-api-access-v6pz8") pod "fc67c0ee-2e6a-4796-8fdd-7b4546722bc0" (UID: "fc67c0ee-2e6a-4796-8fdd-7b4546722bc0"). InnerVolumeSpecName "kube-api-access-v6pz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:22:08 crc kubenswrapper[4904]: I1205 21:22:08.106397 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6pz8\" (UniqueName: \"kubernetes.io/projected/fc67c0ee-2e6a-4796-8fdd-7b4546722bc0-kube-api-access-v6pz8\") on node \"crc\" DevicePath \"\"" Dec 05 21:22:08 crc kubenswrapper[4904]: I1205 21:22:08.106431 4904 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc67c0ee-2e6a-4796-8fdd-7b4546722bc0-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 21:22:08 crc kubenswrapper[4904]: I1205 21:22:08.114487 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc67c0ee-2e6a-4796-8fdd-7b4546722bc0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc67c0ee-2e6a-4796-8fdd-7b4546722bc0" (UID: "fc67c0ee-2e6a-4796-8fdd-7b4546722bc0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:22:08 crc kubenswrapper[4904]: I1205 21:22:08.208631 4904 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc67c0ee-2e6a-4796-8fdd-7b4546722bc0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 21:22:08 crc kubenswrapper[4904]: I1205 21:22:08.461177 4904 generic.go:334] "Generic (PLEG): container finished" podID="fc67c0ee-2e6a-4796-8fdd-7b4546722bc0" containerID="8541b5de6bb06a75233fb4979d4886b30e218e2619eaa96f2a200f3b14ec6450" exitCode=0 Dec 05 21:22:08 crc kubenswrapper[4904]: I1205 21:22:08.461305 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phjwj" event={"ID":"fc67c0ee-2e6a-4796-8fdd-7b4546722bc0","Type":"ContainerDied","Data":"8541b5de6bb06a75233fb4979d4886b30e218e2619eaa96f2a200f3b14ec6450"} Dec 05 21:22:08 crc kubenswrapper[4904]: I1205 21:22:08.461714 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phjwj" event={"ID":"fc67c0ee-2e6a-4796-8fdd-7b4546722bc0","Type":"ContainerDied","Data":"fa52fbd11f68fdfd5ae869c35adffefb7620b4cb38236dbd88c7958186c6be2f"} Dec 05 21:22:08 crc kubenswrapper[4904]: I1205 21:22:08.461742 4904 scope.go:117] "RemoveContainer" containerID="8541b5de6bb06a75233fb4979d4886b30e218e2619eaa96f2a200f3b14ec6450" Dec 05 21:22:08 crc kubenswrapper[4904]: I1205 21:22:08.461372 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-phjwj" Dec 05 21:22:08 crc kubenswrapper[4904]: I1205 21:22:08.497421 4904 scope.go:117] "RemoveContainer" containerID="d456d881efbe9233c8f107429e84627bb9dcd4e3ef7b514398c84a9bdb32ea8a" Dec 05 21:22:08 crc kubenswrapper[4904]: I1205 21:22:08.508686 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-phjwj"] Dec 05 21:22:08 crc kubenswrapper[4904]: I1205 21:22:08.515997 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-phjwj"] Dec 05 21:22:08 crc kubenswrapper[4904]: I1205 21:22:08.526319 4904 scope.go:117] "RemoveContainer" containerID="656c4829a90f0a8f247a32cf24113b8efc691941e58ea9fb2855e5cb14d7b955" Dec 05 21:22:08 crc kubenswrapper[4904]: I1205 21:22:08.571219 4904 scope.go:117] "RemoveContainer" containerID="8541b5de6bb06a75233fb4979d4886b30e218e2619eaa96f2a200f3b14ec6450" Dec 05 21:22:08 crc kubenswrapper[4904]: E1205 21:22:08.571823 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8541b5de6bb06a75233fb4979d4886b30e218e2619eaa96f2a200f3b14ec6450\": container with ID starting with 8541b5de6bb06a75233fb4979d4886b30e218e2619eaa96f2a200f3b14ec6450 not found: ID does not exist" containerID="8541b5de6bb06a75233fb4979d4886b30e218e2619eaa96f2a200f3b14ec6450" Dec 05 21:22:08 crc kubenswrapper[4904]: I1205 21:22:08.571867 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8541b5de6bb06a75233fb4979d4886b30e218e2619eaa96f2a200f3b14ec6450"} err="failed to get container status \"8541b5de6bb06a75233fb4979d4886b30e218e2619eaa96f2a200f3b14ec6450\": rpc error: code = NotFound desc = could not find container \"8541b5de6bb06a75233fb4979d4886b30e218e2619eaa96f2a200f3b14ec6450\": container with ID starting with 8541b5de6bb06a75233fb4979d4886b30e218e2619eaa96f2a200f3b14ec6450 not found: ID does not exist" Dec 05 21:22:08 crc kubenswrapper[4904]: I1205 21:22:08.571894 4904 scope.go:117] "RemoveContainer" containerID="d456d881efbe9233c8f107429e84627bb9dcd4e3ef7b514398c84a9bdb32ea8a" Dec 05 21:22:08 crc kubenswrapper[4904]: E1205 21:22:08.572214 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d456d881efbe9233c8f107429e84627bb9dcd4e3ef7b514398c84a9bdb32ea8a\": container with ID starting with d456d881efbe9233c8f107429e84627bb9dcd4e3ef7b514398c84a9bdb32ea8a not found: ID does not exist" containerID="d456d881efbe9233c8f107429e84627bb9dcd4e3ef7b514398c84a9bdb32ea8a" Dec 05 21:22:08 crc kubenswrapper[4904]: I1205 21:22:08.572239 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d456d881efbe9233c8f107429e84627bb9dcd4e3ef7b514398c84a9bdb32ea8a"} err="failed to get container status \"d456d881efbe9233c8f107429e84627bb9dcd4e3ef7b514398c84a9bdb32ea8a\": rpc error: code = NotFound desc = could not find container \"d456d881efbe9233c8f107429e84627bb9dcd4e3ef7b514398c84a9bdb32ea8a\": container with ID starting with d456d881efbe9233c8f107429e84627bb9dcd4e3ef7b514398c84a9bdb32ea8a not found: ID does not exist" Dec 05 21:22:08 crc kubenswrapper[4904]: I1205 21:22:08.572253 4904 scope.go:117] "RemoveContainer" containerID="656c4829a90f0a8f247a32cf24113b8efc691941e58ea9fb2855e5cb14d7b955" Dec 05 21:22:08 crc kubenswrapper[4904]: E1205 21:22:08.572753 4904 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"656c4829a90f0a8f247a32cf24113b8efc691941e58ea9fb2855e5cb14d7b955\": container with ID starting with 656c4829a90f0a8f247a32cf24113b8efc691941e58ea9fb2855e5cb14d7b955 not found: ID does not exist" containerID="656c4829a90f0a8f247a32cf24113b8efc691941e58ea9fb2855e5cb14d7b955" Dec 05 21:22:08 crc kubenswrapper[4904]: I1205 21:22:08.572794 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"656c4829a90f0a8f247a32cf24113b8efc691941e58ea9fb2855e5cb14d7b955"} err="failed to get container status \"656c4829a90f0a8f247a32cf24113b8efc691941e58ea9fb2855e5cb14d7b955\": rpc error: code = NotFound desc = could not find container \"656c4829a90f0a8f247a32cf24113b8efc691941e58ea9fb2855e5cb14d7b955\": container with ID starting with 656c4829a90f0a8f247a32cf24113b8efc691941e58ea9fb2855e5cb14d7b955 not found: ID does not exist" Dec 05 21:22:09 crc kubenswrapper[4904]: I1205 21:22:09.693725 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc67c0ee-2e6a-4796-8fdd-7b4546722bc0" path="/var/lib/kubelet/pods/fc67c0ee-2e6a-4796-8fdd-7b4546722bc0/volumes" Dec 05 21:23:29 crc kubenswrapper[4904]: I1205 21:23:29.955778 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:23:29 crc kubenswrapper[4904]: I1205 21:23:29.958364 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:23:59 crc kubenswrapper[4904]: I1205 21:23:59.955309 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:23:59 crc kubenswrapper[4904]: I1205 21:23:59.955862 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:24:29 crc kubenswrapper[4904]: I1205 21:24:29.955719 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:24:29 crc kubenswrapper[4904]: I1205 21:24:29.956386 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:24:29 crc kubenswrapper[4904]: I1205 21:24:29.956467 4904 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" Dec 05 21:24:29 crc kubenswrapper[4904]: I1205 21:24:29.957642 4904 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d62c227a92a254f712e9ea222dc26382d4779b1f84435e9ea09a651adb331e30"} pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 21:24:29 crc kubenswrapper[4904]: I1205 21:24:29.957804 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" containerID="cri-o://d62c227a92a254f712e9ea222dc26382d4779b1f84435e9ea09a651adb331e30" gracePeriod=600 Dec 05 21:24:30 crc kubenswrapper[4904]: I1205 21:24:30.963702 4904 generic.go:334] "Generic (PLEG): container finished" podID="1cc24b64-e25f-4b55-9123-295388685e7a" containerID="d62c227a92a254f712e9ea222dc26382d4779b1f84435e9ea09a651adb331e30" exitCode=0 Dec 05 21:24:30 crc kubenswrapper[4904]: I1205 21:24:30.964251 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" event={"ID":"1cc24b64-e25f-4b55-9123-295388685e7a","Type":"ContainerDied","Data":"d62c227a92a254f712e9ea222dc26382d4779b1f84435e9ea09a651adb331e30"} Dec 05 21:24:30 crc kubenswrapper[4904]: I1205 21:24:30.964356 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" event={"ID":"1cc24b64-e25f-4b55-9123-295388685e7a","Type":"ContainerStarted","Data":"94e67afcde674d544e475a7bc0d8ae7f1ed2a8f2366106ad1f5c07bab7395f4a"} Dec 05 21:24:30 crc kubenswrapper[4904]: I1205 21:24:30.964391 4904 scope.go:117] "RemoveContainer" containerID="4cb7053f25cc304645f900ef847d7694f582dfca6af71129c7a4baea97eb993a" Dec 05 21:25:24 crc kubenswrapper[4904]: I1205 21:25:24.937693 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-49bxv"] Dec 05 21:25:24 crc kubenswrapper[4904]: E1205 21:25:24.939666 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc67c0ee-2e6a-4796-8fdd-7b4546722bc0" containerName="extract-utilities" Dec 05 21:25:24 crc kubenswrapper[4904]: I1205 21:25:24.939687 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc67c0ee-2e6a-4796-8fdd-7b4546722bc0" containerName="extract-utilities" Dec 05 21:25:24 crc kubenswrapper[4904]: E1205 21:25:24.939704 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc67c0ee-2e6a-4796-8fdd-7b4546722bc0" containerName="extract-content" Dec 05 21:25:24 crc kubenswrapper[4904]: I1205 21:25:24.939713 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc67c0ee-2e6a-4796-8fdd-7b4546722bc0" containerName="extract-content" Dec 05 21:25:24 crc kubenswrapper[4904]: E1205 21:25:24.939726 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10b0f4cc-50bd-4003-8359-c1be784152ea" containerName="extract-content" Dec 05 21:25:24 crc kubenswrapper[4904]: I1205 21:25:24.939734 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="10b0f4cc-50bd-4003-8359-c1be784152ea" containerName="extract-content" Dec 05 21:25:24 crc kubenswrapper[4904]: E1205 21:25:24.939749 4904 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="10b0f4cc-50bd-4003-8359-c1be784152ea" containerName="registry-server" Dec 05 21:25:24 crc kubenswrapper[4904]: I1205 21:25:24.939756 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="10b0f4cc-50bd-4003-8359-c1be784152ea" containerName="registry-server" Dec 05 21:25:24 crc kubenswrapper[4904]: E1205 21:25:24.939780 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01fb03e7-8227-4818-91ab-a520350b5912" containerName="extract-utilities" Dec 05 21:25:24 crc kubenswrapper[4904]: I1205 21:25:24.939788 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="01fb03e7-8227-4818-91ab-a520350b5912" containerName="extract-utilities" Dec 05 21:25:24 crc kubenswrapper[4904]: E1205 21:25:24.939802 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01fb03e7-8227-4818-91ab-a520350b5912" containerName="registry-server" Dec 05 21:25:24 crc kubenswrapper[4904]: I1205 21:25:24.939809 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="01fb03e7-8227-4818-91ab-a520350b5912" containerName="registry-server" Dec 05 21:25:24 crc kubenswrapper[4904]: E1205 21:25:24.939824 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10b0f4cc-50bd-4003-8359-c1be784152ea" containerName="extract-utilities" Dec 05 21:25:24 crc kubenswrapper[4904]: I1205 21:25:24.939831 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="10b0f4cc-50bd-4003-8359-c1be784152ea" containerName="extract-utilities" Dec 05 21:25:24 crc kubenswrapper[4904]: E1205 21:25:24.939849 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc67c0ee-2e6a-4796-8fdd-7b4546722bc0" containerName="registry-server" Dec 05 21:25:24 crc kubenswrapper[4904]: I1205 21:25:24.939857 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc67c0ee-2e6a-4796-8fdd-7b4546722bc0" containerName="registry-server" Dec 05 21:25:24 crc kubenswrapper[4904]: E1205 21:25:24.939883 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01fb03e7-8227-4818-91ab-a520350b5912" containerName="extract-content" Dec 05 21:25:24 crc kubenswrapper[4904]: I1205 21:25:24.939890 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="01fb03e7-8227-4818-91ab-a520350b5912" containerName="extract-content" Dec 05 21:25:24 crc kubenswrapper[4904]: I1205 21:25:24.940181 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc67c0ee-2e6a-4796-8fdd-7b4546722bc0" containerName="registry-server" Dec 05 21:25:24 crc kubenswrapper[4904]: I1205 21:25:24.940205 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="01fb03e7-8227-4818-91ab-a520350b5912" containerName="registry-server" Dec 05 21:25:24 crc kubenswrapper[4904]: I1205 21:25:24.940222 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="10b0f4cc-50bd-4003-8359-c1be784152ea" containerName="registry-server" Dec 05 21:25:24 crc kubenswrapper[4904]: I1205 21:25:24.942928 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-49bxv" Dec 05 21:25:24 crc kubenswrapper[4904]: I1205 21:25:24.959816 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-49bxv"] Dec 05 21:25:25 crc kubenswrapper[4904]: I1205 21:25:25.026753 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwhzv\" (UniqueName: \"kubernetes.io/projected/848f18cd-a59f-4d19-8f22-da5bf50c41f5-kube-api-access-mwhzv\") pod \"certified-operators-49bxv\" (UID: \"848f18cd-a59f-4d19-8f22-da5bf50c41f5\") " pod="openshift-marketplace/certified-operators-49bxv" Dec 05 21:25:25 crc kubenswrapper[4904]: I1205 21:25:25.026921 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/848f18cd-a59f-4d19-8f22-da5bf50c41f5-utilities\") pod \"certified-operators-49bxv\" (UID: \"848f18cd-a59f-4d19-8f22-da5bf50c41f5\") " pod="openshift-marketplace/certified-operators-49bxv" Dec 05 21:25:25 crc kubenswrapper[4904]: I1205 21:25:25.027009 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/848f18cd-a59f-4d19-8f22-da5bf50c41f5-catalog-content\") pod \"certified-operators-49bxv\" (UID: \"848f18cd-a59f-4d19-8f22-da5bf50c41f5\") " pod="openshift-marketplace/certified-operators-49bxv" Dec 05 21:25:25 crc kubenswrapper[4904]: I1205 21:25:25.128428 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/848f18cd-a59f-4d19-8f22-da5bf50c41f5-catalog-content\") pod \"certified-operators-49bxv\" (UID: \"848f18cd-a59f-4d19-8f22-da5bf50c41f5\") " pod="openshift-marketplace/certified-operators-49bxv" Dec 05 21:25:25 crc kubenswrapper[4904]: I1205 21:25:25.128521 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwhzv\" (UniqueName: \"kubernetes.io/projected/848f18cd-a59f-4d19-8f22-da5bf50c41f5-kube-api-access-mwhzv\") pod \"certified-operators-49bxv\" (UID: \"848f18cd-a59f-4d19-8f22-da5bf50c41f5\") " pod="openshift-marketplace/certified-operators-49bxv" Dec 05 21:25:25 crc kubenswrapper[4904]: I1205 21:25:25.128640 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/848f18cd-a59f-4d19-8f22-da5bf50c41f5-utilities\") pod \"certified-operators-49bxv\" (UID: \"848f18cd-a59f-4d19-8f22-da5bf50c41f5\") " pod="openshift-marketplace/certified-operators-49bxv" Dec 05 21:25:25 crc kubenswrapper[4904]: I1205 21:25:25.129073 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/848f18cd-a59f-4d19-8f22-da5bf50c41f5-catalog-content\") pod \"certified-operators-49bxv\" (UID: \"848f18cd-a59f-4d19-8f22-da5bf50c41f5\") " pod="openshift-marketplace/certified-operators-49bxv" Dec 05 21:25:25 crc kubenswrapper[4904]: I1205 21:25:25.129180 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/848f18cd-a59f-4d19-8f22-da5bf50c41f5-utilities\") pod \"certified-operators-49bxv\" (UID: \"848f18cd-a59f-4d19-8f22-da5bf50c41f5\") " pod="openshift-marketplace/certified-operators-49bxv" Dec 05 21:25:25 crc kubenswrapper[4904]: I1205 21:25:25.157872 4904 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-mwhzv\" (UniqueName: \"kubernetes.io/projected/848f18cd-a59f-4d19-8f22-da5bf50c41f5-kube-api-access-mwhzv\") pod \"certified-operators-49bxv\" (UID: \"848f18cd-a59f-4d19-8f22-da5bf50c41f5\") " pod="openshift-marketplace/certified-operators-49bxv" Dec 05 21:25:25 crc kubenswrapper[4904]: I1205 21:25:25.290175 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-49bxv" Dec 05 21:25:25 crc kubenswrapper[4904]: I1205 21:25:25.823803 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-49bxv"] Dec 05 21:25:26 crc kubenswrapper[4904]: I1205 21:25:26.611047 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49bxv" event={"ID":"848f18cd-a59f-4d19-8f22-da5bf50c41f5","Type":"ContainerStarted","Data":"43da271868169c0667c93208869153fa2171a9ad12b1daebf9d129e89d6df82f"} Dec 05 21:25:26 crc kubenswrapper[4904]: E1205 21:25:26.854631 4904 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod848f18cd_a59f_4d19_8f22_da5bf50c41f5.slice/crio-663e7d529fc9679dead4641ff85e90ef8e58a2ed98b8fd838869b574519f2231.scope\": RecentStats: unable to find data in memory cache]" Dec 05 21:25:27 crc kubenswrapper[4904]: I1205 21:25:27.621513 4904 generic.go:334] "Generic (PLEG): container finished" podID="848f18cd-a59f-4d19-8f22-da5bf50c41f5" containerID="663e7d529fc9679dead4641ff85e90ef8e58a2ed98b8fd838869b574519f2231" exitCode=0 Dec 05 21:25:27 crc kubenswrapper[4904]: I1205 21:25:27.621555 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49bxv" event={"ID":"848f18cd-a59f-4d19-8f22-da5bf50c41f5","Type":"ContainerDied","Data":"663e7d529fc9679dead4641ff85e90ef8e58a2ed98b8fd838869b574519f2231"} Dec 05 21:25:28 crc kubenswrapper[4904]: I1205 21:25:28.638590 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49bxv" event={"ID":"848f18cd-a59f-4d19-8f22-da5bf50c41f5","Type":"ContainerStarted","Data":"0531013f291a46fb9fe1982ea2564b11860bbf5b0c29362363c6ed84b555908a"} Dec 05 21:25:29 crc kubenswrapper[4904]: I1205 21:25:29.648523 4904 generic.go:334] "Generic (PLEG): container finished" podID="848f18cd-a59f-4d19-8f22-da5bf50c41f5" containerID="0531013f291a46fb9fe1982ea2564b11860bbf5b0c29362363c6ed84b555908a" exitCode=0 Dec 05 21:25:29 crc kubenswrapper[4904]: I1205 21:25:29.648564 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49bxv" event={"ID":"848f18cd-a59f-4d19-8f22-da5bf50c41f5","Type":"ContainerDied","Data":"0531013f291a46fb9fe1982ea2564b11860bbf5b0c29362363c6ed84b555908a"} Dec 05 21:25:30 crc kubenswrapper[4904]: I1205 21:25:30.661499 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49bxv" event={"ID":"848f18cd-a59f-4d19-8f22-da5bf50c41f5","Type":"ContainerStarted","Data":"ffafe649166da0b2c1306748ed6b1a6cdbd9c8754a0589138e4388bdde5b8f6b"} Dec 05 21:25:30 crc kubenswrapper[4904]: I1205 21:25:30.687470 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-49bxv" podStartSLOduration=4.12428157 podStartE2EDuration="6.687416354s" podCreationTimestamp="2025-12-05 21:25:24 +0000 UTC" firstStartedPulling="2025-12-05 
21:25:27.62361384 +0000 UTC m=+4426.434829959" lastFinishedPulling="2025-12-05 21:25:30.186748634 +0000 UTC m=+4428.997964743" observedRunningTime="2025-12-05 21:25:30.679743695 +0000 UTC m=+4429.490959804" watchObservedRunningTime="2025-12-05 21:25:30.687416354 +0000 UTC m=+4429.498632463" Dec 05 21:25:35 crc kubenswrapper[4904]: I1205 21:25:35.291049 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-49bxv" Dec 05 21:25:35 crc kubenswrapper[4904]: I1205 21:25:35.291439 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-49bxv" Dec 05 21:25:35 crc kubenswrapper[4904]: I1205 21:25:35.440433 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-49bxv" Dec 05 21:25:35 crc kubenswrapper[4904]: I1205 21:25:35.760554 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-49bxv" Dec 05 21:25:35 crc kubenswrapper[4904]: I1205 21:25:35.808904 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-49bxv"] Dec 05 21:25:37 crc kubenswrapper[4904]: I1205 21:25:37.729740 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-49bxv" podUID="848f18cd-a59f-4d19-8f22-da5bf50c41f5" containerName="registry-server" containerID="cri-o://ffafe649166da0b2c1306748ed6b1a6cdbd9c8754a0589138e4388bdde5b8f6b" gracePeriod=2 Dec 05 21:25:38 crc kubenswrapper[4904]: I1205 21:25:38.330790 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-49bxv" Dec 05 21:25:38 crc kubenswrapper[4904]: I1205 21:25:38.373103 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwhzv\" (UniqueName: \"kubernetes.io/projected/848f18cd-a59f-4d19-8f22-da5bf50c41f5-kube-api-access-mwhzv\") pod \"848f18cd-a59f-4d19-8f22-da5bf50c41f5\" (UID: \"848f18cd-a59f-4d19-8f22-da5bf50c41f5\") " Dec 05 21:25:38 crc kubenswrapper[4904]: I1205 21:25:38.373188 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/848f18cd-a59f-4d19-8f22-da5bf50c41f5-catalog-content\") pod \"848f18cd-a59f-4d19-8f22-da5bf50c41f5\" (UID: \"848f18cd-a59f-4d19-8f22-da5bf50c41f5\") " Dec 05 21:25:38 crc kubenswrapper[4904]: I1205 21:25:38.373296 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/848f18cd-a59f-4d19-8f22-da5bf50c41f5-utilities\") pod \"848f18cd-a59f-4d19-8f22-da5bf50c41f5\" (UID: \"848f18cd-a59f-4d19-8f22-da5bf50c41f5\") " Dec 05 21:25:38 crc kubenswrapper[4904]: I1205 21:25:38.374795 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/848f18cd-a59f-4d19-8f22-da5bf50c41f5-utilities" (OuterVolumeSpecName: "utilities") pod "848f18cd-a59f-4d19-8f22-da5bf50c41f5" (UID: "848f18cd-a59f-4d19-8f22-da5bf50c41f5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:25:38 crc kubenswrapper[4904]: I1205 21:25:38.383831 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/848f18cd-a59f-4d19-8f22-da5bf50c41f5-kube-api-access-mwhzv" (OuterVolumeSpecName: "kube-api-access-mwhzv") pod "848f18cd-a59f-4d19-8f22-da5bf50c41f5" (UID: "848f18cd-a59f-4d19-8f22-da5bf50c41f5"). InnerVolumeSpecName "kube-api-access-mwhzv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:25:38 crc kubenswrapper[4904]: I1205 21:25:38.438425 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/848f18cd-a59f-4d19-8f22-da5bf50c41f5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "848f18cd-a59f-4d19-8f22-da5bf50c41f5" (UID: "848f18cd-a59f-4d19-8f22-da5bf50c41f5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:25:38 crc kubenswrapper[4904]: I1205 21:25:38.475767 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwhzv\" (UniqueName: \"kubernetes.io/projected/848f18cd-a59f-4d19-8f22-da5bf50c41f5-kube-api-access-mwhzv\") on node \"crc\" DevicePath \"\"" Dec 05 21:25:38 crc kubenswrapper[4904]: I1205 21:25:38.475824 4904 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/848f18cd-a59f-4d19-8f22-da5bf50c41f5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 21:25:38 crc kubenswrapper[4904]: I1205 21:25:38.475850 4904 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/848f18cd-a59f-4d19-8f22-da5bf50c41f5-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 21:25:38 crc kubenswrapper[4904]: I1205 21:25:38.747237 4904 generic.go:334] "Generic (PLEG): container finished" podID="848f18cd-a59f-4d19-8f22-da5bf50c41f5" containerID="ffafe649166da0b2c1306748ed6b1a6cdbd9c8754a0589138e4388bdde5b8f6b" exitCode=0 Dec 05 21:25:38 crc kubenswrapper[4904]: I1205 21:25:38.747327 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49bxv" event={"ID":"848f18cd-a59f-4d19-8f22-da5bf50c41f5","Type":"ContainerDied","Data":"ffafe649166da0b2c1306748ed6b1a6cdbd9c8754a0589138e4388bdde5b8f6b"} Dec 05 21:25:38 crc kubenswrapper[4904]: I1205 21:25:38.747563 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49bxv" event={"ID":"848f18cd-a59f-4d19-8f22-da5bf50c41f5","Type":"ContainerDied","Data":"43da271868169c0667c93208869153fa2171a9ad12b1daebf9d129e89d6df82f"} Dec 05 21:25:38 crc kubenswrapper[4904]: I1205 21:25:38.747624 4904 scope.go:117] "RemoveContainer" containerID="ffafe649166da0b2c1306748ed6b1a6cdbd9c8754a0589138e4388bdde5b8f6b" Dec 05 21:25:38 crc kubenswrapper[4904]: I1205 21:25:38.747387 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-49bxv" Dec 05 21:25:38 crc kubenswrapper[4904]: I1205 21:25:38.781885 4904 scope.go:117] "RemoveContainer" containerID="0531013f291a46fb9fe1982ea2564b11860bbf5b0c29362363c6ed84b555908a" Dec 05 21:25:38 crc kubenswrapper[4904]: I1205 21:25:38.790825 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-49bxv"] Dec 05 21:25:38 crc kubenswrapper[4904]: I1205 21:25:38.802680 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-49bxv"] Dec 05 21:25:38 crc kubenswrapper[4904]: I1205 21:25:38.819625 4904 scope.go:117] "RemoveContainer" containerID="663e7d529fc9679dead4641ff85e90ef8e58a2ed98b8fd838869b574519f2231" Dec 05 21:25:38 crc kubenswrapper[4904]: I1205 21:25:38.880617 4904 scope.go:117] "RemoveContainer" containerID="ffafe649166da0b2c1306748ed6b1a6cdbd9c8754a0589138e4388bdde5b8f6b" Dec 05 21:25:38 crc kubenswrapper[4904]: E1205 21:25:38.882370 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffafe649166da0b2c1306748ed6b1a6cdbd9c8754a0589138e4388bdde5b8f6b\": container with ID starting with ffafe649166da0b2c1306748ed6b1a6cdbd9c8754a0589138e4388bdde5b8f6b not found: ID does not exist" containerID="ffafe649166da0b2c1306748ed6b1a6cdbd9c8754a0589138e4388bdde5b8f6b" Dec 05 21:25:38 crc kubenswrapper[4904]: I1205 21:25:38.882406 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffafe649166da0b2c1306748ed6b1a6cdbd9c8754a0589138e4388bdde5b8f6b"} err="failed to get container status \"ffafe649166da0b2c1306748ed6b1a6cdbd9c8754a0589138e4388bdde5b8f6b\": rpc error: code = NotFound desc = could not find container \"ffafe649166da0b2c1306748ed6b1a6cdbd9c8754a0589138e4388bdde5b8f6b\": container with ID starting with ffafe649166da0b2c1306748ed6b1a6cdbd9c8754a0589138e4388bdde5b8f6b not found: ID does not exist" Dec 05 21:25:38 crc kubenswrapper[4904]: I1205 21:25:38.882433 4904 scope.go:117] "RemoveContainer" containerID="0531013f291a46fb9fe1982ea2564b11860bbf5b0c29362363c6ed84b555908a" Dec 05 21:25:38 crc kubenswrapper[4904]: E1205 21:25:38.884169 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0531013f291a46fb9fe1982ea2564b11860bbf5b0c29362363c6ed84b555908a\": container with ID starting with 0531013f291a46fb9fe1982ea2564b11860bbf5b0c29362363c6ed84b555908a not found: ID does not exist" containerID="0531013f291a46fb9fe1982ea2564b11860bbf5b0c29362363c6ed84b555908a" Dec 05 21:25:38 crc kubenswrapper[4904]: I1205 21:25:38.884416 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0531013f291a46fb9fe1982ea2564b11860bbf5b0c29362363c6ed84b555908a"} err="failed to get container status \"0531013f291a46fb9fe1982ea2564b11860bbf5b0c29362363c6ed84b555908a\": rpc error: code = NotFound desc = could not find container \"0531013f291a46fb9fe1982ea2564b11860bbf5b0c29362363c6ed84b555908a\": container with ID starting with 0531013f291a46fb9fe1982ea2564b11860bbf5b0c29362363c6ed84b555908a not found: ID does not exist" Dec 05 21:25:38 crc kubenswrapper[4904]: I1205 21:25:38.884451 4904 scope.go:117] "RemoveContainer" containerID="663e7d529fc9679dead4641ff85e90ef8e58a2ed98b8fd838869b574519f2231" Dec 05 21:25:38 crc kubenswrapper[4904]: E1205 21:25:38.887219 4904 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"663e7d529fc9679dead4641ff85e90ef8e58a2ed98b8fd838869b574519f2231\": container with ID starting with 663e7d529fc9679dead4641ff85e90ef8e58a2ed98b8fd838869b574519f2231 not found: ID does not exist" containerID="663e7d529fc9679dead4641ff85e90ef8e58a2ed98b8fd838869b574519f2231" Dec 05 21:25:38 crc kubenswrapper[4904]: I1205 21:25:38.887263 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"663e7d529fc9679dead4641ff85e90ef8e58a2ed98b8fd838869b574519f2231"} err="failed to get container status \"663e7d529fc9679dead4641ff85e90ef8e58a2ed98b8fd838869b574519f2231\": rpc error: code = NotFound desc = could not find container \"663e7d529fc9679dead4641ff85e90ef8e58a2ed98b8fd838869b574519f2231\": container with ID starting with 663e7d529fc9679dead4641ff85e90ef8e58a2ed98b8fd838869b574519f2231 not found: ID does not exist" Dec 05 21:25:39 crc kubenswrapper[4904]: I1205 21:25:39.692718 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="848f18cd-a59f-4d19-8f22-da5bf50c41f5" path="/var/lib/kubelet/pods/848f18cd-a59f-4d19-8f22-da5bf50c41f5/volumes" Dec 05 21:26:59 crc kubenswrapper[4904]: I1205 21:26:59.955769 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:26:59 crc kubenswrapper[4904]: I1205 21:26:59.956364 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:27:29 crc kubenswrapper[4904]: I1205 21:27:29.955387 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:27:29 crc kubenswrapper[4904]: I1205 21:27:29.955929 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:27:59 crc kubenswrapper[4904]: I1205 21:27:59.955466 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:27:59 crc kubenswrapper[4904]: I1205 21:27:59.957442 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:27:59 crc kubenswrapper[4904]: I1205 21:27:59.957535 4904 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" Dec 05 21:27:59 crc kubenswrapper[4904]: I1205 21:27:59.958774 4904 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"94e67afcde674d544e475a7bc0d8ae7f1ed2a8f2366106ad1f5c07bab7395f4a"} pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 21:27:59 crc kubenswrapper[4904]: I1205 21:27:59.958885 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" containerID="cri-o://94e67afcde674d544e475a7bc0d8ae7f1ed2a8f2366106ad1f5c07bab7395f4a" gracePeriod=600 Dec 05 21:28:00 crc kubenswrapper[4904]: E1205 21:28:00.812464 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:28:01 crc kubenswrapper[4904]: I1205 21:28:01.160719 4904 generic.go:334] "Generic (PLEG): container finished" podID="1cc24b64-e25f-4b55-9123-295388685e7a" containerID="94e67afcde674d544e475a7bc0d8ae7f1ed2a8f2366106ad1f5c07bab7395f4a" exitCode=0 Dec 05 21:28:01 crc kubenswrapper[4904]: I1205 21:28:01.161804 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" event={"ID":"1cc24b64-e25f-4b55-9123-295388685e7a","Type":"ContainerDied","Data":"94e67afcde674d544e475a7bc0d8ae7f1ed2a8f2366106ad1f5c07bab7395f4a"} Dec 05 21:28:01 crc kubenswrapper[4904]: I1205 21:28:01.161941 4904 scope.go:117] "RemoveContainer" containerID="d62c227a92a254f712e9ea222dc26382d4779b1f84435e9ea09a651adb331e30" Dec 05 21:28:01 crc kubenswrapper[4904]: I1205 21:28:01.162694 4904 scope.go:117] "RemoveContainer" containerID="94e67afcde674d544e475a7bc0d8ae7f1ed2a8f2366106ad1f5c07bab7395f4a" Dec 05 21:28:01 crc kubenswrapper[4904]: E1205 21:28:01.163092 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:28:14 crc kubenswrapper[4904]: I1205 21:28:14.685192 4904 scope.go:117] "RemoveContainer" containerID="94e67afcde674d544e475a7bc0d8ae7f1ed2a8f2366106ad1f5c07bab7395f4a" Dec 05 21:28:14 crc kubenswrapper[4904]: E1205 21:28:14.686309 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:28:27 crc 
kubenswrapper[4904]: I1205 21:28:27.681820 4904 scope.go:117] "RemoveContainer" containerID="94e67afcde674d544e475a7bc0d8ae7f1ed2a8f2366106ad1f5c07bab7395f4a" Dec 05 21:28:27 crc kubenswrapper[4904]: E1205 21:28:27.682789 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:28:39 crc kubenswrapper[4904]: I1205 21:28:39.681837 4904 scope.go:117] "RemoveContainer" containerID="94e67afcde674d544e475a7bc0d8ae7f1ed2a8f2366106ad1f5c07bab7395f4a" Dec 05 21:28:39 crc kubenswrapper[4904]: E1205 21:28:39.682813 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:28:50 crc kubenswrapper[4904]: I1205 21:28:50.682131 4904 scope.go:117] "RemoveContainer" containerID="94e67afcde674d544e475a7bc0d8ae7f1ed2a8f2366106ad1f5c07bab7395f4a" Dec 05 21:28:50 crc kubenswrapper[4904]: E1205 21:28:50.682968 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:29:03 crc kubenswrapper[4904]: I1205 21:29:03.684690 4904 scope.go:117] "RemoveContainer" containerID="94e67afcde674d544e475a7bc0d8ae7f1ed2a8f2366106ad1f5c07bab7395f4a" Dec 05 21:29:03 crc kubenswrapper[4904]: E1205 21:29:03.685275 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:29:14 crc kubenswrapper[4904]: I1205 21:29:14.682242 4904 scope.go:117] "RemoveContainer" containerID="94e67afcde674d544e475a7bc0d8ae7f1ed2a8f2366106ad1f5c07bab7395f4a" Dec 05 21:29:14 crc kubenswrapper[4904]: E1205 21:29:14.683439 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:29:27 crc kubenswrapper[4904]: I1205 21:29:27.682230 4904 scope.go:117] "RemoveContainer" containerID="94e67afcde674d544e475a7bc0d8ae7f1ed2a8f2366106ad1f5c07bab7395f4a" Dec 05 21:29:27 crc 
kubenswrapper[4904]: E1205 21:29:27.684326 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:29:38 crc kubenswrapper[4904]: I1205 21:29:38.682102 4904 scope.go:117] "RemoveContainer" containerID="94e67afcde674d544e475a7bc0d8ae7f1ed2a8f2366106ad1f5c07bab7395f4a" Dec 05 21:29:38 crc kubenswrapper[4904]: E1205 21:29:38.683398 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:29:52 crc kubenswrapper[4904]: I1205 21:29:52.681563 4904 scope.go:117] "RemoveContainer" containerID="94e67afcde674d544e475a7bc0d8ae7f1ed2a8f2366106ad1f5c07bab7395f4a" Dec 05 21:29:52 crc kubenswrapper[4904]: E1205 21:29:52.682368 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:30:00 crc kubenswrapper[4904]: I1205 21:30:00.171861 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416170-fxz4h"] Dec 05 21:30:00 crc kubenswrapper[4904]: E1205 21:30:00.173010 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="848f18cd-a59f-4d19-8f22-da5bf50c41f5" containerName="extract-utilities" Dec 05 21:30:00 crc kubenswrapper[4904]: I1205 21:30:00.173030 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="848f18cd-a59f-4d19-8f22-da5bf50c41f5" containerName="extract-utilities" Dec 05 21:30:00 crc kubenswrapper[4904]: E1205 21:30:00.173052 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="848f18cd-a59f-4d19-8f22-da5bf50c41f5" containerName="extract-content" Dec 05 21:30:00 crc kubenswrapper[4904]: I1205 21:30:00.173083 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="848f18cd-a59f-4d19-8f22-da5bf50c41f5" containerName="extract-content" Dec 05 21:30:00 crc kubenswrapper[4904]: E1205 21:30:00.173111 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="848f18cd-a59f-4d19-8f22-da5bf50c41f5" containerName="registry-server" Dec 05 21:30:00 crc kubenswrapper[4904]: I1205 21:30:00.173119 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="848f18cd-a59f-4d19-8f22-da5bf50c41f5" containerName="registry-server" Dec 05 21:30:00 crc kubenswrapper[4904]: I1205 21:30:00.173376 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="848f18cd-a59f-4d19-8f22-da5bf50c41f5" containerName="registry-server" Dec 05 21:30:00 crc kubenswrapper[4904]: I1205 21:30:00.174415 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416170-fxz4h" Dec 05 21:30:00 crc kubenswrapper[4904]: I1205 21:30:00.178450 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 21:30:00 crc kubenswrapper[4904]: I1205 21:30:00.181987 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 21:30:00 crc kubenswrapper[4904]: I1205 21:30:00.184771 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416170-fxz4h"] Dec 05 21:30:00 crc kubenswrapper[4904]: I1205 21:30:00.277535 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/716868e1-74c0-419f-a5a9-f3533fc3642f-config-volume\") pod \"collect-profiles-29416170-fxz4h\" (UID: \"716868e1-74c0-419f-a5a9-f3533fc3642f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416170-fxz4h" Dec 05 21:30:00 crc kubenswrapper[4904]: I1205 21:30:00.277608 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/716868e1-74c0-419f-a5a9-f3533fc3642f-secret-volume\") pod \"collect-profiles-29416170-fxz4h\" (UID: \"716868e1-74c0-419f-a5a9-f3533fc3642f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416170-fxz4h" Dec 05 21:30:00 crc kubenswrapper[4904]: I1205 21:30:00.277688 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs89b\" (UniqueName: \"kubernetes.io/projected/716868e1-74c0-419f-a5a9-f3533fc3642f-kube-api-access-xs89b\") pod \"collect-profiles-29416170-fxz4h\" (UID: \"716868e1-74c0-419f-a5a9-f3533fc3642f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416170-fxz4h" Dec 05 21:30:00 crc kubenswrapper[4904]: I1205 21:30:00.379761 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/716868e1-74c0-419f-a5a9-f3533fc3642f-config-volume\") pod \"collect-profiles-29416170-fxz4h\" (UID: \"716868e1-74c0-419f-a5a9-f3533fc3642f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416170-fxz4h" Dec 05 21:30:00 crc kubenswrapper[4904]: I1205 21:30:00.379833 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/716868e1-74c0-419f-a5a9-f3533fc3642f-secret-volume\") pod \"collect-profiles-29416170-fxz4h\" (UID: \"716868e1-74c0-419f-a5a9-f3533fc3642f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416170-fxz4h" Dec 05 21:30:00 crc kubenswrapper[4904]: I1205 21:30:00.379908 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs89b\" (UniqueName: \"kubernetes.io/projected/716868e1-74c0-419f-a5a9-f3533fc3642f-kube-api-access-xs89b\") pod \"collect-profiles-29416170-fxz4h\" (UID: \"716868e1-74c0-419f-a5a9-f3533fc3642f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416170-fxz4h" Dec 05 21:30:00 crc kubenswrapper[4904]: I1205 21:30:00.381421 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/716868e1-74c0-419f-a5a9-f3533fc3642f-config-volume\") pod 
\"collect-profiles-29416170-fxz4h\" (UID: \"716868e1-74c0-419f-a5a9-f3533fc3642f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416170-fxz4h" Dec 05 21:30:00 crc kubenswrapper[4904]: I1205 21:30:00.386748 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/716868e1-74c0-419f-a5a9-f3533fc3642f-secret-volume\") pod \"collect-profiles-29416170-fxz4h\" (UID: \"716868e1-74c0-419f-a5a9-f3533fc3642f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416170-fxz4h" Dec 05 21:30:00 crc kubenswrapper[4904]: I1205 21:30:00.398455 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs89b\" (UniqueName: \"kubernetes.io/projected/716868e1-74c0-419f-a5a9-f3533fc3642f-kube-api-access-xs89b\") pod \"collect-profiles-29416170-fxz4h\" (UID: \"716868e1-74c0-419f-a5a9-f3533fc3642f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416170-fxz4h" Dec 05 21:30:00 crc kubenswrapper[4904]: I1205 21:30:00.498928 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416170-fxz4h" Dec 05 21:30:00 crc kubenswrapper[4904]: I1205 21:30:00.952434 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416170-fxz4h"] Dec 05 21:30:01 crc kubenswrapper[4904]: I1205 21:30:01.402348 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416170-fxz4h" event={"ID":"716868e1-74c0-419f-a5a9-f3533fc3642f","Type":"ContainerStarted","Data":"92648a61df59f570ff012c806f8d049c81313c6501e786c23ca174a00db8733f"} Dec 05 21:30:01 crc kubenswrapper[4904]: I1205 21:30:01.402872 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416170-fxz4h" event={"ID":"716868e1-74c0-419f-a5a9-f3533fc3642f","Type":"ContainerStarted","Data":"27e5b7dbb6e0c2a26def86019633d43e412857f04e7caf9487ab5bd4c54cfc07"} Dec 05 21:30:01 crc kubenswrapper[4904]: I1205 21:30:01.428797 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29416170-fxz4h" podStartSLOduration=1.428772147 podStartE2EDuration="1.428772147s" podCreationTimestamp="2025-12-05 21:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:30:01.420159222 +0000 UTC m=+4700.231375341" watchObservedRunningTime="2025-12-05 21:30:01.428772147 +0000 UTC m=+4700.239988256" Dec 05 21:30:02 crc kubenswrapper[4904]: I1205 21:30:02.412914 4904 generic.go:334] "Generic (PLEG): container finished" podID="716868e1-74c0-419f-a5a9-f3533fc3642f" containerID="92648a61df59f570ff012c806f8d049c81313c6501e786c23ca174a00db8733f" exitCode=0 Dec 05 21:30:02 crc kubenswrapper[4904]: I1205 21:30:02.413037 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416170-fxz4h" event={"ID":"716868e1-74c0-419f-a5a9-f3533fc3642f","Type":"ContainerDied","Data":"92648a61df59f570ff012c806f8d049c81313c6501e786c23ca174a00db8733f"} Dec 05 21:30:03 crc kubenswrapper[4904]: I1205 21:30:03.870726 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416170-fxz4h" Dec 05 21:30:03 crc kubenswrapper[4904]: I1205 21:30:03.976617 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/716868e1-74c0-419f-a5a9-f3533fc3642f-secret-volume\") pod \"716868e1-74c0-419f-a5a9-f3533fc3642f\" (UID: \"716868e1-74c0-419f-a5a9-f3533fc3642f\") " Dec 05 21:30:03 crc kubenswrapper[4904]: I1205 21:30:03.976686 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/716868e1-74c0-419f-a5a9-f3533fc3642f-config-volume\") pod \"716868e1-74c0-419f-a5a9-f3533fc3642f\" (UID: \"716868e1-74c0-419f-a5a9-f3533fc3642f\") " Dec 05 21:30:03 crc kubenswrapper[4904]: I1205 21:30:03.976941 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xs89b\" (UniqueName: \"kubernetes.io/projected/716868e1-74c0-419f-a5a9-f3533fc3642f-kube-api-access-xs89b\") pod \"716868e1-74c0-419f-a5a9-f3533fc3642f\" (UID: \"716868e1-74c0-419f-a5a9-f3533fc3642f\") " Dec 05 21:30:03 crc kubenswrapper[4904]: I1205 21:30:03.978686 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/716868e1-74c0-419f-a5a9-f3533fc3642f-config-volume" (OuterVolumeSpecName: "config-volume") pod "716868e1-74c0-419f-a5a9-f3533fc3642f" (UID: "716868e1-74c0-419f-a5a9-f3533fc3642f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:30:03 crc kubenswrapper[4904]: I1205 21:30:03.991246 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/716868e1-74c0-419f-a5a9-f3533fc3642f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "716868e1-74c0-419f-a5a9-f3533fc3642f" (UID: "716868e1-74c0-419f-a5a9-f3533fc3642f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:30:03 crc kubenswrapper[4904]: I1205 21:30:03.991443 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/716868e1-74c0-419f-a5a9-f3533fc3642f-kube-api-access-xs89b" (OuterVolumeSpecName: "kube-api-access-xs89b") pod "716868e1-74c0-419f-a5a9-f3533fc3642f" (UID: "716868e1-74c0-419f-a5a9-f3533fc3642f"). InnerVolumeSpecName "kube-api-access-xs89b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:30:04 crc kubenswrapper[4904]: I1205 21:30:04.079383 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xs89b\" (UniqueName: \"kubernetes.io/projected/716868e1-74c0-419f-a5a9-f3533fc3642f-kube-api-access-xs89b\") on node \"crc\" DevicePath \"\"" Dec 05 21:30:04 crc kubenswrapper[4904]: I1205 21:30:04.079426 4904 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/716868e1-74c0-419f-a5a9-f3533fc3642f-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 21:30:04 crc kubenswrapper[4904]: I1205 21:30:04.079438 4904 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/716868e1-74c0-419f-a5a9-f3533fc3642f-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 21:30:04 crc kubenswrapper[4904]: I1205 21:30:04.437966 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416170-fxz4h" event={"ID":"716868e1-74c0-419f-a5a9-f3533fc3642f","Type":"ContainerDied","Data":"27e5b7dbb6e0c2a26def86019633d43e412857f04e7caf9487ab5bd4c54cfc07"} Dec 05 21:30:04 crc kubenswrapper[4904]: I1205 21:30:04.438033 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27e5b7dbb6e0c2a26def86019633d43e412857f04e7caf9487ab5bd4c54cfc07" Dec 05 21:30:04 crc kubenswrapper[4904]: I1205 21:30:04.438124 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416170-fxz4h" Dec 05 21:30:04 crc kubenswrapper[4904]: I1205 21:30:04.498997 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416125-59tk7"] Dec 05 21:30:04 crc kubenswrapper[4904]: I1205 21:30:04.508193 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416125-59tk7"] Dec 05 21:30:04 crc kubenswrapper[4904]: I1205 21:30:04.682367 4904 scope.go:117] "RemoveContainer" containerID="94e67afcde674d544e475a7bc0d8ae7f1ed2a8f2366106ad1f5c07bab7395f4a" Dec 05 21:30:04 crc kubenswrapper[4904]: E1205 21:30:04.682706 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:30:05 crc kubenswrapper[4904]: I1205 21:30:05.691563 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9336dc5-137a-48b7-910c-bfeae84e73f8" path="/var/lib/kubelet/pods/b9336dc5-137a-48b7-910c-bfeae84e73f8/volumes" Dec 05 21:30:19 crc kubenswrapper[4904]: I1205 21:30:19.681718 4904 scope.go:117] "RemoveContainer" containerID="94e67afcde674d544e475a7bc0d8ae7f1ed2a8f2366106ad1f5c07bab7395f4a" Dec 05 21:30:19 crc kubenswrapper[4904]: E1205 21:30:19.682510 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:30:21 crc kubenswrapper[4904]: I1205 21:30:21.085447 4904 scope.go:117] "RemoveContainer" containerID="228b32c7650604773aa3265df6ef4692a8cc6824d7a894885dc55d3c06b2b301" Dec 05 21:30:30 crc kubenswrapper[4904]: I1205 21:30:30.682168 4904 scope.go:117] "RemoveContainer" containerID="94e67afcde674d544e475a7bc0d8ae7f1ed2a8f2366106ad1f5c07bab7395f4a" Dec 05 21:30:30 crc kubenswrapper[4904]: E1205 21:30:30.683193 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:30:42 crc kubenswrapper[4904]: I1205 21:30:42.681431 4904 scope.go:117] "RemoveContainer" containerID="94e67afcde674d544e475a7bc0d8ae7f1ed2a8f2366106ad1f5c07bab7395f4a" Dec 05 21:30:42 crc kubenswrapper[4904]: E1205 21:30:42.682601 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:30:55 crc kubenswrapper[4904]: I1205 21:30:55.683913 4904 scope.go:117] "RemoveContainer" containerID="94e67afcde674d544e475a7bc0d8ae7f1ed2a8f2366106ad1f5c07bab7395f4a" Dec 05 21:30:55 crc kubenswrapper[4904]: E1205 21:30:55.684567 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:31:10 crc kubenswrapper[4904]: I1205 21:31:10.681399 4904 scope.go:117] "RemoveContainer" containerID="94e67afcde674d544e475a7bc0d8ae7f1ed2a8f2366106ad1f5c07bab7395f4a" Dec 05 21:31:10 crc kubenswrapper[4904]: E1205 21:31:10.682294 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:31:24 crc kubenswrapper[4904]: I1205 21:31:24.681185 4904 scope.go:117] "RemoveContainer" containerID="94e67afcde674d544e475a7bc0d8ae7f1ed2a8f2366106ad1f5c07bab7395f4a" Dec 05 21:31:24 crc kubenswrapper[4904]: E1205 21:31:24.682032 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:31:39 crc kubenswrapper[4904]: I1205 21:31:39.682048 4904 scope.go:117] "RemoveContainer" containerID="94e67afcde674d544e475a7bc0d8ae7f1ed2a8f2366106ad1f5c07bab7395f4a" Dec 05 21:31:39 crc kubenswrapper[4904]: E1205 21:31:39.682884 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:31:50 crc kubenswrapper[4904]: I1205 21:31:50.681948 4904 scope.go:117] "RemoveContainer" containerID="94e67afcde674d544e475a7bc0d8ae7f1ed2a8f2366106ad1f5c07bab7395f4a" Dec 05 21:31:50 crc kubenswrapper[4904]: E1205 21:31:50.682996 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:32:02 crc kubenswrapper[4904]: I1205 21:32:02.681809 4904 scope.go:117] "RemoveContainer" containerID="94e67afcde674d544e475a7bc0d8ae7f1ed2a8f2366106ad1f5c07bab7395f4a" Dec 05 21:32:02 crc kubenswrapper[4904]: E1205 21:32:02.682853 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:32:16 crc kubenswrapper[4904]: I1205 21:32:16.681868 4904 scope.go:117] "RemoveContainer" containerID="94e67afcde674d544e475a7bc0d8ae7f1ed2a8f2366106ad1f5c07bab7395f4a" Dec 05 21:32:16 crc kubenswrapper[4904]: E1205 21:32:16.682646 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:32:30 crc kubenswrapper[4904]: I1205 21:32:30.681925 4904 scope.go:117] "RemoveContainer" containerID="94e67afcde674d544e475a7bc0d8ae7f1ed2a8f2366106ad1f5c07bab7395f4a" Dec 05 21:32:30 crc kubenswrapper[4904]: E1205 21:32:30.683216 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:32:38 crc kubenswrapper[4904]: I1205 21:32:38.450743 4904 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7jtgl"] Dec 05 21:32:38 crc kubenswrapper[4904]: E1205 21:32:38.452838 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="716868e1-74c0-419f-a5a9-f3533fc3642f" containerName="collect-profiles" Dec 05 21:32:38 crc kubenswrapper[4904]: I1205 21:32:38.452858 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="716868e1-74c0-419f-a5a9-f3533fc3642f" containerName="collect-profiles" Dec 05 21:32:38 crc kubenswrapper[4904]: I1205 21:32:38.453207 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="716868e1-74c0-419f-a5a9-f3533fc3642f" containerName="collect-profiles" Dec 05 21:32:38 crc kubenswrapper[4904]: I1205 21:32:38.455231 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7jtgl" Dec 05 21:32:38 crc kubenswrapper[4904]: I1205 21:32:38.475893 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7jtgl"] Dec 05 21:32:38 crc kubenswrapper[4904]: I1205 21:32:38.570440 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgqqb\" (UniqueName: \"kubernetes.io/projected/076868f7-2488-4886-bcc3-af5137acf8c4-kube-api-access-kgqqb\") pod \"redhat-marketplace-7jtgl\" (UID: \"076868f7-2488-4886-bcc3-af5137acf8c4\") " pod="openshift-marketplace/redhat-marketplace-7jtgl" Dec 05 21:32:38 crc kubenswrapper[4904]: I1205 21:32:38.570489 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/076868f7-2488-4886-bcc3-af5137acf8c4-utilities\") pod \"redhat-marketplace-7jtgl\" (UID: \"076868f7-2488-4886-bcc3-af5137acf8c4\") " pod="openshift-marketplace/redhat-marketplace-7jtgl" Dec 05 21:32:38 crc kubenswrapper[4904]: I1205 21:32:38.570513 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/076868f7-2488-4886-bcc3-af5137acf8c4-catalog-content\") pod \"redhat-marketplace-7jtgl\" (UID: \"076868f7-2488-4886-bcc3-af5137acf8c4\") " pod="openshift-marketplace/redhat-marketplace-7jtgl" Dec 05 21:32:38 crc kubenswrapper[4904]: I1205 21:32:38.672317 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgqqb\" (UniqueName: \"kubernetes.io/projected/076868f7-2488-4886-bcc3-af5137acf8c4-kube-api-access-kgqqb\") pod \"redhat-marketplace-7jtgl\" (UID: \"076868f7-2488-4886-bcc3-af5137acf8c4\") " pod="openshift-marketplace/redhat-marketplace-7jtgl" Dec 05 21:32:38 crc kubenswrapper[4904]: I1205 21:32:38.672380 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/076868f7-2488-4886-bcc3-af5137acf8c4-utilities\") pod \"redhat-marketplace-7jtgl\" (UID: \"076868f7-2488-4886-bcc3-af5137acf8c4\") " pod="openshift-marketplace/redhat-marketplace-7jtgl" Dec 05 21:32:38 crc kubenswrapper[4904]: I1205 21:32:38.672410 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/076868f7-2488-4886-bcc3-af5137acf8c4-catalog-content\") pod \"redhat-marketplace-7jtgl\" (UID: \"076868f7-2488-4886-bcc3-af5137acf8c4\") " pod="openshift-marketplace/redhat-marketplace-7jtgl" Dec 05 21:32:38 crc kubenswrapper[4904]: I1205 
21:32:38.672977 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/076868f7-2488-4886-bcc3-af5137acf8c4-catalog-content\") pod \"redhat-marketplace-7jtgl\" (UID: \"076868f7-2488-4886-bcc3-af5137acf8c4\") " pod="openshift-marketplace/redhat-marketplace-7jtgl" Dec 05 21:32:38 crc kubenswrapper[4904]: I1205 21:32:38.673354 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/076868f7-2488-4886-bcc3-af5137acf8c4-utilities\") pod \"redhat-marketplace-7jtgl\" (UID: \"076868f7-2488-4886-bcc3-af5137acf8c4\") " pod="openshift-marketplace/redhat-marketplace-7jtgl" Dec 05 21:32:38 crc kubenswrapper[4904]: I1205 21:32:38.699497 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgqqb\" (UniqueName: \"kubernetes.io/projected/076868f7-2488-4886-bcc3-af5137acf8c4-kube-api-access-kgqqb\") pod \"redhat-marketplace-7jtgl\" (UID: \"076868f7-2488-4886-bcc3-af5137acf8c4\") " pod="openshift-marketplace/redhat-marketplace-7jtgl" Dec 05 21:32:38 crc kubenswrapper[4904]: I1205 21:32:38.781337 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7jtgl" Dec 05 21:32:39 crc kubenswrapper[4904]: I1205 21:32:39.311530 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7jtgl"] Dec 05 21:32:40 crc kubenswrapper[4904]: I1205 21:32:40.015153 4904 generic.go:334] "Generic (PLEG): container finished" podID="076868f7-2488-4886-bcc3-af5137acf8c4" containerID="f1f856e7603da73f0aeb581b8638b6cc622a6c3747cad7e08db2b134023a628d" exitCode=0 Dec 05 21:32:40 crc kubenswrapper[4904]: I1205 21:32:40.015267 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7jtgl" event={"ID":"076868f7-2488-4886-bcc3-af5137acf8c4","Type":"ContainerDied","Data":"f1f856e7603da73f0aeb581b8638b6cc622a6c3747cad7e08db2b134023a628d"} Dec 05 21:32:40 crc kubenswrapper[4904]: I1205 21:32:40.016925 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7jtgl" event={"ID":"076868f7-2488-4886-bcc3-af5137acf8c4","Type":"ContainerStarted","Data":"c1d0f8ca65f173d4631fd004c381e580ed9d193679aff4cb051545da4c9d1c49"} Dec 05 21:32:40 crc kubenswrapper[4904]: I1205 21:32:40.017881 4904 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 21:32:41 crc kubenswrapper[4904]: I1205 21:32:41.027924 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7jtgl" event={"ID":"076868f7-2488-4886-bcc3-af5137acf8c4","Type":"ContainerStarted","Data":"056fa10681fb5229e433125feb22c878b11482d6dff4ddf798df141573c28098"} Dec 05 21:32:42 crc kubenswrapper[4904]: I1205 21:32:42.041380 4904 generic.go:334] "Generic (PLEG): container finished" podID="076868f7-2488-4886-bcc3-af5137acf8c4" containerID="056fa10681fb5229e433125feb22c878b11482d6dff4ddf798df141573c28098" exitCode=0 Dec 05 21:32:42 crc kubenswrapper[4904]: I1205 21:32:42.041457 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7jtgl" event={"ID":"076868f7-2488-4886-bcc3-af5137acf8c4","Type":"ContainerDied","Data":"056fa10681fb5229e433125feb22c878b11482d6dff4ddf798df141573c28098"} Dec 05 21:32:42 crc kubenswrapper[4904]: I1205 21:32:42.681282 4904 scope.go:117] "RemoveContainer" 
containerID="94e67afcde674d544e475a7bc0d8ae7f1ed2a8f2366106ad1f5c07bab7395f4a" Dec 05 21:32:42 crc kubenswrapper[4904]: E1205 21:32:42.681839 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:32:43 crc kubenswrapper[4904]: I1205 21:32:43.055688 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7jtgl" event={"ID":"076868f7-2488-4886-bcc3-af5137acf8c4","Type":"ContainerStarted","Data":"9232ef1afd45cf1fc152cd965d9c1ce5b6a7d3b83511f9454154f99dfd7ff0f1"} Dec 05 21:32:43 crc kubenswrapper[4904]: I1205 21:32:43.076733 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7jtgl" podStartSLOduration=2.618677612 podStartE2EDuration="5.076707644s" podCreationTimestamp="2025-12-05 21:32:38 +0000 UTC" firstStartedPulling="2025-12-05 21:32:40.017601754 +0000 UTC m=+4858.828817863" lastFinishedPulling="2025-12-05 21:32:42.475631766 +0000 UTC m=+4861.286847895" observedRunningTime="2025-12-05 21:32:43.075203302 +0000 UTC m=+4861.886419431" watchObservedRunningTime="2025-12-05 21:32:43.076707644 +0000 UTC m=+4861.887923753" Dec 05 21:32:48 crc kubenswrapper[4904]: I1205 21:32:48.781588 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7jtgl" Dec 05 21:32:48 crc kubenswrapper[4904]: I1205 21:32:48.784669 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7jtgl" Dec 05 21:32:48 crc kubenswrapper[4904]: I1205 21:32:48.833855 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7jtgl" Dec 05 21:32:49 crc kubenswrapper[4904]: I1205 21:32:49.436923 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7jtgl" Dec 05 21:32:49 crc kubenswrapper[4904]: I1205 21:32:49.498951 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7jtgl"] Dec 05 21:32:51 crc kubenswrapper[4904]: I1205 21:32:51.130581 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7jtgl" podUID="076868f7-2488-4886-bcc3-af5137acf8c4" containerName="registry-server" containerID="cri-o://9232ef1afd45cf1fc152cd965d9c1ce5b6a7d3b83511f9454154f99dfd7ff0f1" gracePeriod=2 Dec 05 21:32:51 crc kubenswrapper[4904]: I1205 21:32:51.604134 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7jtgl" Dec 05 21:32:51 crc kubenswrapper[4904]: I1205 21:32:51.803196 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgqqb\" (UniqueName: \"kubernetes.io/projected/076868f7-2488-4886-bcc3-af5137acf8c4-kube-api-access-kgqqb\") pod \"076868f7-2488-4886-bcc3-af5137acf8c4\" (UID: \"076868f7-2488-4886-bcc3-af5137acf8c4\") " Dec 05 21:32:51 crc kubenswrapper[4904]: I1205 21:32:51.803277 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/076868f7-2488-4886-bcc3-af5137acf8c4-catalog-content\") pod \"076868f7-2488-4886-bcc3-af5137acf8c4\" (UID: \"076868f7-2488-4886-bcc3-af5137acf8c4\") " Dec 05 21:32:51 crc kubenswrapper[4904]: I1205 21:32:51.803663 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/076868f7-2488-4886-bcc3-af5137acf8c4-utilities\") pod \"076868f7-2488-4886-bcc3-af5137acf8c4\" (UID: \"076868f7-2488-4886-bcc3-af5137acf8c4\") " Dec 05 21:32:51 crc kubenswrapper[4904]: I1205 21:32:51.804788 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/076868f7-2488-4886-bcc3-af5137acf8c4-utilities" (OuterVolumeSpecName: "utilities") pod "076868f7-2488-4886-bcc3-af5137acf8c4" (UID: "076868f7-2488-4886-bcc3-af5137acf8c4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:32:51 crc kubenswrapper[4904]: I1205 21:32:51.812977 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/076868f7-2488-4886-bcc3-af5137acf8c4-kube-api-access-kgqqb" (OuterVolumeSpecName: "kube-api-access-kgqqb") pod "076868f7-2488-4886-bcc3-af5137acf8c4" (UID: "076868f7-2488-4886-bcc3-af5137acf8c4"). InnerVolumeSpecName "kube-api-access-kgqqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:32:51 crc kubenswrapper[4904]: I1205 21:32:51.831913 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/076868f7-2488-4886-bcc3-af5137acf8c4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "076868f7-2488-4886-bcc3-af5137acf8c4" (UID: "076868f7-2488-4886-bcc3-af5137acf8c4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:32:51 crc kubenswrapper[4904]: I1205 21:32:51.906982 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgqqb\" (UniqueName: \"kubernetes.io/projected/076868f7-2488-4886-bcc3-af5137acf8c4-kube-api-access-kgqqb\") on node \"crc\" DevicePath \"\"" Dec 05 21:32:51 crc kubenswrapper[4904]: I1205 21:32:51.907023 4904 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/076868f7-2488-4886-bcc3-af5137acf8c4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 21:32:51 crc kubenswrapper[4904]: I1205 21:32:51.907041 4904 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/076868f7-2488-4886-bcc3-af5137acf8c4-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 21:32:52 crc kubenswrapper[4904]: I1205 21:32:52.150344 4904 generic.go:334] "Generic (PLEG): container finished" podID="076868f7-2488-4886-bcc3-af5137acf8c4" containerID="9232ef1afd45cf1fc152cd965d9c1ce5b6a7d3b83511f9454154f99dfd7ff0f1" exitCode=0 Dec 05 21:32:52 crc kubenswrapper[4904]: I1205 21:32:52.150448 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7jtgl" event={"ID":"076868f7-2488-4886-bcc3-af5137acf8c4","Type":"ContainerDied","Data":"9232ef1afd45cf1fc152cd965d9c1ce5b6a7d3b83511f9454154f99dfd7ff0f1"} Dec 05 21:32:52 crc kubenswrapper[4904]: I1205 21:32:52.150459 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7jtgl" Dec 05 21:32:52 crc kubenswrapper[4904]: I1205 21:32:52.150494 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7jtgl" event={"ID":"076868f7-2488-4886-bcc3-af5137acf8c4","Type":"ContainerDied","Data":"c1d0f8ca65f173d4631fd004c381e580ed9d193679aff4cb051545da4c9d1c49"} Dec 05 21:32:52 crc kubenswrapper[4904]: I1205 21:32:52.150525 4904 scope.go:117] "RemoveContainer" containerID="9232ef1afd45cf1fc152cd965d9c1ce5b6a7d3b83511f9454154f99dfd7ff0f1" Dec 05 21:32:52 crc kubenswrapper[4904]: I1205 21:32:52.184423 4904 scope.go:117] "RemoveContainer" containerID="056fa10681fb5229e433125feb22c878b11482d6dff4ddf798df141573c28098" Dec 05 21:32:52 crc kubenswrapper[4904]: I1205 21:32:52.209198 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7jtgl"] Dec 05 21:32:52 crc kubenswrapper[4904]: I1205 21:32:52.222220 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7jtgl"] Dec 05 21:32:52 crc kubenswrapper[4904]: I1205 21:32:52.227032 4904 scope.go:117] "RemoveContainer" containerID="f1f856e7603da73f0aeb581b8638b6cc622a6c3747cad7e08db2b134023a628d" Dec 05 21:32:52 crc kubenswrapper[4904]: I1205 21:32:52.266196 4904 scope.go:117] "RemoveContainer" containerID="9232ef1afd45cf1fc152cd965d9c1ce5b6a7d3b83511f9454154f99dfd7ff0f1" Dec 05 21:32:52 crc kubenswrapper[4904]: E1205 21:32:52.267682 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9232ef1afd45cf1fc152cd965d9c1ce5b6a7d3b83511f9454154f99dfd7ff0f1\": container with ID starting with 9232ef1afd45cf1fc152cd965d9c1ce5b6a7d3b83511f9454154f99dfd7ff0f1 not found: ID does not exist" containerID="9232ef1afd45cf1fc152cd965d9c1ce5b6a7d3b83511f9454154f99dfd7ff0f1" Dec 05 21:32:52 crc kubenswrapper[4904]: I1205 21:32:52.267738 4904 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9232ef1afd45cf1fc152cd965d9c1ce5b6a7d3b83511f9454154f99dfd7ff0f1"} err="failed to get container status \"9232ef1afd45cf1fc152cd965d9c1ce5b6a7d3b83511f9454154f99dfd7ff0f1\": rpc error: code = NotFound desc = could not find container \"9232ef1afd45cf1fc152cd965d9c1ce5b6a7d3b83511f9454154f99dfd7ff0f1\": container with ID starting with 9232ef1afd45cf1fc152cd965d9c1ce5b6a7d3b83511f9454154f99dfd7ff0f1 not found: ID does not exist" Dec 05 21:32:52 crc kubenswrapper[4904]: I1205 21:32:52.267816 4904 scope.go:117] "RemoveContainer" containerID="056fa10681fb5229e433125feb22c878b11482d6dff4ddf798df141573c28098" Dec 05 21:32:52 crc kubenswrapper[4904]: E1205 21:32:52.268286 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"056fa10681fb5229e433125feb22c878b11482d6dff4ddf798df141573c28098\": container with ID starting with 056fa10681fb5229e433125feb22c878b11482d6dff4ddf798df141573c28098 not found: ID does not exist" containerID="056fa10681fb5229e433125feb22c878b11482d6dff4ddf798df141573c28098" Dec 05 21:32:52 crc kubenswrapper[4904]: I1205 21:32:52.268329 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"056fa10681fb5229e433125feb22c878b11482d6dff4ddf798df141573c28098"} err="failed to get container status \"056fa10681fb5229e433125feb22c878b11482d6dff4ddf798df141573c28098\": rpc error: code = NotFound desc = could not find container \"056fa10681fb5229e433125feb22c878b11482d6dff4ddf798df141573c28098\": container with ID starting with 056fa10681fb5229e433125feb22c878b11482d6dff4ddf798df141573c28098 not found: ID does not exist" Dec 05 21:32:52 crc kubenswrapper[4904]: I1205 21:32:52.268384 4904 scope.go:117] "RemoveContainer" containerID="f1f856e7603da73f0aeb581b8638b6cc622a6c3747cad7e08db2b134023a628d" Dec 05 21:32:52 crc kubenswrapper[4904]: E1205 21:32:52.268692 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1f856e7603da73f0aeb581b8638b6cc622a6c3747cad7e08db2b134023a628d\": container with ID starting with f1f856e7603da73f0aeb581b8638b6cc622a6c3747cad7e08db2b134023a628d not found: ID does not exist" containerID="f1f856e7603da73f0aeb581b8638b6cc622a6c3747cad7e08db2b134023a628d" Dec 05 21:32:52 crc kubenswrapper[4904]: I1205 21:32:52.268737 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1f856e7603da73f0aeb581b8638b6cc622a6c3747cad7e08db2b134023a628d"} err="failed to get container status \"f1f856e7603da73f0aeb581b8638b6cc622a6c3747cad7e08db2b134023a628d\": rpc error: code = NotFound desc = could not find container \"f1f856e7603da73f0aeb581b8638b6cc622a6c3747cad7e08db2b134023a628d\": container with ID starting with f1f856e7603da73f0aeb581b8638b6cc622a6c3747cad7e08db2b134023a628d not found: ID does not exist" Dec 05 21:32:53 crc kubenswrapper[4904]: I1205 21:32:53.693536 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="076868f7-2488-4886-bcc3-af5137acf8c4" path="/var/lib/kubelet/pods/076868f7-2488-4886-bcc3-af5137acf8c4/volumes" Dec 05 21:32:56 crc kubenswrapper[4904]: I1205 21:32:56.681822 4904 scope.go:117] "RemoveContainer" containerID="94e67afcde674d544e475a7bc0d8ae7f1ed2a8f2366106ad1f5c07bab7395f4a" Dec 05 21:32:56 crc kubenswrapper[4904]: E1205 21:32:56.682306 4904 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:33:11 crc kubenswrapper[4904]: I1205 21:33:11.721210 4904 scope.go:117] "RemoveContainer" containerID="94e67afcde674d544e475a7bc0d8ae7f1ed2a8f2366106ad1f5c07bab7395f4a" Dec 05 21:33:13 crc kubenswrapper[4904]: I1205 21:33:13.386506 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" event={"ID":"1cc24b64-e25f-4b55-9123-295388685e7a","Type":"ContainerStarted","Data":"f7d7f09e13152e0a4faedadfe2a18bad5ec893c3b518c6efca98f32916ef57b3"} Dec 05 21:35:29 crc kubenswrapper[4904]: I1205 21:35:29.955415 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:35:29 crc kubenswrapper[4904]: I1205 21:35:29.955997 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:35:59 crc kubenswrapper[4904]: I1205 21:35:59.955308 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:35:59 crc kubenswrapper[4904]: I1205 21:35:59.955779 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:36:08 crc kubenswrapper[4904]: I1205 21:36:08.643926 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hq9sx"] Dec 05 21:36:08 crc kubenswrapper[4904]: E1205 21:36:08.644929 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="076868f7-2488-4886-bcc3-af5137acf8c4" containerName="extract-content" Dec 05 21:36:08 crc kubenswrapper[4904]: I1205 21:36:08.644945 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="076868f7-2488-4886-bcc3-af5137acf8c4" containerName="extract-content" Dec 05 21:36:08 crc kubenswrapper[4904]: E1205 21:36:08.644972 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="076868f7-2488-4886-bcc3-af5137acf8c4" containerName="extract-utilities" Dec 05 21:36:08 crc kubenswrapper[4904]: I1205 21:36:08.644979 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="076868f7-2488-4886-bcc3-af5137acf8c4" containerName="extract-utilities" Dec 05 21:36:08 crc kubenswrapper[4904]: E1205 21:36:08.644994 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="076868f7-2488-4886-bcc3-af5137acf8c4" 
containerName="registry-server" Dec 05 21:36:08 crc kubenswrapper[4904]: I1205 21:36:08.645001 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="076868f7-2488-4886-bcc3-af5137acf8c4" containerName="registry-server" Dec 05 21:36:08 crc kubenswrapper[4904]: I1205 21:36:08.645264 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="076868f7-2488-4886-bcc3-af5137acf8c4" containerName="registry-server" Dec 05 21:36:08 crc kubenswrapper[4904]: I1205 21:36:08.646869 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hq9sx" Dec 05 21:36:08 crc kubenswrapper[4904]: I1205 21:36:08.655441 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hq9sx"] Dec 05 21:36:08 crc kubenswrapper[4904]: I1205 21:36:08.696563 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4672fe0-7762-447b-a91a-d553d445ed72-utilities\") pod \"certified-operators-hq9sx\" (UID: \"e4672fe0-7762-447b-a91a-d553d445ed72\") " pod="openshift-marketplace/certified-operators-hq9sx" Dec 05 21:36:08 crc kubenswrapper[4904]: I1205 21:36:08.696756 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv52x\" (UniqueName: \"kubernetes.io/projected/e4672fe0-7762-447b-a91a-d553d445ed72-kube-api-access-mv52x\") pod \"certified-operators-hq9sx\" (UID: \"e4672fe0-7762-447b-a91a-d553d445ed72\") " pod="openshift-marketplace/certified-operators-hq9sx" Dec 05 21:36:08 crc kubenswrapper[4904]: I1205 21:36:08.697000 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4672fe0-7762-447b-a91a-d553d445ed72-catalog-content\") pod \"certified-operators-hq9sx\" (UID: \"e4672fe0-7762-447b-a91a-d553d445ed72\") " pod="openshift-marketplace/certified-operators-hq9sx" Dec 05 21:36:08 crc kubenswrapper[4904]: I1205 21:36:08.799302 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4672fe0-7762-447b-a91a-d553d445ed72-utilities\") pod \"certified-operators-hq9sx\" (UID: \"e4672fe0-7762-447b-a91a-d553d445ed72\") " pod="openshift-marketplace/certified-operators-hq9sx" Dec 05 21:36:08 crc kubenswrapper[4904]: I1205 21:36:08.799383 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv52x\" (UniqueName: \"kubernetes.io/projected/e4672fe0-7762-447b-a91a-d553d445ed72-kube-api-access-mv52x\") pod \"certified-operators-hq9sx\" (UID: \"e4672fe0-7762-447b-a91a-d553d445ed72\") " pod="openshift-marketplace/certified-operators-hq9sx" Dec 05 21:36:08 crc kubenswrapper[4904]: I1205 21:36:08.799450 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4672fe0-7762-447b-a91a-d553d445ed72-catalog-content\") pod \"certified-operators-hq9sx\" (UID: \"e4672fe0-7762-447b-a91a-d553d445ed72\") " pod="openshift-marketplace/certified-operators-hq9sx" Dec 05 21:36:08 crc kubenswrapper[4904]: I1205 21:36:08.799889 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4672fe0-7762-447b-a91a-d553d445ed72-utilities\") pod \"certified-operators-hq9sx\" (UID: \"e4672fe0-7762-447b-a91a-d553d445ed72\") " 
pod="openshift-marketplace/certified-operators-hq9sx" Dec 05 21:36:08 crc kubenswrapper[4904]: I1205 21:36:08.799913 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4672fe0-7762-447b-a91a-d553d445ed72-catalog-content\") pod \"certified-operators-hq9sx\" (UID: \"e4672fe0-7762-447b-a91a-d553d445ed72\") " pod="openshift-marketplace/certified-operators-hq9sx" Dec 05 21:36:08 crc kubenswrapper[4904]: I1205 21:36:08.827792 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv52x\" (UniqueName: \"kubernetes.io/projected/e4672fe0-7762-447b-a91a-d553d445ed72-kube-api-access-mv52x\") pod \"certified-operators-hq9sx\" (UID: \"e4672fe0-7762-447b-a91a-d553d445ed72\") " pod="openshift-marketplace/certified-operators-hq9sx" Dec 05 21:36:08 crc kubenswrapper[4904]: I1205 21:36:08.995099 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hq9sx" Dec 05 21:36:09 crc kubenswrapper[4904]: I1205 21:36:09.640693 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hq9sx"] Dec 05 21:36:09 crc kubenswrapper[4904]: W1205 21:36:09.646414 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4672fe0_7762_447b_a91a_d553d445ed72.slice/crio-745ca1c7a682cfb70c52512a72a2f3a58645f69096b4b6d49bb3df47e1370c6d WatchSource:0}: Error finding container 745ca1c7a682cfb70c52512a72a2f3a58645f69096b4b6d49bb3df47e1370c6d: Status 404 returned error can't find the container with id 745ca1c7a682cfb70c52512a72a2f3a58645f69096b4b6d49bb3df47e1370c6d Dec 05 21:36:10 crc kubenswrapper[4904]: I1205 21:36:10.277465 4904 generic.go:334] "Generic (PLEG): container finished" podID="e4672fe0-7762-447b-a91a-d553d445ed72" containerID="e59dfada78b07f3846d6f400e28205242a09829b6bb44f4133eae0f425aa38b9" exitCode=0 Dec 05 21:36:10 crc kubenswrapper[4904]: I1205 21:36:10.277608 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hq9sx" event={"ID":"e4672fe0-7762-447b-a91a-d553d445ed72","Type":"ContainerDied","Data":"e59dfada78b07f3846d6f400e28205242a09829b6bb44f4133eae0f425aa38b9"} Dec 05 21:36:10 crc kubenswrapper[4904]: I1205 21:36:10.277920 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hq9sx" event={"ID":"e4672fe0-7762-447b-a91a-d553d445ed72","Type":"ContainerStarted","Data":"745ca1c7a682cfb70c52512a72a2f3a58645f69096b4b6d49bb3df47e1370c6d"} Dec 05 21:36:11 crc kubenswrapper[4904]: I1205 21:36:11.289803 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hq9sx" event={"ID":"e4672fe0-7762-447b-a91a-d553d445ed72","Type":"ContainerStarted","Data":"bf3663ce4e7106d3ab1fc515bd63bf3e211850dd53bd8e5a588e333e5a700a5a"} Dec 05 21:36:12 crc kubenswrapper[4904]: I1205 21:36:12.299777 4904 generic.go:334] "Generic (PLEG): container finished" podID="e4672fe0-7762-447b-a91a-d553d445ed72" containerID="bf3663ce4e7106d3ab1fc515bd63bf3e211850dd53bd8e5a588e333e5a700a5a" exitCode=0 Dec 05 21:36:12 crc kubenswrapper[4904]: I1205 21:36:12.299884 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hq9sx" event={"ID":"e4672fe0-7762-447b-a91a-d553d445ed72","Type":"ContainerDied","Data":"bf3663ce4e7106d3ab1fc515bd63bf3e211850dd53bd8e5a588e333e5a700a5a"} Dec 
05 21:36:13 crc kubenswrapper[4904]: I1205 21:36:13.340263 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hq9sx" event={"ID":"e4672fe0-7762-447b-a91a-d553d445ed72","Type":"ContainerStarted","Data":"d3f5f33d02337e01145560b52232e4be3f1b715b28e27a29ba158644acd438f0"} Dec 05 21:36:13 crc kubenswrapper[4904]: I1205 21:36:13.368305 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hq9sx" podStartSLOduration=2.891833451 podStartE2EDuration="5.368251465s" podCreationTimestamp="2025-12-05 21:36:08 +0000 UTC" firstStartedPulling="2025-12-05 21:36:10.279288441 +0000 UTC m=+5069.090504540" lastFinishedPulling="2025-12-05 21:36:12.755706445 +0000 UTC m=+5071.566922554" observedRunningTime="2025-12-05 21:36:13.361146951 +0000 UTC m=+5072.172363080" watchObservedRunningTime="2025-12-05 21:36:13.368251465 +0000 UTC m=+5072.179467574" Dec 05 21:36:18 crc kubenswrapper[4904]: I1205 21:36:18.996218 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hq9sx" Dec 05 21:36:18 crc kubenswrapper[4904]: I1205 21:36:18.996693 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hq9sx" Dec 05 21:36:19 crc kubenswrapper[4904]: I1205 21:36:19.065829 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hq9sx" Dec 05 21:36:19 crc kubenswrapper[4904]: I1205 21:36:19.481432 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hq9sx" Dec 05 21:36:19 crc kubenswrapper[4904]: I1205 21:36:19.532788 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hq9sx"] Dec 05 21:36:21 crc kubenswrapper[4904]: I1205 21:36:21.424079 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hq9sx" podUID="e4672fe0-7762-447b-a91a-d553d445ed72" containerName="registry-server" containerID="cri-o://d3f5f33d02337e01145560b52232e4be3f1b715b28e27a29ba158644acd438f0" gracePeriod=2 Dec 05 21:36:22 crc kubenswrapper[4904]: I1205 21:36:22.434263 4904 generic.go:334] "Generic (PLEG): container finished" podID="e4672fe0-7762-447b-a91a-d553d445ed72" containerID="d3f5f33d02337e01145560b52232e4be3f1b715b28e27a29ba158644acd438f0" exitCode=0 Dec 05 21:36:22 crc kubenswrapper[4904]: I1205 21:36:22.434334 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hq9sx" event={"ID":"e4672fe0-7762-447b-a91a-d553d445ed72","Type":"ContainerDied","Data":"d3f5f33d02337e01145560b52232e4be3f1b715b28e27a29ba158644acd438f0"} Dec 05 21:36:22 crc kubenswrapper[4904]: I1205 21:36:22.434572 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hq9sx" event={"ID":"e4672fe0-7762-447b-a91a-d553d445ed72","Type":"ContainerDied","Data":"745ca1c7a682cfb70c52512a72a2f3a58645f69096b4b6d49bb3df47e1370c6d"} Dec 05 21:36:22 crc kubenswrapper[4904]: I1205 21:36:22.434587 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="745ca1c7a682cfb70c52512a72a2f3a58645f69096b4b6d49bb3df47e1370c6d" Dec 05 21:36:22 crc kubenswrapper[4904]: I1205 21:36:22.482628 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hq9sx" Dec 05 21:36:22 crc kubenswrapper[4904]: I1205 21:36:22.559566 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mv52x\" (UniqueName: \"kubernetes.io/projected/e4672fe0-7762-447b-a91a-d553d445ed72-kube-api-access-mv52x\") pod \"e4672fe0-7762-447b-a91a-d553d445ed72\" (UID: \"e4672fe0-7762-447b-a91a-d553d445ed72\") " Dec 05 21:36:22 crc kubenswrapper[4904]: I1205 21:36:22.559675 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4672fe0-7762-447b-a91a-d553d445ed72-utilities\") pod \"e4672fe0-7762-447b-a91a-d553d445ed72\" (UID: \"e4672fe0-7762-447b-a91a-d553d445ed72\") " Dec 05 21:36:22 crc kubenswrapper[4904]: I1205 21:36:22.559738 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4672fe0-7762-447b-a91a-d553d445ed72-catalog-content\") pod \"e4672fe0-7762-447b-a91a-d553d445ed72\" (UID: \"e4672fe0-7762-447b-a91a-d553d445ed72\") " Dec 05 21:36:22 crc kubenswrapper[4904]: I1205 21:36:22.560650 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4672fe0-7762-447b-a91a-d553d445ed72-utilities" (OuterVolumeSpecName: "utilities") pod "e4672fe0-7762-447b-a91a-d553d445ed72" (UID: "e4672fe0-7762-447b-a91a-d553d445ed72"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:36:22 crc kubenswrapper[4904]: I1205 21:36:22.567450 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4672fe0-7762-447b-a91a-d553d445ed72-kube-api-access-mv52x" (OuterVolumeSpecName: "kube-api-access-mv52x") pod "e4672fe0-7762-447b-a91a-d553d445ed72" (UID: "e4672fe0-7762-447b-a91a-d553d445ed72"). InnerVolumeSpecName "kube-api-access-mv52x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:36:22 crc kubenswrapper[4904]: I1205 21:36:22.617040 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4672fe0-7762-447b-a91a-d553d445ed72-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e4672fe0-7762-447b-a91a-d553d445ed72" (UID: "e4672fe0-7762-447b-a91a-d553d445ed72"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:36:22 crc kubenswrapper[4904]: I1205 21:36:22.662095 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mv52x\" (UniqueName: \"kubernetes.io/projected/e4672fe0-7762-447b-a91a-d553d445ed72-kube-api-access-mv52x\") on node \"crc\" DevicePath \"\"" Dec 05 21:36:22 crc kubenswrapper[4904]: I1205 21:36:22.662131 4904 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4672fe0-7762-447b-a91a-d553d445ed72-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 21:36:22 crc kubenswrapper[4904]: I1205 21:36:22.662141 4904 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4672fe0-7762-447b-a91a-d553d445ed72-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 21:36:23 crc kubenswrapper[4904]: I1205 21:36:23.443033 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hq9sx" Dec 05 21:36:23 crc kubenswrapper[4904]: I1205 21:36:23.480576 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hq9sx"] Dec 05 21:36:23 crc kubenswrapper[4904]: I1205 21:36:23.492255 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hq9sx"] Dec 05 21:36:23 crc kubenswrapper[4904]: I1205 21:36:23.695266 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4672fe0-7762-447b-a91a-d553d445ed72" path="/var/lib/kubelet/pods/e4672fe0-7762-447b-a91a-d553d445ed72/volumes" Dec 05 21:36:29 crc kubenswrapper[4904]: I1205 21:36:29.955547 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:36:29 crc kubenswrapper[4904]: I1205 21:36:29.956332 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:36:29 crc kubenswrapper[4904]: I1205 21:36:29.956424 4904 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" Dec 05 21:36:29 crc kubenswrapper[4904]: I1205 21:36:29.957689 4904 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f7d7f09e13152e0a4faedadfe2a18bad5ec893c3b518c6efca98f32916ef57b3"} pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 21:36:29 crc kubenswrapper[4904]: I1205 21:36:29.957795 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" containerID="cri-o://f7d7f09e13152e0a4faedadfe2a18bad5ec893c3b518c6efca98f32916ef57b3" gracePeriod=600 Dec 05 21:36:31 crc kubenswrapper[4904]: I1205 21:36:31.531469 4904 generic.go:334] "Generic (PLEG): container finished" podID="1cc24b64-e25f-4b55-9123-295388685e7a" containerID="f7d7f09e13152e0a4faedadfe2a18bad5ec893c3b518c6efca98f32916ef57b3" exitCode=0 Dec 05 21:36:31 crc kubenswrapper[4904]: I1205 21:36:31.531545 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" event={"ID":"1cc24b64-e25f-4b55-9123-295388685e7a","Type":"ContainerDied","Data":"f7d7f09e13152e0a4faedadfe2a18bad5ec893c3b518c6efca98f32916ef57b3"} Dec 05 21:36:31 crc kubenswrapper[4904]: I1205 21:36:31.531956 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" event={"ID":"1cc24b64-e25f-4b55-9123-295388685e7a","Type":"ContainerStarted","Data":"d64969e50828f264edfbf9501a74b46640a7872d24c01a106535bc35fc4380f8"} Dec 05 21:36:31 crc kubenswrapper[4904]: I1205 21:36:31.532004 4904 scope.go:117] "RemoveContainer" 
containerID="94e67afcde674d544e475a7bc0d8ae7f1ed2a8f2366106ad1f5c07bab7395f4a" Dec 05 21:37:56 crc kubenswrapper[4904]: I1205 21:37:56.575397 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6qxpc"] Dec 05 21:37:56 crc kubenswrapper[4904]: E1205 21:37:56.576480 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4672fe0-7762-447b-a91a-d553d445ed72" containerName="extract-utilities" Dec 05 21:37:56 crc kubenswrapper[4904]: I1205 21:37:56.576500 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4672fe0-7762-447b-a91a-d553d445ed72" containerName="extract-utilities" Dec 05 21:37:56 crc kubenswrapper[4904]: E1205 21:37:56.576520 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4672fe0-7762-447b-a91a-d553d445ed72" containerName="extract-content" Dec 05 21:37:56 crc kubenswrapper[4904]: I1205 21:37:56.576526 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4672fe0-7762-447b-a91a-d553d445ed72" containerName="extract-content" Dec 05 21:37:56 crc kubenswrapper[4904]: E1205 21:37:56.576535 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4672fe0-7762-447b-a91a-d553d445ed72" containerName="registry-server" Dec 05 21:37:56 crc kubenswrapper[4904]: I1205 21:37:56.576541 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4672fe0-7762-447b-a91a-d553d445ed72" containerName="registry-server" Dec 05 21:37:56 crc kubenswrapper[4904]: I1205 21:37:56.576776 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4672fe0-7762-447b-a91a-d553d445ed72" containerName="registry-server" Dec 05 21:37:56 crc kubenswrapper[4904]: I1205 21:37:56.578510 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6qxpc" Dec 05 21:37:56 crc kubenswrapper[4904]: I1205 21:37:56.608591 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6qxpc"] Dec 05 21:37:56 crc kubenswrapper[4904]: I1205 21:37:56.724865 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c98e2f16-2c17-4558-aa14-051c2f355f09-catalog-content\") pod \"redhat-operators-6qxpc\" (UID: \"c98e2f16-2c17-4558-aa14-051c2f355f09\") " pod="openshift-marketplace/redhat-operators-6qxpc" Dec 05 21:37:56 crc kubenswrapper[4904]: I1205 21:37:56.725022 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl4dm\" (UniqueName: \"kubernetes.io/projected/c98e2f16-2c17-4558-aa14-051c2f355f09-kube-api-access-fl4dm\") pod \"redhat-operators-6qxpc\" (UID: \"c98e2f16-2c17-4558-aa14-051c2f355f09\") " pod="openshift-marketplace/redhat-operators-6qxpc" Dec 05 21:37:56 crc kubenswrapper[4904]: I1205 21:37:56.725113 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c98e2f16-2c17-4558-aa14-051c2f355f09-utilities\") pod \"redhat-operators-6qxpc\" (UID: \"c98e2f16-2c17-4558-aa14-051c2f355f09\") " pod="openshift-marketplace/redhat-operators-6qxpc" Dec 05 21:37:56 crc kubenswrapper[4904]: I1205 21:37:56.828493 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl4dm\" (UniqueName: \"kubernetes.io/projected/c98e2f16-2c17-4558-aa14-051c2f355f09-kube-api-access-fl4dm\") pod \"redhat-operators-6qxpc\" (UID: 
\"c98e2f16-2c17-4558-aa14-051c2f355f09\") " pod="openshift-marketplace/redhat-operators-6qxpc" Dec 05 21:37:56 crc kubenswrapper[4904]: I1205 21:37:56.828632 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c98e2f16-2c17-4558-aa14-051c2f355f09-utilities\") pod \"redhat-operators-6qxpc\" (UID: \"c98e2f16-2c17-4558-aa14-051c2f355f09\") " pod="openshift-marketplace/redhat-operators-6qxpc" Dec 05 21:37:56 crc kubenswrapper[4904]: I1205 21:37:56.828865 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c98e2f16-2c17-4558-aa14-051c2f355f09-catalog-content\") pod \"redhat-operators-6qxpc\" (UID: \"c98e2f16-2c17-4558-aa14-051c2f355f09\") " pod="openshift-marketplace/redhat-operators-6qxpc" Dec 05 21:37:56 crc kubenswrapper[4904]: I1205 21:37:56.829232 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c98e2f16-2c17-4558-aa14-051c2f355f09-utilities\") pod \"redhat-operators-6qxpc\" (UID: \"c98e2f16-2c17-4558-aa14-051c2f355f09\") " pod="openshift-marketplace/redhat-operators-6qxpc" Dec 05 21:37:56 crc kubenswrapper[4904]: I1205 21:37:56.829447 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c98e2f16-2c17-4558-aa14-051c2f355f09-catalog-content\") pod \"redhat-operators-6qxpc\" (UID: \"c98e2f16-2c17-4558-aa14-051c2f355f09\") " pod="openshift-marketplace/redhat-operators-6qxpc" Dec 05 21:37:56 crc kubenswrapper[4904]: I1205 21:37:56.847572 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl4dm\" (UniqueName: \"kubernetes.io/projected/c98e2f16-2c17-4558-aa14-051c2f355f09-kube-api-access-fl4dm\") pod \"redhat-operators-6qxpc\" (UID: \"c98e2f16-2c17-4558-aa14-051c2f355f09\") " pod="openshift-marketplace/redhat-operators-6qxpc" Dec 05 21:37:56 crc kubenswrapper[4904]: I1205 21:37:56.899814 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6qxpc" Dec 05 21:37:57 crc kubenswrapper[4904]: I1205 21:37:57.446899 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6qxpc"] Dec 05 21:37:57 crc kubenswrapper[4904]: I1205 21:37:57.566445 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-npvt2"] Dec 05 21:37:57 crc kubenswrapper[4904]: I1205 21:37:57.568972 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-npvt2" Dec 05 21:37:57 crc kubenswrapper[4904]: I1205 21:37:57.585162 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-npvt2"] Dec 05 21:37:57 crc kubenswrapper[4904]: I1205 21:37:57.602039 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a2c0f26-6730-4fff-84fb-d55695d15906-utilities\") pod \"community-operators-npvt2\" (UID: \"7a2c0f26-6730-4fff-84fb-d55695d15906\") " pod="openshift-marketplace/community-operators-npvt2" Dec 05 21:37:57 crc kubenswrapper[4904]: I1205 21:37:57.602174 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fpx8\" (UniqueName: \"kubernetes.io/projected/7a2c0f26-6730-4fff-84fb-d55695d15906-kube-api-access-4fpx8\") pod \"community-operators-npvt2\" (UID: \"7a2c0f26-6730-4fff-84fb-d55695d15906\") " pod="openshift-marketplace/community-operators-npvt2" Dec 05 21:37:57 crc kubenswrapper[4904]: I1205 21:37:57.602211 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a2c0f26-6730-4fff-84fb-d55695d15906-catalog-content\") pod \"community-operators-npvt2\" (UID: \"7a2c0f26-6730-4fff-84fb-d55695d15906\") " pod="openshift-marketplace/community-operators-npvt2" Dec 05 21:37:57 crc kubenswrapper[4904]: I1205 21:37:57.726576 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a2c0f26-6730-4fff-84fb-d55695d15906-utilities\") pod \"community-operators-npvt2\" (UID: \"7a2c0f26-6730-4fff-84fb-d55695d15906\") " pod="openshift-marketplace/community-operators-npvt2" Dec 05 21:37:57 crc kubenswrapper[4904]: I1205 21:37:57.726675 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fpx8\" (UniqueName: \"kubernetes.io/projected/7a2c0f26-6730-4fff-84fb-d55695d15906-kube-api-access-4fpx8\") pod \"community-operators-npvt2\" (UID: \"7a2c0f26-6730-4fff-84fb-d55695d15906\") " pod="openshift-marketplace/community-operators-npvt2" Dec 05 21:37:57 crc kubenswrapper[4904]: I1205 21:37:57.726725 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a2c0f26-6730-4fff-84fb-d55695d15906-catalog-content\") pod \"community-operators-npvt2\" (UID: \"7a2c0f26-6730-4fff-84fb-d55695d15906\") " pod="openshift-marketplace/community-operators-npvt2" Dec 05 21:37:57 crc kubenswrapper[4904]: I1205 21:37:57.727542 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a2c0f26-6730-4fff-84fb-d55695d15906-catalog-content\") pod \"community-operators-npvt2\" (UID: \"7a2c0f26-6730-4fff-84fb-d55695d15906\") " pod="openshift-marketplace/community-operators-npvt2" Dec 05 21:37:57 crc kubenswrapper[4904]: I1205 21:37:57.727777 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a2c0f26-6730-4fff-84fb-d55695d15906-utilities\") pod \"community-operators-npvt2\" (UID: \"7a2c0f26-6730-4fff-84fb-d55695d15906\") " pod="openshift-marketplace/community-operators-npvt2" Dec 05 21:37:57 crc kubenswrapper[4904]: I1205 21:37:57.754829 4904 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-4fpx8\" (UniqueName: \"kubernetes.io/projected/7a2c0f26-6730-4fff-84fb-d55695d15906-kube-api-access-4fpx8\") pod \"community-operators-npvt2\" (UID: \"7a2c0f26-6730-4fff-84fb-d55695d15906\") " pod="openshift-marketplace/community-operators-npvt2" Dec 05 21:37:57 crc kubenswrapper[4904]: I1205 21:37:57.919931 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-npvt2" Dec 05 21:37:58 crc kubenswrapper[4904]: I1205 21:37:58.356911 4904 generic.go:334] "Generic (PLEG): container finished" podID="c98e2f16-2c17-4558-aa14-051c2f355f09" containerID="fdda22e08c29dd3d9788e32ec063da6b0654e9486a75b75e5898ab85458599ab" exitCode=0 Dec 05 21:37:58 crc kubenswrapper[4904]: I1205 21:37:58.356951 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6qxpc" event={"ID":"c98e2f16-2c17-4558-aa14-051c2f355f09","Type":"ContainerDied","Data":"fdda22e08c29dd3d9788e32ec063da6b0654e9486a75b75e5898ab85458599ab"} Dec 05 21:37:58 crc kubenswrapper[4904]: I1205 21:37:58.356999 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6qxpc" event={"ID":"c98e2f16-2c17-4558-aa14-051c2f355f09","Type":"ContainerStarted","Data":"aeb5a96ecd452a744fd72db1ce7a536e9d8ff8010ed60ae819d3144ee714a081"} Dec 05 21:37:58 crc kubenswrapper[4904]: I1205 21:37:58.359561 4904 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 21:37:58 crc kubenswrapper[4904]: I1205 21:37:58.477870 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-npvt2"] Dec 05 21:37:59 crc kubenswrapper[4904]: I1205 21:37:59.368224 4904 generic.go:334] "Generic (PLEG): container finished" podID="7a2c0f26-6730-4fff-84fb-d55695d15906" containerID="265f6c886c336e7b465bd64165464a82260aedc2091217dd0dcfa6977da52d4f" exitCode=0 Dec 05 21:37:59 crc kubenswrapper[4904]: I1205 21:37:59.368277 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-npvt2" event={"ID":"7a2c0f26-6730-4fff-84fb-d55695d15906","Type":"ContainerDied","Data":"265f6c886c336e7b465bd64165464a82260aedc2091217dd0dcfa6977da52d4f"} Dec 05 21:37:59 crc kubenswrapper[4904]: I1205 21:37:59.368590 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-npvt2" event={"ID":"7a2c0f26-6730-4fff-84fb-d55695d15906","Type":"ContainerStarted","Data":"bb97a49ac3e2157e796015a48f1c3e4a5f40104f7461507c426482962af700e1"} Dec 05 21:37:59 crc kubenswrapper[4904]: I1205 21:37:59.371600 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6qxpc" event={"ID":"c98e2f16-2c17-4558-aa14-051c2f355f09","Type":"ContainerStarted","Data":"7c38ab17b37c4cef209e2e7c981d1fa474b39488de26d06e5eb3eff844a3ec19"} Dec 05 21:38:01 crc kubenswrapper[4904]: I1205 21:38:01.392620 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-npvt2" event={"ID":"7a2c0f26-6730-4fff-84fb-d55695d15906","Type":"ContainerStarted","Data":"0f15f100a5bfae758c5fcc906e346a15e9a2a60147f7367b58652637907146a4"} Dec 05 21:38:03 crc kubenswrapper[4904]: I1205 21:38:03.419322 4904 generic.go:334] "Generic (PLEG): container finished" podID="c98e2f16-2c17-4558-aa14-051c2f355f09" containerID="7c38ab17b37c4cef209e2e7c981d1fa474b39488de26d06e5eb3eff844a3ec19" exitCode=0 Dec 05 21:38:03 crc 
kubenswrapper[4904]: I1205 21:38:03.419547 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6qxpc" event={"ID":"c98e2f16-2c17-4558-aa14-051c2f355f09","Type":"ContainerDied","Data":"7c38ab17b37c4cef209e2e7c981d1fa474b39488de26d06e5eb3eff844a3ec19"} Dec 05 21:38:03 crc kubenswrapper[4904]: I1205 21:38:03.423334 4904 generic.go:334] "Generic (PLEG): container finished" podID="7a2c0f26-6730-4fff-84fb-d55695d15906" containerID="0f15f100a5bfae758c5fcc906e346a15e9a2a60147f7367b58652637907146a4" exitCode=0 Dec 05 21:38:03 crc kubenswrapper[4904]: I1205 21:38:03.423380 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-npvt2" event={"ID":"7a2c0f26-6730-4fff-84fb-d55695d15906","Type":"ContainerDied","Data":"0f15f100a5bfae758c5fcc906e346a15e9a2a60147f7367b58652637907146a4"} Dec 05 21:38:04 crc kubenswrapper[4904]: I1205 21:38:04.434950 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-npvt2" event={"ID":"7a2c0f26-6730-4fff-84fb-d55695d15906","Type":"ContainerStarted","Data":"1a7a94268dbe16c9585c40a46d7da89d368a1a93943042df8e4226536e9248ce"} Dec 05 21:38:04 crc kubenswrapper[4904]: I1205 21:38:04.437303 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6qxpc" event={"ID":"c98e2f16-2c17-4558-aa14-051c2f355f09","Type":"ContainerStarted","Data":"66edf0f7ba96297f39d1db74ae2066411a992a48a6dc61e257db1871b12b7182"} Dec 05 21:38:04 crc kubenswrapper[4904]: I1205 21:38:04.462712 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-npvt2" podStartSLOduration=2.926516569 podStartE2EDuration="7.462681671s" podCreationTimestamp="2025-12-05 21:37:57 +0000 UTC" firstStartedPulling="2025-12-05 21:37:59.370618624 +0000 UTC m=+5178.181834743" lastFinishedPulling="2025-12-05 21:38:03.906783736 +0000 UTC m=+5182.717999845" observedRunningTime="2025-12-05 21:38:04.455770352 +0000 UTC m=+5183.266986471" watchObservedRunningTime="2025-12-05 21:38:04.462681671 +0000 UTC m=+5183.273897780" Dec 05 21:38:04 crc kubenswrapper[4904]: I1205 21:38:04.475895 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6qxpc" podStartSLOduration=3.002824691 podStartE2EDuration="8.475874581s" podCreationTimestamp="2025-12-05 21:37:56 +0000 UTC" firstStartedPulling="2025-12-05 21:37:58.359351957 +0000 UTC m=+5177.170568066" lastFinishedPulling="2025-12-05 21:38:03.832401837 +0000 UTC m=+5182.643617956" observedRunningTime="2025-12-05 21:38:04.473035774 +0000 UTC m=+5183.284251913" watchObservedRunningTime="2025-12-05 21:38:04.475874581 +0000 UTC m=+5183.287090700" Dec 05 21:38:06 crc kubenswrapper[4904]: I1205 21:38:06.900355 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6qxpc" Dec 05 21:38:06 crc kubenswrapper[4904]: I1205 21:38:06.900949 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6qxpc" Dec 05 21:38:07 crc kubenswrapper[4904]: I1205 21:38:07.920817 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-npvt2" Dec 05 21:38:07 crc kubenswrapper[4904]: I1205 21:38:07.921163 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-npvt2" Dec 05 21:38:07 
crc kubenswrapper[4904]: I1205 21:38:07.990185 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-npvt2" Dec 05 21:38:08 crc kubenswrapper[4904]: I1205 21:38:08.034995 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6qxpc" podUID="c98e2f16-2c17-4558-aa14-051c2f355f09" containerName="registry-server" probeResult="failure" output=< Dec 05 21:38:08 crc kubenswrapper[4904]: timeout: failed to connect service ":50051" within 1s Dec 05 21:38:08 crc kubenswrapper[4904]: > Dec 05 21:38:16 crc kubenswrapper[4904]: I1205 21:38:16.947802 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6qxpc" Dec 05 21:38:17 crc kubenswrapper[4904]: I1205 21:38:17.008505 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6qxpc" Dec 05 21:38:17 crc kubenswrapper[4904]: I1205 21:38:17.204784 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6qxpc"] Dec 05 21:38:18 crc kubenswrapper[4904]: I1205 21:38:18.340210 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-npvt2" Dec 05 21:38:18 crc kubenswrapper[4904]: I1205 21:38:18.571324 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6qxpc" podUID="c98e2f16-2c17-4558-aa14-051c2f355f09" containerName="registry-server" containerID="cri-o://66edf0f7ba96297f39d1db74ae2066411a992a48a6dc61e257db1871b12b7182" gracePeriod=2 Dec 05 21:38:19 crc kubenswrapper[4904]: I1205 21:38:19.035575 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6qxpc" Dec 05 21:38:19 crc kubenswrapper[4904]: I1205 21:38:19.174231 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fl4dm\" (UniqueName: \"kubernetes.io/projected/c98e2f16-2c17-4558-aa14-051c2f355f09-kube-api-access-fl4dm\") pod \"c98e2f16-2c17-4558-aa14-051c2f355f09\" (UID: \"c98e2f16-2c17-4558-aa14-051c2f355f09\") " Dec 05 21:38:19 crc kubenswrapper[4904]: I1205 21:38:19.174673 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c98e2f16-2c17-4558-aa14-051c2f355f09-utilities\") pod \"c98e2f16-2c17-4558-aa14-051c2f355f09\" (UID: \"c98e2f16-2c17-4558-aa14-051c2f355f09\") " Dec 05 21:38:19 crc kubenswrapper[4904]: I1205 21:38:19.174842 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c98e2f16-2c17-4558-aa14-051c2f355f09-catalog-content\") pod \"c98e2f16-2c17-4558-aa14-051c2f355f09\" (UID: \"c98e2f16-2c17-4558-aa14-051c2f355f09\") " Dec 05 21:38:19 crc kubenswrapper[4904]: I1205 21:38:19.175564 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c98e2f16-2c17-4558-aa14-051c2f355f09-utilities" (OuterVolumeSpecName: "utilities") pod "c98e2f16-2c17-4558-aa14-051c2f355f09" (UID: "c98e2f16-2c17-4558-aa14-051c2f355f09"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:38:19 crc kubenswrapper[4904]: I1205 21:38:19.180986 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c98e2f16-2c17-4558-aa14-051c2f355f09-kube-api-access-fl4dm" (OuterVolumeSpecName: "kube-api-access-fl4dm") pod "c98e2f16-2c17-4558-aa14-051c2f355f09" (UID: "c98e2f16-2c17-4558-aa14-051c2f355f09"). InnerVolumeSpecName "kube-api-access-fl4dm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:38:19 crc kubenswrapper[4904]: I1205 21:38:19.276788 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fl4dm\" (UniqueName: \"kubernetes.io/projected/c98e2f16-2c17-4558-aa14-051c2f355f09-kube-api-access-fl4dm\") on node \"crc\" DevicePath \"\"" Dec 05 21:38:19 crc kubenswrapper[4904]: I1205 21:38:19.276823 4904 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c98e2f16-2c17-4558-aa14-051c2f355f09-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 21:38:19 crc kubenswrapper[4904]: I1205 21:38:19.294510 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c98e2f16-2c17-4558-aa14-051c2f355f09-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c98e2f16-2c17-4558-aa14-051c2f355f09" (UID: "c98e2f16-2c17-4558-aa14-051c2f355f09"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:38:19 crc kubenswrapper[4904]: I1205 21:38:19.378051 4904 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c98e2f16-2c17-4558-aa14-051c2f355f09-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 21:38:19 crc kubenswrapper[4904]: I1205 21:38:19.582994 4904 generic.go:334] "Generic (PLEG): container finished" podID="c98e2f16-2c17-4558-aa14-051c2f355f09" containerID="66edf0f7ba96297f39d1db74ae2066411a992a48a6dc61e257db1871b12b7182" exitCode=0 Dec 05 21:38:19 crc kubenswrapper[4904]: I1205 21:38:19.583042 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6qxpc" event={"ID":"c98e2f16-2c17-4558-aa14-051c2f355f09","Type":"ContainerDied","Data":"66edf0f7ba96297f39d1db74ae2066411a992a48a6dc61e257db1871b12b7182"} Dec 05 21:38:19 crc kubenswrapper[4904]: I1205 21:38:19.583104 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6qxpc" event={"ID":"c98e2f16-2c17-4558-aa14-051c2f355f09","Type":"ContainerDied","Data":"aeb5a96ecd452a744fd72db1ce7a536e9d8ff8010ed60ae819d3144ee714a081"} Dec 05 21:38:19 crc kubenswrapper[4904]: I1205 21:38:19.583108 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6qxpc" Dec 05 21:38:19 crc kubenswrapper[4904]: I1205 21:38:19.583156 4904 scope.go:117] "RemoveContainer" containerID="66edf0f7ba96297f39d1db74ae2066411a992a48a6dc61e257db1871b12b7182" Dec 05 21:38:19 crc kubenswrapper[4904]: I1205 21:38:19.608182 4904 scope.go:117] "RemoveContainer" containerID="7c38ab17b37c4cef209e2e7c981d1fa474b39488de26d06e5eb3eff844a3ec19" Dec 05 21:38:19 crc kubenswrapper[4904]: I1205 21:38:19.624150 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6qxpc"] Dec 05 21:38:19 crc kubenswrapper[4904]: I1205 21:38:19.635222 4904 scope.go:117] "RemoveContainer" containerID="fdda22e08c29dd3d9788e32ec063da6b0654e9486a75b75e5898ab85458599ab" Dec 05 21:38:19 crc kubenswrapper[4904]: I1205 21:38:19.635294 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6qxpc"] Dec 05 21:38:19 crc kubenswrapper[4904]: I1205 21:38:19.692167 4904 scope.go:117] "RemoveContainer" containerID="66edf0f7ba96297f39d1db74ae2066411a992a48a6dc61e257db1871b12b7182" Dec 05 21:38:19 crc kubenswrapper[4904]: E1205 21:38:19.692622 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66edf0f7ba96297f39d1db74ae2066411a992a48a6dc61e257db1871b12b7182\": container with ID starting with 66edf0f7ba96297f39d1db74ae2066411a992a48a6dc61e257db1871b12b7182 not found: ID does not exist" containerID="66edf0f7ba96297f39d1db74ae2066411a992a48a6dc61e257db1871b12b7182" Dec 05 21:38:19 crc kubenswrapper[4904]: I1205 21:38:19.692699 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66edf0f7ba96297f39d1db74ae2066411a992a48a6dc61e257db1871b12b7182"} err="failed to get container status \"66edf0f7ba96297f39d1db74ae2066411a992a48a6dc61e257db1871b12b7182\": rpc error: code = NotFound desc = could not find container \"66edf0f7ba96297f39d1db74ae2066411a992a48a6dc61e257db1871b12b7182\": container with ID starting with 66edf0f7ba96297f39d1db74ae2066411a992a48a6dc61e257db1871b12b7182 not found: ID does not exist" Dec 05 21:38:19 crc kubenswrapper[4904]: I1205 21:38:19.692746 4904 scope.go:117] "RemoveContainer" containerID="7c38ab17b37c4cef209e2e7c981d1fa474b39488de26d06e5eb3eff844a3ec19" Dec 05 21:38:19 crc kubenswrapper[4904]: I1205 21:38:19.692818 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c98e2f16-2c17-4558-aa14-051c2f355f09" path="/var/lib/kubelet/pods/c98e2f16-2c17-4558-aa14-051c2f355f09/volumes" Dec 05 21:38:19 crc kubenswrapper[4904]: E1205 21:38:19.694115 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c38ab17b37c4cef209e2e7c981d1fa474b39488de26d06e5eb3eff844a3ec19\": container with ID starting with 7c38ab17b37c4cef209e2e7c981d1fa474b39488de26d06e5eb3eff844a3ec19 not found: ID does not exist" containerID="7c38ab17b37c4cef209e2e7c981d1fa474b39488de26d06e5eb3eff844a3ec19" Dec 05 21:38:19 crc kubenswrapper[4904]: I1205 21:38:19.694145 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c38ab17b37c4cef209e2e7c981d1fa474b39488de26d06e5eb3eff844a3ec19"} err="failed to get container status \"7c38ab17b37c4cef209e2e7c981d1fa474b39488de26d06e5eb3eff844a3ec19\": rpc error: code = NotFound desc = could not find container \"7c38ab17b37c4cef209e2e7c981d1fa474b39488de26d06e5eb3eff844a3ec19\": 
container with ID starting with 7c38ab17b37c4cef209e2e7c981d1fa474b39488de26d06e5eb3eff844a3ec19 not found: ID does not exist" Dec 05 21:38:19 crc kubenswrapper[4904]: I1205 21:38:19.694167 4904 scope.go:117] "RemoveContainer" containerID="fdda22e08c29dd3d9788e32ec063da6b0654e9486a75b75e5898ab85458599ab" Dec 05 21:38:19 crc kubenswrapper[4904]: E1205 21:38:19.694453 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdda22e08c29dd3d9788e32ec063da6b0654e9486a75b75e5898ab85458599ab\": container with ID starting with fdda22e08c29dd3d9788e32ec063da6b0654e9486a75b75e5898ab85458599ab not found: ID does not exist" containerID="fdda22e08c29dd3d9788e32ec063da6b0654e9486a75b75e5898ab85458599ab" Dec 05 21:38:19 crc kubenswrapper[4904]: I1205 21:38:19.694479 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdda22e08c29dd3d9788e32ec063da6b0654e9486a75b75e5898ab85458599ab"} err="failed to get container status \"fdda22e08c29dd3d9788e32ec063da6b0654e9486a75b75e5898ab85458599ab\": rpc error: code = NotFound desc = could not find container \"fdda22e08c29dd3d9788e32ec063da6b0654e9486a75b75e5898ab85458599ab\": container with ID starting with fdda22e08c29dd3d9788e32ec063da6b0654e9486a75b75e5898ab85458599ab not found: ID does not exist" Dec 05 21:38:20 crc kubenswrapper[4904]: I1205 21:38:20.608584 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-npvt2"] Dec 05 21:38:20 crc kubenswrapper[4904]: I1205 21:38:20.609426 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-npvt2" podUID="7a2c0f26-6730-4fff-84fb-d55695d15906" containerName="registry-server" containerID="cri-o://1a7a94268dbe16c9585c40a46d7da89d368a1a93943042df8e4226536e9248ce" gracePeriod=2 Dec 05 21:38:21 crc kubenswrapper[4904]: I1205 21:38:21.148764 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-npvt2" Dec 05 21:38:21 crc kubenswrapper[4904]: I1205 21:38:21.321655 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fpx8\" (UniqueName: \"kubernetes.io/projected/7a2c0f26-6730-4fff-84fb-d55695d15906-kube-api-access-4fpx8\") pod \"7a2c0f26-6730-4fff-84fb-d55695d15906\" (UID: \"7a2c0f26-6730-4fff-84fb-d55695d15906\") " Dec 05 21:38:21 crc kubenswrapper[4904]: I1205 21:38:21.321878 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a2c0f26-6730-4fff-84fb-d55695d15906-utilities\") pod \"7a2c0f26-6730-4fff-84fb-d55695d15906\" (UID: \"7a2c0f26-6730-4fff-84fb-d55695d15906\") " Dec 05 21:38:21 crc kubenswrapper[4904]: I1205 21:38:21.321958 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a2c0f26-6730-4fff-84fb-d55695d15906-catalog-content\") pod \"7a2c0f26-6730-4fff-84fb-d55695d15906\" (UID: \"7a2c0f26-6730-4fff-84fb-d55695d15906\") " Dec 05 21:38:21 crc kubenswrapper[4904]: I1205 21:38:21.322671 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a2c0f26-6730-4fff-84fb-d55695d15906-utilities" (OuterVolumeSpecName: "utilities") pod "7a2c0f26-6730-4fff-84fb-d55695d15906" (UID: "7a2c0f26-6730-4fff-84fb-d55695d15906"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:38:21 crc kubenswrapper[4904]: I1205 21:38:21.328340 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a2c0f26-6730-4fff-84fb-d55695d15906-kube-api-access-4fpx8" (OuterVolumeSpecName: "kube-api-access-4fpx8") pod "7a2c0f26-6730-4fff-84fb-d55695d15906" (UID: "7a2c0f26-6730-4fff-84fb-d55695d15906"). InnerVolumeSpecName "kube-api-access-4fpx8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:38:21 crc kubenswrapper[4904]: I1205 21:38:21.384657 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a2c0f26-6730-4fff-84fb-d55695d15906-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7a2c0f26-6730-4fff-84fb-d55695d15906" (UID: "7a2c0f26-6730-4fff-84fb-d55695d15906"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:38:21 crc kubenswrapper[4904]: I1205 21:38:21.424645 4904 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a2c0f26-6730-4fff-84fb-d55695d15906-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 21:38:21 crc kubenswrapper[4904]: I1205 21:38:21.424686 4904 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a2c0f26-6730-4fff-84fb-d55695d15906-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 21:38:21 crc kubenswrapper[4904]: I1205 21:38:21.424702 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fpx8\" (UniqueName: \"kubernetes.io/projected/7a2c0f26-6730-4fff-84fb-d55695d15906-kube-api-access-4fpx8\") on node \"crc\" DevicePath \"\"" Dec 05 21:38:21 crc kubenswrapper[4904]: I1205 21:38:21.629105 4904 generic.go:334] "Generic (PLEG): container finished" podID="7a2c0f26-6730-4fff-84fb-d55695d15906" containerID="1a7a94268dbe16c9585c40a46d7da89d368a1a93943042df8e4226536e9248ce" exitCode=0 Dec 05 21:38:21 crc kubenswrapper[4904]: I1205 21:38:21.629150 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-npvt2" event={"ID":"7a2c0f26-6730-4fff-84fb-d55695d15906","Type":"ContainerDied","Data":"1a7a94268dbe16c9585c40a46d7da89d368a1a93943042df8e4226536e9248ce"} Dec 05 21:38:21 crc kubenswrapper[4904]: I1205 21:38:21.629187 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-npvt2" event={"ID":"7a2c0f26-6730-4fff-84fb-d55695d15906","Type":"ContainerDied","Data":"bb97a49ac3e2157e796015a48f1c3e4a5f40104f7461507c426482962af700e1"} Dec 05 21:38:21 crc kubenswrapper[4904]: I1205 21:38:21.629206 4904 scope.go:117] "RemoveContainer" containerID="1a7a94268dbe16c9585c40a46d7da89d368a1a93943042df8e4226536e9248ce" Dec 05 21:38:21 crc kubenswrapper[4904]: I1205 21:38:21.629132 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-npvt2" Dec 05 21:38:21 crc kubenswrapper[4904]: I1205 21:38:21.662787 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-npvt2"] Dec 05 21:38:21 crc kubenswrapper[4904]: I1205 21:38:21.664223 4904 scope.go:117] "RemoveContainer" containerID="0f15f100a5bfae758c5fcc906e346a15e9a2a60147f7367b58652637907146a4" Dec 05 21:38:21 crc kubenswrapper[4904]: I1205 21:38:21.685681 4904 scope.go:117] "RemoveContainer" containerID="265f6c886c336e7b465bd64165464a82260aedc2091217dd0dcfa6977da52d4f" Dec 05 21:38:21 crc kubenswrapper[4904]: I1205 21:38:21.701806 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-npvt2"] Dec 05 21:38:21 crc kubenswrapper[4904]: I1205 21:38:21.760329 4904 scope.go:117] "RemoveContainer" containerID="1a7a94268dbe16c9585c40a46d7da89d368a1a93943042df8e4226536e9248ce" Dec 05 21:38:21 crc kubenswrapper[4904]: E1205 21:38:21.763344 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a7a94268dbe16c9585c40a46d7da89d368a1a93943042df8e4226536e9248ce\": container with ID starting with 1a7a94268dbe16c9585c40a46d7da89d368a1a93943042df8e4226536e9248ce not found: ID does not exist" containerID="1a7a94268dbe16c9585c40a46d7da89d368a1a93943042df8e4226536e9248ce" Dec 05 21:38:21 crc kubenswrapper[4904]: I1205 21:38:21.763388 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a7a94268dbe16c9585c40a46d7da89d368a1a93943042df8e4226536e9248ce"} err="failed to get container status \"1a7a94268dbe16c9585c40a46d7da89d368a1a93943042df8e4226536e9248ce\": rpc error: code = NotFound desc = could not find container \"1a7a94268dbe16c9585c40a46d7da89d368a1a93943042df8e4226536e9248ce\": container with ID starting with 1a7a94268dbe16c9585c40a46d7da89d368a1a93943042df8e4226536e9248ce not found: ID does not exist" Dec 05 21:38:21 crc kubenswrapper[4904]: I1205 21:38:21.763420 4904 scope.go:117] "RemoveContainer" containerID="0f15f100a5bfae758c5fcc906e346a15e9a2a60147f7367b58652637907146a4" Dec 05 21:38:21 crc kubenswrapper[4904]: E1205 21:38:21.763887 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f15f100a5bfae758c5fcc906e346a15e9a2a60147f7367b58652637907146a4\": container with ID starting with 0f15f100a5bfae758c5fcc906e346a15e9a2a60147f7367b58652637907146a4 not found: ID does not exist" containerID="0f15f100a5bfae758c5fcc906e346a15e9a2a60147f7367b58652637907146a4" Dec 05 21:38:21 crc kubenswrapper[4904]: I1205 21:38:21.763996 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f15f100a5bfae758c5fcc906e346a15e9a2a60147f7367b58652637907146a4"} err="failed to get container status \"0f15f100a5bfae758c5fcc906e346a15e9a2a60147f7367b58652637907146a4\": rpc error: code = NotFound desc = could not find container \"0f15f100a5bfae758c5fcc906e346a15e9a2a60147f7367b58652637907146a4\": container with ID starting with 0f15f100a5bfae758c5fcc906e346a15e9a2a60147f7367b58652637907146a4 not found: ID does not exist" Dec 05 21:38:21 crc kubenswrapper[4904]: I1205 21:38:21.764076 4904 scope.go:117] "RemoveContainer" containerID="265f6c886c336e7b465bd64165464a82260aedc2091217dd0dcfa6977da52d4f" Dec 05 21:38:21 crc kubenswrapper[4904]: E1205 21:38:21.764415 4904 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"265f6c886c336e7b465bd64165464a82260aedc2091217dd0dcfa6977da52d4f\": container with ID starting with 265f6c886c336e7b465bd64165464a82260aedc2091217dd0dcfa6977da52d4f not found: ID does not exist" containerID="265f6c886c336e7b465bd64165464a82260aedc2091217dd0dcfa6977da52d4f" Dec 05 21:38:21 crc kubenswrapper[4904]: I1205 21:38:21.764457 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"265f6c886c336e7b465bd64165464a82260aedc2091217dd0dcfa6977da52d4f"} err="failed to get container status \"265f6c886c336e7b465bd64165464a82260aedc2091217dd0dcfa6977da52d4f\": rpc error: code = NotFound desc = could not find container \"265f6c886c336e7b465bd64165464a82260aedc2091217dd0dcfa6977da52d4f\": container with ID starting with 265f6c886c336e7b465bd64165464a82260aedc2091217dd0dcfa6977da52d4f not found: ID does not exist" Dec 05 21:38:23 crc kubenswrapper[4904]: I1205 21:38:23.694983 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a2c0f26-6730-4fff-84fb-d55695d15906" path="/var/lib/kubelet/pods/7a2c0f26-6730-4fff-84fb-d55695d15906/volumes" Dec 05 21:38:59 crc kubenswrapper[4904]: I1205 21:38:59.955795 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:38:59 crc kubenswrapper[4904]: I1205 21:38:59.956454 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:39:29 crc kubenswrapper[4904]: I1205 21:39:29.955560 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:39:29 crc kubenswrapper[4904]: I1205 21:39:29.956011 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:39:59 crc kubenswrapper[4904]: I1205 21:39:59.956286 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:39:59 crc kubenswrapper[4904]: I1205 21:39:59.956920 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:39:59 crc kubenswrapper[4904]: I1205 21:39:59.956974 4904 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" Dec 05 21:39:59 crc kubenswrapper[4904]: I1205 21:39:59.957811 4904 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d64969e50828f264edfbf9501a74b46640a7872d24c01a106535bc35fc4380f8"} pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 21:39:59 crc kubenswrapper[4904]: I1205 21:39:59.957876 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" containerID="cri-o://d64969e50828f264edfbf9501a74b46640a7872d24c01a106535bc35fc4380f8" gracePeriod=600 Dec 05 21:40:00 crc kubenswrapper[4904]: E1205 21:40:00.581027 4904 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1cc24b64_e25f_4b55_9123_295388685e7a.slice/crio-conmon-d64969e50828f264edfbf9501a74b46640a7872d24c01a106535bc35fc4380f8.scope\": RecentStats: unable to find data in memory cache]" Dec 05 21:40:00 crc kubenswrapper[4904]: I1205 21:40:00.602318 4904 generic.go:334] "Generic (PLEG): container finished" podID="1cc24b64-e25f-4b55-9123-295388685e7a" containerID="d64969e50828f264edfbf9501a74b46640a7872d24c01a106535bc35fc4380f8" exitCode=0 Dec 05 21:40:00 crc kubenswrapper[4904]: I1205 21:40:00.602368 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" event={"ID":"1cc24b64-e25f-4b55-9123-295388685e7a","Type":"ContainerDied","Data":"d64969e50828f264edfbf9501a74b46640a7872d24c01a106535bc35fc4380f8"} Dec 05 21:40:00 crc kubenswrapper[4904]: I1205 21:40:00.602501 4904 scope.go:117] "RemoveContainer" containerID="f7d7f09e13152e0a4faedadfe2a18bad5ec893c3b518c6efca98f32916ef57b3" Dec 05 21:40:00 crc kubenswrapper[4904]: E1205 21:40:00.920161 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:40:01 crc kubenswrapper[4904]: I1205 21:40:01.626290 4904 scope.go:117] "RemoveContainer" containerID="d64969e50828f264edfbf9501a74b46640a7872d24c01a106535bc35fc4380f8" Dec 05 21:40:01 crc kubenswrapper[4904]: E1205 21:40:01.626611 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:40:15 crc kubenswrapper[4904]: I1205 21:40:15.681910 4904 scope.go:117] "RemoveContainer" containerID="d64969e50828f264edfbf9501a74b46640a7872d24c01a106535bc35fc4380f8" Dec 05 21:40:15 crc kubenswrapper[4904]: E1205 21:40:15.682649 4904 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:40:28 crc kubenswrapper[4904]: I1205 21:40:28.682569 4904 scope.go:117] "RemoveContainer" containerID="d64969e50828f264edfbf9501a74b46640a7872d24c01a106535bc35fc4380f8" Dec 05 21:40:28 crc kubenswrapper[4904]: E1205 21:40:28.683720 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:40:43 crc kubenswrapper[4904]: I1205 21:40:43.682705 4904 scope.go:117] "RemoveContainer" containerID="d64969e50828f264edfbf9501a74b46640a7872d24c01a106535bc35fc4380f8" Dec 05 21:40:43 crc kubenswrapper[4904]: E1205 21:40:43.684018 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:40:57 crc kubenswrapper[4904]: I1205 21:40:57.681442 4904 scope.go:117] "RemoveContainer" containerID="d64969e50828f264edfbf9501a74b46640a7872d24c01a106535bc35fc4380f8" Dec 05 21:40:57 crc kubenswrapper[4904]: E1205 21:40:57.682200 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:41:08 crc kubenswrapper[4904]: I1205 21:41:08.681635 4904 scope.go:117] "RemoveContainer" containerID="d64969e50828f264edfbf9501a74b46640a7872d24c01a106535bc35fc4380f8" Dec 05 21:41:08 crc kubenswrapper[4904]: E1205 21:41:08.684799 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:41:23 crc kubenswrapper[4904]: I1205 21:41:23.681286 4904 scope.go:117] "RemoveContainer" containerID="d64969e50828f264edfbf9501a74b46640a7872d24c01a106535bc35fc4380f8" Dec 05 21:41:23 crc kubenswrapper[4904]: E1205 21:41:23.682248 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:41:38 crc kubenswrapper[4904]: I1205 21:41:38.681523 4904 scope.go:117] "RemoveContainer" containerID="d64969e50828f264edfbf9501a74b46640a7872d24c01a106535bc35fc4380f8" Dec 05 21:41:38 crc kubenswrapper[4904]: E1205 21:41:38.682365 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:41:51 crc kubenswrapper[4904]: I1205 21:41:51.688265 4904 scope.go:117] "RemoveContainer" containerID="d64969e50828f264edfbf9501a74b46640a7872d24c01a106535bc35fc4380f8" Dec 05 21:41:51 crc kubenswrapper[4904]: E1205 21:41:51.689148 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:42:04 crc kubenswrapper[4904]: I1205 21:42:04.682668 4904 scope.go:117] "RemoveContainer" containerID="d64969e50828f264edfbf9501a74b46640a7872d24c01a106535bc35fc4380f8" Dec 05 21:42:04 crc kubenswrapper[4904]: E1205 21:42:04.683642 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:42:19 crc kubenswrapper[4904]: I1205 21:42:19.682133 4904 scope.go:117] "RemoveContainer" containerID="d64969e50828f264edfbf9501a74b46640a7872d24c01a106535bc35fc4380f8" Dec 05 21:42:19 crc kubenswrapper[4904]: E1205 21:42:19.682848 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:42:21 crc kubenswrapper[4904]: I1205 21:42:21.442171 4904 scope.go:117] "RemoveContainer" containerID="bf3663ce4e7106d3ab1fc515bd63bf3e211850dd53bd8e5a588e333e5a700a5a" Dec 05 21:42:21 crc kubenswrapper[4904]: I1205 21:42:21.477297 4904 scope.go:117] "RemoveContainer" containerID="d3f5f33d02337e01145560b52232e4be3f1b715b28e27a29ba158644acd438f0" Dec 05 21:42:21 crc kubenswrapper[4904]: I1205 21:42:21.542598 4904 scope.go:117] "RemoveContainer" containerID="e59dfada78b07f3846d6f400e28205242a09829b6bb44f4133eae0f425aa38b9" Dec 05 21:42:33 crc kubenswrapper[4904]: I1205 21:42:33.681974 4904 scope.go:117] "RemoveContainer" 
containerID="d64969e50828f264edfbf9501a74b46640a7872d24c01a106535bc35fc4380f8" Dec 05 21:42:33 crc kubenswrapper[4904]: E1205 21:42:33.682856 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:42:48 crc kubenswrapper[4904]: I1205 21:42:48.681021 4904 scope.go:117] "RemoveContainer" containerID="d64969e50828f264edfbf9501a74b46640a7872d24c01a106535bc35fc4380f8" Dec 05 21:42:48 crc kubenswrapper[4904]: E1205 21:42:48.682807 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:43:00 crc kubenswrapper[4904]: I1205 21:43:00.681403 4904 scope.go:117] "RemoveContainer" containerID="d64969e50828f264edfbf9501a74b46640a7872d24c01a106535bc35fc4380f8" Dec 05 21:43:00 crc kubenswrapper[4904]: E1205 21:43:00.682158 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:43:14 crc kubenswrapper[4904]: I1205 21:43:14.681920 4904 scope.go:117] "RemoveContainer" containerID="d64969e50828f264edfbf9501a74b46640a7872d24c01a106535bc35fc4380f8" Dec 05 21:43:14 crc kubenswrapper[4904]: E1205 21:43:14.682826 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:43:26 crc kubenswrapper[4904]: I1205 21:43:26.681750 4904 scope.go:117] "RemoveContainer" containerID="d64969e50828f264edfbf9501a74b46640a7872d24c01a106535bc35fc4380f8" Dec 05 21:43:26 crc kubenswrapper[4904]: E1205 21:43:26.682612 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:43:32 crc kubenswrapper[4904]: I1205 21:43:32.897198 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gwl67"] Dec 05 21:43:32 crc kubenswrapper[4904]: E1205 21:43:32.898091 4904 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="7a2c0f26-6730-4fff-84fb-d55695d15906" containerName="extract-utilities" Dec 05 21:43:32 crc kubenswrapper[4904]: I1205 21:43:32.898105 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a2c0f26-6730-4fff-84fb-d55695d15906" containerName="extract-utilities" Dec 05 21:43:32 crc kubenswrapper[4904]: E1205 21:43:32.898121 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c98e2f16-2c17-4558-aa14-051c2f355f09" containerName="registry-server" Dec 05 21:43:32 crc kubenswrapper[4904]: I1205 21:43:32.898127 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="c98e2f16-2c17-4558-aa14-051c2f355f09" containerName="registry-server" Dec 05 21:43:32 crc kubenswrapper[4904]: E1205 21:43:32.898152 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c98e2f16-2c17-4558-aa14-051c2f355f09" containerName="extract-utilities" Dec 05 21:43:32 crc kubenswrapper[4904]: I1205 21:43:32.898158 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="c98e2f16-2c17-4558-aa14-051c2f355f09" containerName="extract-utilities" Dec 05 21:43:32 crc kubenswrapper[4904]: E1205 21:43:32.898171 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a2c0f26-6730-4fff-84fb-d55695d15906" containerName="extract-content" Dec 05 21:43:32 crc kubenswrapper[4904]: I1205 21:43:32.898176 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a2c0f26-6730-4fff-84fb-d55695d15906" containerName="extract-content" Dec 05 21:43:32 crc kubenswrapper[4904]: E1205 21:43:32.898188 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a2c0f26-6730-4fff-84fb-d55695d15906" containerName="registry-server" Dec 05 21:43:32 crc kubenswrapper[4904]: I1205 21:43:32.898194 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a2c0f26-6730-4fff-84fb-d55695d15906" containerName="registry-server" Dec 05 21:43:32 crc kubenswrapper[4904]: E1205 21:43:32.898206 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c98e2f16-2c17-4558-aa14-051c2f355f09" containerName="extract-content" Dec 05 21:43:32 crc kubenswrapper[4904]: I1205 21:43:32.898211 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="c98e2f16-2c17-4558-aa14-051c2f355f09" containerName="extract-content" Dec 05 21:43:32 crc kubenswrapper[4904]: I1205 21:43:32.898397 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="c98e2f16-2c17-4558-aa14-051c2f355f09" containerName="registry-server" Dec 05 21:43:32 crc kubenswrapper[4904]: I1205 21:43:32.898425 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a2c0f26-6730-4fff-84fb-d55695d15906" containerName="registry-server" Dec 05 21:43:32 crc kubenswrapper[4904]: I1205 21:43:32.900198 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gwl67" Dec 05 21:43:32 crc kubenswrapper[4904]: I1205 21:43:32.927493 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gwl67"] Dec 05 21:43:32 crc kubenswrapper[4904]: I1205 21:43:32.977232 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr9dl\" (UniqueName: \"kubernetes.io/projected/e7a0f7af-b855-41ce-b152-1115c80a3756-kube-api-access-wr9dl\") pod \"redhat-marketplace-gwl67\" (UID: \"e7a0f7af-b855-41ce-b152-1115c80a3756\") " pod="openshift-marketplace/redhat-marketplace-gwl67" Dec 05 21:43:32 crc kubenswrapper[4904]: I1205 21:43:32.977313 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7a0f7af-b855-41ce-b152-1115c80a3756-utilities\") pod \"redhat-marketplace-gwl67\" (UID: \"e7a0f7af-b855-41ce-b152-1115c80a3756\") " pod="openshift-marketplace/redhat-marketplace-gwl67" Dec 05 21:43:32 crc kubenswrapper[4904]: I1205 21:43:32.977458 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7a0f7af-b855-41ce-b152-1115c80a3756-catalog-content\") pod \"redhat-marketplace-gwl67\" (UID: \"e7a0f7af-b855-41ce-b152-1115c80a3756\") " pod="openshift-marketplace/redhat-marketplace-gwl67" Dec 05 21:43:33 crc kubenswrapper[4904]: I1205 21:43:33.080277 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7a0f7af-b855-41ce-b152-1115c80a3756-catalog-content\") pod \"redhat-marketplace-gwl67\" (UID: \"e7a0f7af-b855-41ce-b152-1115c80a3756\") " pod="openshift-marketplace/redhat-marketplace-gwl67" Dec 05 21:43:33 crc kubenswrapper[4904]: I1205 21:43:33.080818 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wr9dl\" (UniqueName: \"kubernetes.io/projected/e7a0f7af-b855-41ce-b152-1115c80a3756-kube-api-access-wr9dl\") pod \"redhat-marketplace-gwl67\" (UID: \"e7a0f7af-b855-41ce-b152-1115c80a3756\") " pod="openshift-marketplace/redhat-marketplace-gwl67" Dec 05 21:43:33 crc kubenswrapper[4904]: I1205 21:43:33.080954 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7a0f7af-b855-41ce-b152-1115c80a3756-utilities\") pod \"redhat-marketplace-gwl67\" (UID: \"e7a0f7af-b855-41ce-b152-1115c80a3756\") " pod="openshift-marketplace/redhat-marketplace-gwl67" Dec 05 21:43:33 crc kubenswrapper[4904]: I1205 21:43:33.080864 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7a0f7af-b855-41ce-b152-1115c80a3756-catalog-content\") pod \"redhat-marketplace-gwl67\" (UID: \"e7a0f7af-b855-41ce-b152-1115c80a3756\") " pod="openshift-marketplace/redhat-marketplace-gwl67" Dec 05 21:43:33 crc kubenswrapper[4904]: I1205 21:43:33.081400 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7a0f7af-b855-41ce-b152-1115c80a3756-utilities\") pod \"redhat-marketplace-gwl67\" (UID: \"e7a0f7af-b855-41ce-b152-1115c80a3756\") " pod="openshift-marketplace/redhat-marketplace-gwl67" Dec 05 21:43:33 crc kubenswrapper[4904]: I1205 21:43:33.100420 4904 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-wr9dl\" (UniqueName: \"kubernetes.io/projected/e7a0f7af-b855-41ce-b152-1115c80a3756-kube-api-access-wr9dl\") pod \"redhat-marketplace-gwl67\" (UID: \"e7a0f7af-b855-41ce-b152-1115c80a3756\") " pod="openshift-marketplace/redhat-marketplace-gwl67" Dec 05 21:43:33 crc kubenswrapper[4904]: I1205 21:43:33.223871 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gwl67" Dec 05 21:43:33 crc kubenswrapper[4904]: I1205 21:43:33.721897 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gwl67"] Dec 05 21:43:33 crc kubenswrapper[4904]: I1205 21:43:33.869553 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gwl67" event={"ID":"e7a0f7af-b855-41ce-b152-1115c80a3756","Type":"ContainerStarted","Data":"b495fed5de4a6757a7257ba0bcd3bbdd8f097345a60dc9080633ad0df016a65e"} Dec 05 21:43:34 crc kubenswrapper[4904]: I1205 21:43:34.881036 4904 generic.go:334] "Generic (PLEG): container finished" podID="e7a0f7af-b855-41ce-b152-1115c80a3756" containerID="67c6d4efd34400c754916516f5215c017c7f32a652ff928e4111b1f0bd724d0b" exitCode=0 Dec 05 21:43:34 crc kubenswrapper[4904]: I1205 21:43:34.881113 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gwl67" event={"ID":"e7a0f7af-b855-41ce-b152-1115c80a3756","Type":"ContainerDied","Data":"67c6d4efd34400c754916516f5215c017c7f32a652ff928e4111b1f0bd724d0b"} Dec 05 21:43:34 crc kubenswrapper[4904]: I1205 21:43:34.883203 4904 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 21:43:35 crc kubenswrapper[4904]: I1205 21:43:35.898556 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gwl67" event={"ID":"e7a0f7af-b855-41ce-b152-1115c80a3756","Type":"ContainerStarted","Data":"0153eacd82558ddb38eb134d0bad6832b1ee05434f89a95f5ae25764b6372602"} Dec 05 21:43:36 crc kubenswrapper[4904]: I1205 21:43:36.911566 4904 generic.go:334] "Generic (PLEG): container finished" podID="e7a0f7af-b855-41ce-b152-1115c80a3756" containerID="0153eacd82558ddb38eb134d0bad6832b1ee05434f89a95f5ae25764b6372602" exitCode=0 Dec 05 21:43:36 crc kubenswrapper[4904]: I1205 21:43:36.911720 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gwl67" event={"ID":"e7a0f7af-b855-41ce-b152-1115c80a3756","Type":"ContainerDied","Data":"0153eacd82558ddb38eb134d0bad6832b1ee05434f89a95f5ae25764b6372602"} Dec 05 21:43:37 crc kubenswrapper[4904]: I1205 21:43:37.925763 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gwl67" event={"ID":"e7a0f7af-b855-41ce-b152-1115c80a3756","Type":"ContainerStarted","Data":"0fbd0e4aae66924dbc4ca7eb5564d296d8a3f6048b552cc5cd25909810752c04"} Dec 05 21:43:37 crc kubenswrapper[4904]: I1205 21:43:37.957810 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gwl67" podStartSLOduration=3.424070311 podStartE2EDuration="5.957756893s" podCreationTimestamp="2025-12-05 21:43:32 +0000 UTC" firstStartedPulling="2025-12-05 21:43:34.882943569 +0000 UTC m=+5513.694159678" lastFinishedPulling="2025-12-05 21:43:37.416630151 +0000 UTC m=+5516.227846260" observedRunningTime="2025-12-05 21:43:37.944675608 +0000 UTC m=+5516.755891737" watchObservedRunningTime="2025-12-05 21:43:37.957756893 +0000 UTC 
m=+5516.768973002" Dec 05 21:43:40 crc kubenswrapper[4904]: I1205 21:43:40.681669 4904 scope.go:117] "RemoveContainer" containerID="d64969e50828f264edfbf9501a74b46640a7872d24c01a106535bc35fc4380f8" Dec 05 21:43:40 crc kubenswrapper[4904]: E1205 21:43:40.682238 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:43:43 crc kubenswrapper[4904]: I1205 21:43:43.225350 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gwl67" Dec 05 21:43:43 crc kubenswrapper[4904]: I1205 21:43:43.226618 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gwl67" Dec 05 21:43:43 crc kubenswrapper[4904]: I1205 21:43:43.281241 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gwl67" Dec 05 21:43:44 crc kubenswrapper[4904]: I1205 21:43:44.074690 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gwl67" Dec 05 21:43:44 crc kubenswrapper[4904]: I1205 21:43:44.142929 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gwl67"] Dec 05 21:43:46 crc kubenswrapper[4904]: I1205 21:43:46.044450 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gwl67" podUID="e7a0f7af-b855-41ce-b152-1115c80a3756" containerName="registry-server" containerID="cri-o://0fbd0e4aae66924dbc4ca7eb5564d296d8a3f6048b552cc5cd25909810752c04" gracePeriod=2 Dec 05 21:43:46 crc kubenswrapper[4904]: I1205 21:43:46.571186 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gwl67" Dec 05 21:43:46 crc kubenswrapper[4904]: I1205 21:43:46.683170 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7a0f7af-b855-41ce-b152-1115c80a3756-catalog-content\") pod \"e7a0f7af-b855-41ce-b152-1115c80a3756\" (UID: \"e7a0f7af-b855-41ce-b152-1115c80a3756\") " Dec 05 21:43:46 crc kubenswrapper[4904]: I1205 21:43:46.683601 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7a0f7af-b855-41ce-b152-1115c80a3756-utilities\") pod \"e7a0f7af-b855-41ce-b152-1115c80a3756\" (UID: \"e7a0f7af-b855-41ce-b152-1115c80a3756\") " Dec 05 21:43:46 crc kubenswrapper[4904]: I1205 21:43:46.683648 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wr9dl\" (UniqueName: \"kubernetes.io/projected/e7a0f7af-b855-41ce-b152-1115c80a3756-kube-api-access-wr9dl\") pod \"e7a0f7af-b855-41ce-b152-1115c80a3756\" (UID: \"e7a0f7af-b855-41ce-b152-1115c80a3756\") " Dec 05 21:43:46 crc kubenswrapper[4904]: I1205 21:43:46.685322 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7a0f7af-b855-41ce-b152-1115c80a3756-utilities" (OuterVolumeSpecName: "utilities") pod "e7a0f7af-b855-41ce-b152-1115c80a3756" (UID: "e7a0f7af-b855-41ce-b152-1115c80a3756"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:43:46 crc kubenswrapper[4904]: I1205 21:43:46.693281 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7a0f7af-b855-41ce-b152-1115c80a3756-kube-api-access-wr9dl" (OuterVolumeSpecName: "kube-api-access-wr9dl") pod "e7a0f7af-b855-41ce-b152-1115c80a3756" (UID: "e7a0f7af-b855-41ce-b152-1115c80a3756"). InnerVolumeSpecName "kube-api-access-wr9dl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:43:46 crc kubenswrapper[4904]: I1205 21:43:46.703512 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7a0f7af-b855-41ce-b152-1115c80a3756-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e7a0f7af-b855-41ce-b152-1115c80a3756" (UID: "e7a0f7af-b855-41ce-b152-1115c80a3756"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:43:46 crc kubenswrapper[4904]: I1205 21:43:46.786130 4904 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7a0f7af-b855-41ce-b152-1115c80a3756-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 21:43:46 crc kubenswrapper[4904]: I1205 21:43:46.786162 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wr9dl\" (UniqueName: \"kubernetes.io/projected/e7a0f7af-b855-41ce-b152-1115c80a3756-kube-api-access-wr9dl\") on node \"crc\" DevicePath \"\"" Dec 05 21:43:46 crc kubenswrapper[4904]: I1205 21:43:46.786176 4904 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7a0f7af-b855-41ce-b152-1115c80a3756-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 21:43:47 crc kubenswrapper[4904]: I1205 21:43:47.055469 4904 generic.go:334] "Generic (PLEG): container finished" podID="e7a0f7af-b855-41ce-b152-1115c80a3756" containerID="0fbd0e4aae66924dbc4ca7eb5564d296d8a3f6048b552cc5cd25909810752c04" exitCode=0 Dec 05 21:43:47 crc kubenswrapper[4904]: I1205 21:43:47.055520 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gwl67" event={"ID":"e7a0f7af-b855-41ce-b152-1115c80a3756","Type":"ContainerDied","Data":"0fbd0e4aae66924dbc4ca7eb5564d296d8a3f6048b552cc5cd25909810752c04"} Dec 05 21:43:47 crc kubenswrapper[4904]: I1205 21:43:47.055555 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gwl67" event={"ID":"e7a0f7af-b855-41ce-b152-1115c80a3756","Type":"ContainerDied","Data":"b495fed5de4a6757a7257ba0bcd3bbdd8f097345a60dc9080633ad0df016a65e"} Dec 05 21:43:47 crc kubenswrapper[4904]: I1205 21:43:47.055577 4904 scope.go:117] "RemoveContainer" containerID="0fbd0e4aae66924dbc4ca7eb5564d296d8a3f6048b552cc5cd25909810752c04" Dec 05 21:43:47 crc kubenswrapper[4904]: I1205 21:43:47.055646 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gwl67" Dec 05 21:43:47 crc kubenswrapper[4904]: I1205 21:43:47.105920 4904 scope.go:117] "RemoveContainer" containerID="0153eacd82558ddb38eb134d0bad6832b1ee05434f89a95f5ae25764b6372602" Dec 05 21:43:47 crc kubenswrapper[4904]: I1205 21:43:47.137750 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gwl67"] Dec 05 21:43:47 crc kubenswrapper[4904]: I1205 21:43:47.144581 4904 scope.go:117] "RemoveContainer" containerID="67c6d4efd34400c754916516f5215c017c7f32a652ff928e4111b1f0bd724d0b" Dec 05 21:43:47 crc kubenswrapper[4904]: I1205 21:43:47.159841 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gwl67"] Dec 05 21:43:47 crc kubenswrapper[4904]: I1205 21:43:47.203161 4904 scope.go:117] "RemoveContainer" containerID="0fbd0e4aae66924dbc4ca7eb5564d296d8a3f6048b552cc5cd25909810752c04" Dec 05 21:43:47 crc kubenswrapper[4904]: E1205 21:43:47.204533 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fbd0e4aae66924dbc4ca7eb5564d296d8a3f6048b552cc5cd25909810752c04\": container with ID starting with 0fbd0e4aae66924dbc4ca7eb5564d296d8a3f6048b552cc5cd25909810752c04 not found: ID does not exist" containerID="0fbd0e4aae66924dbc4ca7eb5564d296d8a3f6048b552cc5cd25909810752c04" Dec 05 21:43:47 crc kubenswrapper[4904]: I1205 21:43:47.204572 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fbd0e4aae66924dbc4ca7eb5564d296d8a3f6048b552cc5cd25909810752c04"} err="failed to get container status \"0fbd0e4aae66924dbc4ca7eb5564d296d8a3f6048b552cc5cd25909810752c04\": rpc error: code = NotFound desc = could not find container \"0fbd0e4aae66924dbc4ca7eb5564d296d8a3f6048b552cc5cd25909810752c04\": container with ID starting with 0fbd0e4aae66924dbc4ca7eb5564d296d8a3f6048b552cc5cd25909810752c04 not found: ID does not exist" Dec 05 21:43:47 crc kubenswrapper[4904]: I1205 21:43:47.204598 4904 scope.go:117] "RemoveContainer" containerID="0153eacd82558ddb38eb134d0bad6832b1ee05434f89a95f5ae25764b6372602" Dec 05 21:43:47 crc kubenswrapper[4904]: E1205 21:43:47.204905 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0153eacd82558ddb38eb134d0bad6832b1ee05434f89a95f5ae25764b6372602\": container with ID starting with 0153eacd82558ddb38eb134d0bad6832b1ee05434f89a95f5ae25764b6372602 not found: ID does not exist" containerID="0153eacd82558ddb38eb134d0bad6832b1ee05434f89a95f5ae25764b6372602" Dec 05 21:43:47 crc kubenswrapper[4904]: I1205 21:43:47.204967 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0153eacd82558ddb38eb134d0bad6832b1ee05434f89a95f5ae25764b6372602"} err="failed to get container status \"0153eacd82558ddb38eb134d0bad6832b1ee05434f89a95f5ae25764b6372602\": rpc error: code = NotFound desc = could not find container \"0153eacd82558ddb38eb134d0bad6832b1ee05434f89a95f5ae25764b6372602\": container with ID starting with 0153eacd82558ddb38eb134d0bad6832b1ee05434f89a95f5ae25764b6372602 not found: ID does not exist" Dec 05 21:43:47 crc kubenswrapper[4904]: I1205 21:43:47.204986 4904 scope.go:117] "RemoveContainer" containerID="67c6d4efd34400c754916516f5215c017c7f32a652ff928e4111b1f0bd724d0b" Dec 05 21:43:47 crc kubenswrapper[4904]: E1205 21:43:47.205296 4904 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"67c6d4efd34400c754916516f5215c017c7f32a652ff928e4111b1f0bd724d0b\": container with ID starting with 67c6d4efd34400c754916516f5215c017c7f32a652ff928e4111b1f0bd724d0b not found: ID does not exist" containerID="67c6d4efd34400c754916516f5215c017c7f32a652ff928e4111b1f0bd724d0b" Dec 05 21:43:47 crc kubenswrapper[4904]: I1205 21:43:47.205342 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67c6d4efd34400c754916516f5215c017c7f32a652ff928e4111b1f0bd724d0b"} err="failed to get container status \"67c6d4efd34400c754916516f5215c017c7f32a652ff928e4111b1f0bd724d0b\": rpc error: code = NotFound desc = could not find container \"67c6d4efd34400c754916516f5215c017c7f32a652ff928e4111b1f0bd724d0b\": container with ID starting with 67c6d4efd34400c754916516f5215c017c7f32a652ff928e4111b1f0bd724d0b not found: ID does not exist" Dec 05 21:43:47 crc kubenswrapper[4904]: I1205 21:43:47.695596 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7a0f7af-b855-41ce-b152-1115c80a3756" path="/var/lib/kubelet/pods/e7a0f7af-b855-41ce-b152-1115c80a3756/volumes" Dec 05 21:43:52 crc kubenswrapper[4904]: I1205 21:43:52.682174 4904 scope.go:117] "RemoveContainer" containerID="d64969e50828f264edfbf9501a74b46640a7872d24c01a106535bc35fc4380f8" Dec 05 21:43:52 crc kubenswrapper[4904]: E1205 21:43:52.682712 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:44:06 crc kubenswrapper[4904]: I1205 21:44:06.682281 4904 scope.go:117] "RemoveContainer" containerID="d64969e50828f264edfbf9501a74b46640a7872d24c01a106535bc35fc4380f8" Dec 05 21:44:06 crc kubenswrapper[4904]: E1205 21:44:06.683203 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:44:16 crc kubenswrapper[4904]: I1205 21:44:16.414895 4904 generic.go:334] "Generic (PLEG): container finished" podID="45f624ec-9d5e-41f1-ba5b-e81c2b84c532" containerID="a13d80b221473979d8dc16d7bd38389b4b352cf98a7b74615c1fc547f1dda233" exitCode=0 Dec 05 21:44:16 crc kubenswrapper[4904]: I1205 21:44:16.414972 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"45f624ec-9d5e-41f1-ba5b-e81c2b84c532","Type":"ContainerDied","Data":"a13d80b221473979d8dc16d7bd38389b4b352cf98a7b74615c1fc547f1dda233"} Dec 05 21:44:17 crc kubenswrapper[4904]: I1205 21:44:17.822405 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 05 21:44:17 crc kubenswrapper[4904]: I1205 21:44:17.934901 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-openstack-config\") pod \"45f624ec-9d5e-41f1-ba5b-e81c2b84c532\" (UID: \"45f624ec-9d5e-41f1-ba5b-e81c2b84c532\") " Dec 05 21:44:17 crc kubenswrapper[4904]: I1205 21:44:17.935318 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"45f624ec-9d5e-41f1-ba5b-e81c2b84c532\" (UID: \"45f624ec-9d5e-41f1-ba5b-e81c2b84c532\") " Dec 05 21:44:17 crc kubenswrapper[4904]: I1205 21:44:17.935390 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-test-operator-ephemeral-workdir\") pod \"45f624ec-9d5e-41f1-ba5b-e81c2b84c532\" (UID: \"45f624ec-9d5e-41f1-ba5b-e81c2b84c532\") " Dec 05 21:44:17 crc kubenswrapper[4904]: I1205 21:44:17.935448 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-ca-certs\") pod \"45f624ec-9d5e-41f1-ba5b-e81c2b84c532\" (UID: \"45f624ec-9d5e-41f1-ba5b-e81c2b84c532\") " Dec 05 21:44:17 crc kubenswrapper[4904]: I1205 21:44:17.935503 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2vpb\" (UniqueName: \"kubernetes.io/projected/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-kube-api-access-c2vpb\") pod \"45f624ec-9d5e-41f1-ba5b-e81c2b84c532\" (UID: \"45f624ec-9d5e-41f1-ba5b-e81c2b84c532\") " Dec 05 21:44:17 crc kubenswrapper[4904]: I1205 21:44:17.935554 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-test-operator-ephemeral-temporary\") pod \"45f624ec-9d5e-41f1-ba5b-e81c2b84c532\" (UID: \"45f624ec-9d5e-41f1-ba5b-e81c2b84c532\") " Dec 05 21:44:17 crc kubenswrapper[4904]: I1205 21:44:17.935636 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-openstack-config-secret\") pod \"45f624ec-9d5e-41f1-ba5b-e81c2b84c532\" (UID: \"45f624ec-9d5e-41f1-ba5b-e81c2b84c532\") " Dec 05 21:44:17 crc kubenswrapper[4904]: I1205 21:44:17.935735 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-config-data\") pod \"45f624ec-9d5e-41f1-ba5b-e81c2b84c532\" (UID: \"45f624ec-9d5e-41f1-ba5b-e81c2b84c532\") " Dec 05 21:44:17 crc kubenswrapper[4904]: I1205 21:44:17.935769 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-ssh-key\") pod \"45f624ec-9d5e-41f1-ba5b-e81c2b84c532\" (UID: \"45f624ec-9d5e-41f1-ba5b-e81c2b84c532\") " Dec 05 21:44:17 crc kubenswrapper[4904]: I1205 21:44:17.936878 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-test-operator-ephemeral-temporary" (OuterVolumeSpecName: 
"test-operator-ephemeral-temporary") pod "45f624ec-9d5e-41f1-ba5b-e81c2b84c532" (UID: "45f624ec-9d5e-41f1-ba5b-e81c2b84c532"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:44:17 crc kubenswrapper[4904]: I1205 21:44:17.937489 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-config-data" (OuterVolumeSpecName: "config-data") pod "45f624ec-9d5e-41f1-ba5b-e81c2b84c532" (UID: "45f624ec-9d5e-41f1-ba5b-e81c2b84c532"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:44:17 crc kubenswrapper[4904]: I1205 21:44:17.946467 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-kube-api-access-c2vpb" (OuterVolumeSpecName: "kube-api-access-c2vpb") pod "45f624ec-9d5e-41f1-ba5b-e81c2b84c532" (UID: "45f624ec-9d5e-41f1-ba5b-e81c2b84c532"). InnerVolumeSpecName "kube-api-access-c2vpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:44:17 crc kubenswrapper[4904]: I1205 21:44:17.957451 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "test-operator-logs") pod "45f624ec-9d5e-41f1-ba5b-e81c2b84c532" (UID: "45f624ec-9d5e-41f1-ba5b-e81c2b84c532"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 21:44:17 crc kubenswrapper[4904]: I1205 21:44:17.962481 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "45f624ec-9d5e-41f1-ba5b-e81c2b84c532" (UID: "45f624ec-9d5e-41f1-ba5b-e81c2b84c532"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:44:17 crc kubenswrapper[4904]: I1205 21:44:17.977122 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "45f624ec-9d5e-41f1-ba5b-e81c2b84c532" (UID: "45f624ec-9d5e-41f1-ba5b-e81c2b84c532"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:44:17 crc kubenswrapper[4904]: I1205 21:44:17.982663 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "45f624ec-9d5e-41f1-ba5b-e81c2b84c532" (UID: "45f624ec-9d5e-41f1-ba5b-e81c2b84c532"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:44:17 crc kubenswrapper[4904]: I1205 21:44:17.997866 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "45f624ec-9d5e-41f1-ba5b-e81c2b84c532" (UID: "45f624ec-9d5e-41f1-ba5b-e81c2b84c532"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:44:18 crc kubenswrapper[4904]: I1205 21:44:18.004438 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "45f624ec-9d5e-41f1-ba5b-e81c2b84c532" (UID: "45f624ec-9d5e-41f1-ba5b-e81c2b84c532"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:44:18 crc kubenswrapper[4904]: I1205 21:44:18.038270 4904 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 05 21:44:18 crc kubenswrapper[4904]: I1205 21:44:18.038374 4904 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 05 21:44:18 crc kubenswrapper[4904]: I1205 21:44:18.038438 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:44:18 crc kubenswrapper[4904]: I1205 21:44:18.038547 4904 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 21:44:18 crc kubenswrapper[4904]: I1205 21:44:18.038646 4904 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 05 21:44:18 crc kubenswrapper[4904]: I1205 21:44:18.038786 4904 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 05 21:44:18 crc kubenswrapper[4904]: I1205 21:44:18.038848 4904 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 05 21:44:18 crc kubenswrapper[4904]: I1205 21:44:18.038967 4904 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 05 21:44:18 crc kubenswrapper[4904]: I1205 21:44:18.039026 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2vpb\" (UniqueName: \"kubernetes.io/projected/45f624ec-9d5e-41f1-ba5b-e81c2b84c532-kube-api-access-c2vpb\") on node \"crc\" DevicePath \"\"" Dec 05 21:44:18 crc kubenswrapper[4904]: I1205 21:44:18.059748 4904 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 05 21:44:18 crc kubenswrapper[4904]: I1205 21:44:18.140975 4904 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 05 21:44:18 crc kubenswrapper[4904]: I1205 21:44:18.455513 4904 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/tempest-tests-tempest" event={"ID":"45f624ec-9d5e-41f1-ba5b-e81c2b84c532","Type":"ContainerDied","Data":"95d01893c247b7588d110f72f471658397e76f54aa022d4ef163fd026e76eb90"} Dec 05 21:44:18 crc kubenswrapper[4904]: I1205 21:44:18.455585 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95d01893c247b7588d110f72f471658397e76f54aa022d4ef163fd026e76eb90" Dec 05 21:44:18 crc kubenswrapper[4904]: I1205 21:44:18.455647 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 05 21:44:21 crc kubenswrapper[4904]: I1205 21:44:21.698120 4904 scope.go:117] "RemoveContainer" containerID="d64969e50828f264edfbf9501a74b46640a7872d24c01a106535bc35fc4380f8" Dec 05 21:44:21 crc kubenswrapper[4904]: E1205 21:44:21.699201 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:44:27 crc kubenswrapper[4904]: I1205 21:44:27.037912 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 05 21:44:27 crc kubenswrapper[4904]: E1205 21:44:27.039655 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7a0f7af-b855-41ce-b152-1115c80a3756" containerName="registry-server" Dec 05 21:44:27 crc kubenswrapper[4904]: I1205 21:44:27.039688 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7a0f7af-b855-41ce-b152-1115c80a3756" containerName="registry-server" Dec 05 21:44:27 crc kubenswrapper[4904]: E1205 21:44:27.039716 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45f624ec-9d5e-41f1-ba5b-e81c2b84c532" containerName="tempest-tests-tempest-tests-runner" Dec 05 21:44:27 crc kubenswrapper[4904]: I1205 21:44:27.039732 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="45f624ec-9d5e-41f1-ba5b-e81c2b84c532" containerName="tempest-tests-tempest-tests-runner" Dec 05 21:44:27 crc kubenswrapper[4904]: E1205 21:44:27.039788 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7a0f7af-b855-41ce-b152-1115c80a3756" containerName="extract-content" Dec 05 21:44:27 crc kubenswrapper[4904]: I1205 21:44:27.039804 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7a0f7af-b855-41ce-b152-1115c80a3756" containerName="extract-content" Dec 05 21:44:27 crc kubenswrapper[4904]: E1205 21:44:27.039835 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7a0f7af-b855-41ce-b152-1115c80a3756" containerName="extract-utilities" Dec 05 21:44:27 crc kubenswrapper[4904]: I1205 21:44:27.039850 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7a0f7af-b855-41ce-b152-1115c80a3756" containerName="extract-utilities" Dec 05 21:44:27 crc kubenswrapper[4904]: I1205 21:44:27.040363 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="45f624ec-9d5e-41f1-ba5b-e81c2b84c532" containerName="tempest-tests-tempest-tests-runner" Dec 05 21:44:27 crc kubenswrapper[4904]: I1205 21:44:27.040409 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7a0f7af-b855-41ce-b152-1115c80a3756" containerName="registry-server" Dec 05 21:44:27 crc kubenswrapper[4904]: I1205 
21:44:27.041997 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 21:44:27 crc kubenswrapper[4904]: I1205 21:44:27.044644 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-b9jhq" Dec 05 21:44:27 crc kubenswrapper[4904]: I1205 21:44:27.065285 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 05 21:44:27 crc kubenswrapper[4904]: I1205 21:44:27.119289 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-229dv\" (UniqueName: \"kubernetes.io/projected/31c70021-e6df-4ac6-b79e-402f24a13112-kube-api-access-229dv\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"31c70021-e6df-4ac6-b79e-402f24a13112\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 21:44:27 crc kubenswrapper[4904]: I1205 21:44:27.119396 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"31c70021-e6df-4ac6-b79e-402f24a13112\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 21:44:27 crc kubenswrapper[4904]: I1205 21:44:27.221618 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"31c70021-e6df-4ac6-b79e-402f24a13112\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 21:44:27 crc kubenswrapper[4904]: I1205 21:44:27.221863 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-229dv\" (UniqueName: \"kubernetes.io/projected/31c70021-e6df-4ac6-b79e-402f24a13112-kube-api-access-229dv\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"31c70021-e6df-4ac6-b79e-402f24a13112\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 21:44:27 crc kubenswrapper[4904]: I1205 21:44:27.222081 4904 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"31c70021-e6df-4ac6-b79e-402f24a13112\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 21:44:27 crc kubenswrapper[4904]: I1205 21:44:27.249019 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-229dv\" (UniqueName: \"kubernetes.io/projected/31c70021-e6df-4ac6-b79e-402f24a13112-kube-api-access-229dv\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"31c70021-e6df-4ac6-b79e-402f24a13112\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 21:44:27 crc kubenswrapper[4904]: I1205 21:44:27.251500 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"31c70021-e6df-4ac6-b79e-402f24a13112\") " 
pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 21:44:27 crc kubenswrapper[4904]: I1205 21:44:27.370646 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 21:44:27 crc kubenswrapper[4904]: I1205 21:44:27.834725 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 05 21:44:28 crc kubenswrapper[4904]: I1205 21:44:28.568195 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"31c70021-e6df-4ac6-b79e-402f24a13112","Type":"ContainerStarted","Data":"91cf064f7e0552772f1e47029ea955c32f2bb59a1cdd8f4934c23805faeb137d"} Dec 05 21:44:29 crc kubenswrapper[4904]: I1205 21:44:29.581842 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"31c70021-e6df-4ac6-b79e-402f24a13112","Type":"ContainerStarted","Data":"c2d985232a6529ae854ea68b7e5f38df77ffa9eb37ffccf05fe783cad93c62be"} Dec 05 21:44:29 crc kubenswrapper[4904]: I1205 21:44:29.620220 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.696063017 podStartE2EDuration="2.620183389s" podCreationTimestamp="2025-12-05 21:44:27 +0000 UTC" firstStartedPulling="2025-12-05 21:44:27.839202945 +0000 UTC m=+5566.650419054" lastFinishedPulling="2025-12-05 21:44:28.763323307 +0000 UTC m=+5567.574539426" observedRunningTime="2025-12-05 21:44:29.601508591 +0000 UTC m=+5568.412724730" watchObservedRunningTime="2025-12-05 21:44:29.620183389 +0000 UTC m=+5568.431399548" Dec 05 21:44:32 crc kubenswrapper[4904]: I1205 21:44:32.681317 4904 scope.go:117] "RemoveContainer" containerID="d64969e50828f264edfbf9501a74b46640a7872d24c01a106535bc35fc4380f8" Dec 05 21:44:32 crc kubenswrapper[4904]: E1205 21:44:32.681885 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:44:46 crc kubenswrapper[4904]: I1205 21:44:46.681880 4904 scope.go:117] "RemoveContainer" containerID="d64969e50828f264edfbf9501a74b46640a7872d24c01a106535bc35fc4380f8" Dec 05 21:44:46 crc kubenswrapper[4904]: E1205 21:44:46.684955 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:44:54 crc kubenswrapper[4904]: I1205 21:44:54.557949 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pspg6/must-gather-hsd7g"] Dec 05 21:44:54 crc kubenswrapper[4904]: I1205 21:44:54.561127 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pspg6/must-gather-hsd7g" Dec 05 21:44:54 crc kubenswrapper[4904]: I1205 21:44:54.565393 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-pspg6"/"openshift-service-ca.crt" Dec 05 21:44:54 crc kubenswrapper[4904]: I1205 21:44:54.565733 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-pspg6"/"kube-root-ca.crt" Dec 05 21:44:54 crc kubenswrapper[4904]: I1205 21:44:54.566259 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-pspg6"/"default-dockercfg-9m92v" Dec 05 21:44:54 crc kubenswrapper[4904]: I1205 21:44:54.574735 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pspg6/must-gather-hsd7g"] Dec 05 21:44:54 crc kubenswrapper[4904]: I1205 21:44:54.690617 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcbfs\" (UniqueName: \"kubernetes.io/projected/c22d3c1c-09ee-4d2a-a15d-b98f74b5e906-kube-api-access-fcbfs\") pod \"must-gather-hsd7g\" (UID: \"c22d3c1c-09ee-4d2a-a15d-b98f74b5e906\") " pod="openshift-must-gather-pspg6/must-gather-hsd7g" Dec 05 21:44:54 crc kubenswrapper[4904]: I1205 21:44:54.691002 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c22d3c1c-09ee-4d2a-a15d-b98f74b5e906-must-gather-output\") pod \"must-gather-hsd7g\" (UID: \"c22d3c1c-09ee-4d2a-a15d-b98f74b5e906\") " pod="openshift-must-gather-pspg6/must-gather-hsd7g" Dec 05 21:44:54 crc kubenswrapper[4904]: I1205 21:44:54.792916 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c22d3c1c-09ee-4d2a-a15d-b98f74b5e906-must-gather-output\") pod \"must-gather-hsd7g\" (UID: \"c22d3c1c-09ee-4d2a-a15d-b98f74b5e906\") " pod="openshift-must-gather-pspg6/must-gather-hsd7g" Dec 05 21:44:54 crc kubenswrapper[4904]: I1205 21:44:54.793387 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c22d3c1c-09ee-4d2a-a15d-b98f74b5e906-must-gather-output\") pod \"must-gather-hsd7g\" (UID: \"c22d3c1c-09ee-4d2a-a15d-b98f74b5e906\") " pod="openshift-must-gather-pspg6/must-gather-hsd7g" Dec 05 21:44:54 crc kubenswrapper[4904]: I1205 21:44:54.794459 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcbfs\" (UniqueName: \"kubernetes.io/projected/c22d3c1c-09ee-4d2a-a15d-b98f74b5e906-kube-api-access-fcbfs\") pod \"must-gather-hsd7g\" (UID: \"c22d3c1c-09ee-4d2a-a15d-b98f74b5e906\") " pod="openshift-must-gather-pspg6/must-gather-hsd7g" Dec 05 21:44:54 crc kubenswrapper[4904]: I1205 21:44:54.815861 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcbfs\" (UniqueName: \"kubernetes.io/projected/c22d3c1c-09ee-4d2a-a15d-b98f74b5e906-kube-api-access-fcbfs\") pod \"must-gather-hsd7g\" (UID: \"c22d3c1c-09ee-4d2a-a15d-b98f74b5e906\") " pod="openshift-must-gather-pspg6/must-gather-hsd7g" Dec 05 21:44:54 crc kubenswrapper[4904]: I1205 21:44:54.883665 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pspg6/must-gather-hsd7g" Dec 05 21:44:55 crc kubenswrapper[4904]: I1205 21:44:55.402496 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pspg6/must-gather-hsd7g"] Dec 05 21:44:56 crc kubenswrapper[4904]: I1205 21:44:56.006971 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pspg6/must-gather-hsd7g" event={"ID":"c22d3c1c-09ee-4d2a-a15d-b98f74b5e906","Type":"ContainerStarted","Data":"8d790f2576ac02d8a563008e81ed7d5d8fc0881439b2ec42107ec282d5997caf"} Dec 05 21:44:58 crc kubenswrapper[4904]: I1205 21:44:58.682828 4904 scope.go:117] "RemoveContainer" containerID="d64969e50828f264edfbf9501a74b46640a7872d24c01a106535bc35fc4380f8" Dec 05 21:44:58 crc kubenswrapper[4904]: E1205 21:44:58.683541 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:45:00 crc kubenswrapper[4904]: I1205 21:45:00.152333 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416185-zg6ds"] Dec 05 21:45:00 crc kubenswrapper[4904]: I1205 21:45:00.154251 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416185-zg6ds" Dec 05 21:45:00 crc kubenswrapper[4904]: I1205 21:45:00.157630 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 21:45:00 crc kubenswrapper[4904]: I1205 21:45:00.163324 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 21:45:00 crc kubenswrapper[4904]: I1205 21:45:00.179453 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416185-zg6ds"] Dec 05 21:45:00 crc kubenswrapper[4904]: I1205 21:45:00.213817 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/211b168f-6300-4154-8c9b-63e73503c9c1-secret-volume\") pod \"collect-profiles-29416185-zg6ds\" (UID: \"211b168f-6300-4154-8c9b-63e73503c9c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416185-zg6ds" Dec 05 21:45:00 crc kubenswrapper[4904]: I1205 21:45:00.214359 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/211b168f-6300-4154-8c9b-63e73503c9c1-config-volume\") pod \"collect-profiles-29416185-zg6ds\" (UID: \"211b168f-6300-4154-8c9b-63e73503c9c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416185-zg6ds" Dec 05 21:45:00 crc kubenswrapper[4904]: I1205 21:45:00.214525 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z8mp\" (UniqueName: \"kubernetes.io/projected/211b168f-6300-4154-8c9b-63e73503c9c1-kube-api-access-2z8mp\") pod \"collect-profiles-29416185-zg6ds\" (UID: \"211b168f-6300-4154-8c9b-63e73503c9c1\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29416185-zg6ds" Dec 05 21:45:00 crc kubenswrapper[4904]: I1205 21:45:00.316462 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/211b168f-6300-4154-8c9b-63e73503c9c1-secret-volume\") pod \"collect-profiles-29416185-zg6ds\" (UID: \"211b168f-6300-4154-8c9b-63e73503c9c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416185-zg6ds" Dec 05 21:45:00 crc kubenswrapper[4904]: I1205 21:45:00.316584 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/211b168f-6300-4154-8c9b-63e73503c9c1-config-volume\") pod \"collect-profiles-29416185-zg6ds\" (UID: \"211b168f-6300-4154-8c9b-63e73503c9c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416185-zg6ds" Dec 05 21:45:00 crc kubenswrapper[4904]: I1205 21:45:00.316658 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z8mp\" (UniqueName: \"kubernetes.io/projected/211b168f-6300-4154-8c9b-63e73503c9c1-kube-api-access-2z8mp\") pod \"collect-profiles-29416185-zg6ds\" (UID: \"211b168f-6300-4154-8c9b-63e73503c9c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416185-zg6ds" Dec 05 21:45:00 crc kubenswrapper[4904]: I1205 21:45:00.317632 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/211b168f-6300-4154-8c9b-63e73503c9c1-config-volume\") pod \"collect-profiles-29416185-zg6ds\" (UID: \"211b168f-6300-4154-8c9b-63e73503c9c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416185-zg6ds" Dec 05 21:45:00 crc kubenswrapper[4904]: I1205 21:45:00.322211 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/211b168f-6300-4154-8c9b-63e73503c9c1-secret-volume\") pod \"collect-profiles-29416185-zg6ds\" (UID: \"211b168f-6300-4154-8c9b-63e73503c9c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416185-zg6ds" Dec 05 21:45:00 crc kubenswrapper[4904]: I1205 21:45:00.346486 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z8mp\" (UniqueName: \"kubernetes.io/projected/211b168f-6300-4154-8c9b-63e73503c9c1-kube-api-access-2z8mp\") pod \"collect-profiles-29416185-zg6ds\" (UID: \"211b168f-6300-4154-8c9b-63e73503c9c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416185-zg6ds" Dec 05 21:45:00 crc kubenswrapper[4904]: I1205 21:45:00.489250 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416185-zg6ds" Dec 05 21:45:02 crc kubenswrapper[4904]: I1205 21:45:02.086315 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pspg6/must-gather-hsd7g" event={"ID":"c22d3c1c-09ee-4d2a-a15d-b98f74b5e906","Type":"ContainerStarted","Data":"5a9b3f913d1f59a839680fc798dc0802d9fd6342319192b91afca374b78ce88a"} Dec 05 21:45:02 crc kubenswrapper[4904]: I1205 21:45:02.093399 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416185-zg6ds"] Dec 05 21:45:03 crc kubenswrapper[4904]: I1205 21:45:03.101025 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pspg6/must-gather-hsd7g" event={"ID":"c22d3c1c-09ee-4d2a-a15d-b98f74b5e906","Type":"ContainerStarted","Data":"ade9f40048314d1f8b5abfaa6e1f508b3982e654004fc564e20161e265c51aa9"} Dec 05 21:45:03 crc kubenswrapper[4904]: I1205 21:45:03.103448 4904 generic.go:334] "Generic (PLEG): container finished" podID="211b168f-6300-4154-8c9b-63e73503c9c1" containerID="b5223b3adb61373e97dcb5e2811038b02737ac57a00b46da53e5ce50c86134cb" exitCode=0 Dec 05 21:45:03 crc kubenswrapper[4904]: I1205 21:45:03.103498 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416185-zg6ds" event={"ID":"211b168f-6300-4154-8c9b-63e73503c9c1","Type":"ContainerDied","Data":"b5223b3adb61373e97dcb5e2811038b02737ac57a00b46da53e5ce50c86134cb"} Dec 05 21:45:03 crc kubenswrapper[4904]: I1205 21:45:03.103527 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416185-zg6ds" event={"ID":"211b168f-6300-4154-8c9b-63e73503c9c1","Type":"ContainerStarted","Data":"f0d7fffc2ebd686c96a1589dd2e16e7e986eac945a29c4468133bf33447b3cfd"} Dec 05 21:45:03 crc kubenswrapper[4904]: I1205 21:45:03.135306 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-pspg6/must-gather-hsd7g" podStartSLOduration=2.901405316 podStartE2EDuration="9.135271185s" podCreationTimestamp="2025-12-05 21:44:54 +0000 UTC" firstStartedPulling="2025-12-05 21:44:55.398664712 +0000 UTC m=+5594.209880821" lastFinishedPulling="2025-12-05 21:45:01.632530571 +0000 UTC m=+5600.443746690" observedRunningTime="2025-12-05 21:45:03.11600084 +0000 UTC m=+5601.927216969" watchObservedRunningTime="2025-12-05 21:45:03.135271185 +0000 UTC m=+5601.946487364" Dec 05 21:45:04 crc kubenswrapper[4904]: I1205 21:45:04.481598 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416185-zg6ds" Dec 05 21:45:04 crc kubenswrapper[4904]: I1205 21:45:04.613199 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/211b168f-6300-4154-8c9b-63e73503c9c1-secret-volume\") pod \"211b168f-6300-4154-8c9b-63e73503c9c1\" (UID: \"211b168f-6300-4154-8c9b-63e73503c9c1\") " Dec 05 21:45:04 crc kubenswrapper[4904]: I1205 21:45:04.613600 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2z8mp\" (UniqueName: \"kubernetes.io/projected/211b168f-6300-4154-8c9b-63e73503c9c1-kube-api-access-2z8mp\") pod \"211b168f-6300-4154-8c9b-63e73503c9c1\" (UID: \"211b168f-6300-4154-8c9b-63e73503c9c1\") " Dec 05 21:45:04 crc kubenswrapper[4904]: I1205 21:45:04.613774 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/211b168f-6300-4154-8c9b-63e73503c9c1-config-volume\") pod \"211b168f-6300-4154-8c9b-63e73503c9c1\" (UID: \"211b168f-6300-4154-8c9b-63e73503c9c1\") " Dec 05 21:45:04 crc kubenswrapper[4904]: I1205 21:45:04.614258 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/211b168f-6300-4154-8c9b-63e73503c9c1-config-volume" (OuterVolumeSpecName: "config-volume") pod "211b168f-6300-4154-8c9b-63e73503c9c1" (UID: "211b168f-6300-4154-8c9b-63e73503c9c1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:45:04 crc kubenswrapper[4904]: I1205 21:45:04.620561 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/211b168f-6300-4154-8c9b-63e73503c9c1-kube-api-access-2z8mp" (OuterVolumeSpecName: "kube-api-access-2z8mp") pod "211b168f-6300-4154-8c9b-63e73503c9c1" (UID: "211b168f-6300-4154-8c9b-63e73503c9c1"). InnerVolumeSpecName "kube-api-access-2z8mp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:45:04 crc kubenswrapper[4904]: I1205 21:45:04.633215 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/211b168f-6300-4154-8c9b-63e73503c9c1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "211b168f-6300-4154-8c9b-63e73503c9c1" (UID: "211b168f-6300-4154-8c9b-63e73503c9c1"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:45:04 crc kubenswrapper[4904]: I1205 21:45:04.718731 4904 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/211b168f-6300-4154-8c9b-63e73503c9c1-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 21:45:04 crc kubenswrapper[4904]: I1205 21:45:04.718801 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2z8mp\" (UniqueName: \"kubernetes.io/projected/211b168f-6300-4154-8c9b-63e73503c9c1-kube-api-access-2z8mp\") on node \"crc\" DevicePath \"\"" Dec 05 21:45:04 crc kubenswrapper[4904]: I1205 21:45:04.718816 4904 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/211b168f-6300-4154-8c9b-63e73503c9c1-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 21:45:05 crc kubenswrapper[4904]: I1205 21:45:05.130926 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416185-zg6ds" event={"ID":"211b168f-6300-4154-8c9b-63e73503c9c1","Type":"ContainerDied","Data":"f0d7fffc2ebd686c96a1589dd2e16e7e986eac945a29c4468133bf33447b3cfd"} Dec 05 21:45:05 crc kubenswrapper[4904]: I1205 21:45:05.131347 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0d7fffc2ebd686c96a1589dd2e16e7e986eac945a29c4468133bf33447b3cfd" Dec 05 21:45:05 crc kubenswrapper[4904]: I1205 21:45:05.130972 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416185-zg6ds" Dec 05 21:45:05 crc kubenswrapper[4904]: I1205 21:45:05.565036 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416140-kqqf4"] Dec 05 21:45:05 crc kubenswrapper[4904]: I1205 21:45:05.574553 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416140-kqqf4"] Dec 05 21:45:05 crc kubenswrapper[4904]: I1205 21:45:05.694992 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0" path="/var/lib/kubelet/pods/d43cc048-7fbb-482a-9ffd-b0ce8c7df3d0/volumes" Dec 05 21:45:05 crc kubenswrapper[4904]: I1205 21:45:05.901460 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pspg6/crc-debug-gf2lx"] Dec 05 21:45:05 crc kubenswrapper[4904]: E1205 21:45:05.901896 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="211b168f-6300-4154-8c9b-63e73503c9c1" containerName="collect-profiles" Dec 05 21:45:05 crc kubenswrapper[4904]: I1205 21:45:05.901913 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="211b168f-6300-4154-8c9b-63e73503c9c1" containerName="collect-profiles" Dec 05 21:45:05 crc kubenswrapper[4904]: I1205 21:45:05.902133 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="211b168f-6300-4154-8c9b-63e73503c9c1" containerName="collect-profiles" Dec 05 21:45:05 crc kubenswrapper[4904]: I1205 21:45:05.902810 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pspg6/crc-debug-gf2lx" Dec 05 21:45:05 crc kubenswrapper[4904]: I1205 21:45:05.942564 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7929ee32-233e-4989-87e4-e7259266e3f5-host\") pod \"crc-debug-gf2lx\" (UID: \"7929ee32-233e-4989-87e4-e7259266e3f5\") " pod="openshift-must-gather-pspg6/crc-debug-gf2lx" Dec 05 21:45:05 crc kubenswrapper[4904]: I1205 21:45:05.942788 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89jw6\" (UniqueName: \"kubernetes.io/projected/7929ee32-233e-4989-87e4-e7259266e3f5-kube-api-access-89jw6\") pod \"crc-debug-gf2lx\" (UID: \"7929ee32-233e-4989-87e4-e7259266e3f5\") " pod="openshift-must-gather-pspg6/crc-debug-gf2lx" Dec 05 21:45:06 crc kubenswrapper[4904]: I1205 21:45:06.044333 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89jw6\" (UniqueName: \"kubernetes.io/projected/7929ee32-233e-4989-87e4-e7259266e3f5-kube-api-access-89jw6\") pod \"crc-debug-gf2lx\" (UID: \"7929ee32-233e-4989-87e4-e7259266e3f5\") " pod="openshift-must-gather-pspg6/crc-debug-gf2lx" Dec 05 21:45:06 crc kubenswrapper[4904]: I1205 21:45:06.044391 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7929ee32-233e-4989-87e4-e7259266e3f5-host\") pod \"crc-debug-gf2lx\" (UID: \"7929ee32-233e-4989-87e4-e7259266e3f5\") " pod="openshift-must-gather-pspg6/crc-debug-gf2lx" Dec 05 21:45:06 crc kubenswrapper[4904]: I1205 21:45:06.044530 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7929ee32-233e-4989-87e4-e7259266e3f5-host\") pod \"crc-debug-gf2lx\" (UID: \"7929ee32-233e-4989-87e4-e7259266e3f5\") " pod="openshift-must-gather-pspg6/crc-debug-gf2lx" Dec 05 21:45:06 crc kubenswrapper[4904]: I1205 21:45:06.066286 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89jw6\" (UniqueName: \"kubernetes.io/projected/7929ee32-233e-4989-87e4-e7259266e3f5-kube-api-access-89jw6\") pod \"crc-debug-gf2lx\" (UID: \"7929ee32-233e-4989-87e4-e7259266e3f5\") " pod="openshift-must-gather-pspg6/crc-debug-gf2lx" Dec 05 21:45:06 crc kubenswrapper[4904]: I1205 21:45:06.218968 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pspg6/crc-debug-gf2lx" Dec 05 21:45:07 crc kubenswrapper[4904]: I1205 21:45:07.156372 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pspg6/crc-debug-gf2lx" event={"ID":"7929ee32-233e-4989-87e4-e7259266e3f5","Type":"ContainerStarted","Data":"3e1cf610a192bf6fddbf07c1b25accdd50c1598cde54432f21f0c7bbe765041e"} Dec 05 21:45:12 crc kubenswrapper[4904]: I1205 21:45:12.681277 4904 scope.go:117] "RemoveContainer" containerID="d64969e50828f264edfbf9501a74b46640a7872d24c01a106535bc35fc4380f8" Dec 05 21:45:17 crc kubenswrapper[4904]: I1205 21:45:17.288339 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" event={"ID":"1cc24b64-e25f-4b55-9123-295388685e7a","Type":"ContainerStarted","Data":"f4149179d4736f80eac61f13fa0487b54479026079b2ff33c6d0c328533eef98"} Dec 05 21:45:17 crc kubenswrapper[4904]: I1205 21:45:17.297943 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pspg6/crc-debug-gf2lx" event={"ID":"7929ee32-233e-4989-87e4-e7259266e3f5","Type":"ContainerStarted","Data":"783c3f1bf42a6327462ca6733f147cec1efeb6a21ff8ed57ad826867c1021406"} Dec 05 21:45:17 crc kubenswrapper[4904]: I1205 21:45:17.407500 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-pspg6/crc-debug-gf2lx" podStartSLOduration=2.065935782 podStartE2EDuration="12.407465066s" podCreationTimestamp="2025-12-05 21:45:05 +0000 UTC" firstStartedPulling="2025-12-05 21:45:06.253946371 +0000 UTC m=+5605.065162480" lastFinishedPulling="2025-12-05 21:45:16.595475645 +0000 UTC m=+5615.406691764" observedRunningTime="2025-12-05 21:45:17.329442803 +0000 UTC m=+5616.140658912" watchObservedRunningTime="2025-12-05 21:45:17.407465066 +0000 UTC m=+5616.218681175" Dec 05 21:45:21 crc kubenswrapper[4904]: I1205 21:45:21.731908 4904 scope.go:117] "RemoveContainer" containerID="a2f3f42c506991ec0e78577b8e40e1ba8b240ab8631d1fa0816cb8800d2faedf" Dec 05 21:46:07 crc kubenswrapper[4904]: I1205 21:46:07.845481 4904 generic.go:334] "Generic (PLEG): container finished" podID="7929ee32-233e-4989-87e4-e7259266e3f5" containerID="783c3f1bf42a6327462ca6733f147cec1efeb6a21ff8ed57ad826867c1021406" exitCode=0 Dec 05 21:46:07 crc kubenswrapper[4904]: I1205 21:46:07.845577 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pspg6/crc-debug-gf2lx" event={"ID":"7929ee32-233e-4989-87e4-e7259266e3f5","Type":"ContainerDied","Data":"783c3f1bf42a6327462ca6733f147cec1efeb6a21ff8ed57ad826867c1021406"} Dec 05 21:46:08 crc kubenswrapper[4904]: I1205 21:46:08.996307 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pspg6/crc-debug-gf2lx" Dec 05 21:46:09 crc kubenswrapper[4904]: I1205 21:46:09.024779 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7929ee32-233e-4989-87e4-e7259266e3f5-host\") pod \"7929ee32-233e-4989-87e4-e7259266e3f5\" (UID: \"7929ee32-233e-4989-87e4-e7259266e3f5\") " Dec 05 21:46:09 crc kubenswrapper[4904]: I1205 21:46:09.024966 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89jw6\" (UniqueName: \"kubernetes.io/projected/7929ee32-233e-4989-87e4-e7259266e3f5-kube-api-access-89jw6\") pod \"7929ee32-233e-4989-87e4-e7259266e3f5\" (UID: \"7929ee32-233e-4989-87e4-e7259266e3f5\") " Dec 05 21:46:09 crc kubenswrapper[4904]: I1205 21:46:09.025693 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7929ee32-233e-4989-87e4-e7259266e3f5-host" (OuterVolumeSpecName: "host") pod "7929ee32-233e-4989-87e4-e7259266e3f5" (UID: "7929ee32-233e-4989-87e4-e7259266e3f5"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 21:46:09 crc kubenswrapper[4904]: I1205 21:46:09.031776 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7929ee32-233e-4989-87e4-e7259266e3f5-kube-api-access-89jw6" (OuterVolumeSpecName: "kube-api-access-89jw6") pod "7929ee32-233e-4989-87e4-e7259266e3f5" (UID: "7929ee32-233e-4989-87e4-e7259266e3f5"). InnerVolumeSpecName "kube-api-access-89jw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:46:09 crc kubenswrapper[4904]: I1205 21:46:09.039454 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pspg6/crc-debug-gf2lx"] Dec 05 21:46:09 crc kubenswrapper[4904]: I1205 21:46:09.050344 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pspg6/crc-debug-gf2lx"] Dec 05 21:46:09 crc kubenswrapper[4904]: I1205 21:46:09.127315 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89jw6\" (UniqueName: \"kubernetes.io/projected/7929ee32-233e-4989-87e4-e7259266e3f5-kube-api-access-89jw6\") on node \"crc\" DevicePath \"\"" Dec 05 21:46:09 crc kubenswrapper[4904]: I1205 21:46:09.127343 4904 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7929ee32-233e-4989-87e4-e7259266e3f5-host\") on node \"crc\" DevicePath \"\"" Dec 05 21:46:09 crc kubenswrapper[4904]: I1205 21:46:09.699695 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7929ee32-233e-4989-87e4-e7259266e3f5" path="/var/lib/kubelet/pods/7929ee32-233e-4989-87e4-e7259266e3f5/volumes" Dec 05 21:46:09 crc kubenswrapper[4904]: I1205 21:46:09.867309 4904 scope.go:117] "RemoveContainer" containerID="783c3f1bf42a6327462ca6733f147cec1efeb6a21ff8ed57ad826867c1021406" Dec 05 21:46:09 crc kubenswrapper[4904]: I1205 21:46:09.867388 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pspg6/crc-debug-gf2lx" Dec 05 21:46:10 crc kubenswrapper[4904]: I1205 21:46:10.229086 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pspg6/crc-debug-2qdp7"] Dec 05 21:46:10 crc kubenswrapper[4904]: E1205 21:46:10.229532 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7929ee32-233e-4989-87e4-e7259266e3f5" containerName="container-00" Dec 05 21:46:10 crc kubenswrapper[4904]: I1205 21:46:10.229544 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="7929ee32-233e-4989-87e4-e7259266e3f5" containerName="container-00" Dec 05 21:46:10 crc kubenswrapper[4904]: I1205 21:46:10.229768 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="7929ee32-233e-4989-87e4-e7259266e3f5" containerName="container-00" Dec 05 21:46:10 crc kubenswrapper[4904]: I1205 21:46:10.230504 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pspg6/crc-debug-2qdp7" Dec 05 21:46:10 crc kubenswrapper[4904]: I1205 21:46:10.247566 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9db52630-7e7c-4b1c-a930-2761213f2a61-host\") pod \"crc-debug-2qdp7\" (UID: \"9db52630-7e7c-4b1c-a930-2761213f2a61\") " pod="openshift-must-gather-pspg6/crc-debug-2qdp7" Dec 05 21:46:10 crc kubenswrapper[4904]: I1205 21:46:10.247620 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzpvc\" (UniqueName: \"kubernetes.io/projected/9db52630-7e7c-4b1c-a930-2761213f2a61-kube-api-access-hzpvc\") pod \"crc-debug-2qdp7\" (UID: \"9db52630-7e7c-4b1c-a930-2761213f2a61\") " pod="openshift-must-gather-pspg6/crc-debug-2qdp7" Dec 05 21:46:10 crc kubenswrapper[4904]: I1205 21:46:10.348815 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9db52630-7e7c-4b1c-a930-2761213f2a61-host\") pod \"crc-debug-2qdp7\" (UID: \"9db52630-7e7c-4b1c-a930-2761213f2a61\") " pod="openshift-must-gather-pspg6/crc-debug-2qdp7" Dec 05 21:46:10 crc kubenswrapper[4904]: I1205 21:46:10.348870 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzpvc\" (UniqueName: \"kubernetes.io/projected/9db52630-7e7c-4b1c-a930-2761213f2a61-kube-api-access-hzpvc\") pod \"crc-debug-2qdp7\" (UID: \"9db52630-7e7c-4b1c-a930-2761213f2a61\") " pod="openshift-must-gather-pspg6/crc-debug-2qdp7" Dec 05 21:46:10 crc kubenswrapper[4904]: I1205 21:46:10.348965 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9db52630-7e7c-4b1c-a930-2761213f2a61-host\") pod \"crc-debug-2qdp7\" (UID: \"9db52630-7e7c-4b1c-a930-2761213f2a61\") " pod="openshift-must-gather-pspg6/crc-debug-2qdp7" Dec 05 21:46:10 crc kubenswrapper[4904]: I1205 21:46:10.372632 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzpvc\" (UniqueName: \"kubernetes.io/projected/9db52630-7e7c-4b1c-a930-2761213f2a61-kube-api-access-hzpvc\") pod \"crc-debug-2qdp7\" (UID: \"9db52630-7e7c-4b1c-a930-2761213f2a61\") " pod="openshift-must-gather-pspg6/crc-debug-2qdp7" Dec 05 21:46:10 crc kubenswrapper[4904]: I1205 21:46:10.545834 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pspg6/crc-debug-2qdp7" Dec 05 21:46:10 crc kubenswrapper[4904]: I1205 21:46:10.885693 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pspg6/crc-debug-2qdp7" event={"ID":"9db52630-7e7c-4b1c-a930-2761213f2a61","Type":"ContainerStarted","Data":"ff656299509e3067fc5f945c84a87a33c4c2f12c0b3019acac541229f9e6d72c"} Dec 05 21:46:10 crc kubenswrapper[4904]: I1205 21:46:10.885962 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pspg6/crc-debug-2qdp7" event={"ID":"9db52630-7e7c-4b1c-a930-2761213f2a61","Type":"ContainerStarted","Data":"731ce1dea66f59724b93a9a0bfe450930b5808bca2b05af525092d4e52d3eb00"} Dec 05 21:46:10 crc kubenswrapper[4904]: I1205 21:46:10.921309 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-pspg6/crc-debug-2qdp7" podStartSLOduration=0.921258768 podStartE2EDuration="921.258768ms" podCreationTimestamp="2025-12-05 21:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:46:10.906476596 +0000 UTC m=+5669.717692705" watchObservedRunningTime="2025-12-05 21:46:10.921258768 +0000 UTC m=+5669.732474887" Dec 05 21:46:11 crc kubenswrapper[4904]: I1205 21:46:11.899194 4904 generic.go:334] "Generic (PLEG): container finished" podID="9db52630-7e7c-4b1c-a930-2761213f2a61" containerID="ff656299509e3067fc5f945c84a87a33c4c2f12c0b3019acac541229f9e6d72c" exitCode=0 Dec 05 21:46:11 crc kubenswrapper[4904]: I1205 21:46:11.899237 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pspg6/crc-debug-2qdp7" event={"ID":"9db52630-7e7c-4b1c-a930-2761213f2a61","Type":"ContainerDied","Data":"ff656299509e3067fc5f945c84a87a33c4c2f12c0b3019acac541229f9e6d72c"} Dec 05 21:46:13 crc kubenswrapper[4904]: I1205 21:46:13.001227 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pspg6/crc-debug-2qdp7" Dec 05 21:46:13 crc kubenswrapper[4904]: I1205 21:46:13.112438 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzpvc\" (UniqueName: \"kubernetes.io/projected/9db52630-7e7c-4b1c-a930-2761213f2a61-kube-api-access-hzpvc\") pod \"9db52630-7e7c-4b1c-a930-2761213f2a61\" (UID: \"9db52630-7e7c-4b1c-a930-2761213f2a61\") " Dec 05 21:46:13 crc kubenswrapper[4904]: I1205 21:46:13.112547 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9db52630-7e7c-4b1c-a930-2761213f2a61-host\") pod \"9db52630-7e7c-4b1c-a930-2761213f2a61\" (UID: \"9db52630-7e7c-4b1c-a930-2761213f2a61\") " Dec 05 21:46:13 crc kubenswrapper[4904]: I1205 21:46:13.113380 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9db52630-7e7c-4b1c-a930-2761213f2a61-host" (OuterVolumeSpecName: "host") pod "9db52630-7e7c-4b1c-a930-2761213f2a61" (UID: "9db52630-7e7c-4b1c-a930-2761213f2a61"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 21:46:13 crc kubenswrapper[4904]: I1205 21:46:13.133912 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9db52630-7e7c-4b1c-a930-2761213f2a61-kube-api-access-hzpvc" (OuterVolumeSpecName: "kube-api-access-hzpvc") pod "9db52630-7e7c-4b1c-a930-2761213f2a61" (UID: "9db52630-7e7c-4b1c-a930-2761213f2a61"). 
InnerVolumeSpecName "kube-api-access-hzpvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:46:13 crc kubenswrapper[4904]: I1205 21:46:13.214936 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzpvc\" (UniqueName: \"kubernetes.io/projected/9db52630-7e7c-4b1c-a930-2761213f2a61-kube-api-access-hzpvc\") on node \"crc\" DevicePath \"\"" Dec 05 21:46:13 crc kubenswrapper[4904]: I1205 21:46:13.214992 4904 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9db52630-7e7c-4b1c-a930-2761213f2a61-host\") on node \"crc\" DevicePath \"\"" Dec 05 21:46:13 crc kubenswrapper[4904]: I1205 21:46:13.386126 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pspg6/crc-debug-2qdp7"] Dec 05 21:46:13 crc kubenswrapper[4904]: I1205 21:46:13.395355 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pspg6/crc-debug-2qdp7"] Dec 05 21:46:13 crc kubenswrapper[4904]: I1205 21:46:13.697892 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9db52630-7e7c-4b1c-a930-2761213f2a61" path="/var/lib/kubelet/pods/9db52630-7e7c-4b1c-a930-2761213f2a61/volumes" Dec 05 21:46:13 crc kubenswrapper[4904]: I1205 21:46:13.916737 4904 scope.go:117] "RemoveContainer" containerID="ff656299509e3067fc5f945c84a87a33c4c2f12c0b3019acac541229f9e6d72c" Dec 05 21:46:13 crc kubenswrapper[4904]: I1205 21:46:13.916790 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pspg6/crc-debug-2qdp7" Dec 05 21:46:14 crc kubenswrapper[4904]: I1205 21:46:14.578147 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pspg6/crc-debug-lgtdv"] Dec 05 21:46:14 crc kubenswrapper[4904]: E1205 21:46:14.578700 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9db52630-7e7c-4b1c-a930-2761213f2a61" containerName="container-00" Dec 05 21:46:14 crc kubenswrapper[4904]: I1205 21:46:14.578715 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="9db52630-7e7c-4b1c-a930-2761213f2a61" containerName="container-00" Dec 05 21:46:14 crc kubenswrapper[4904]: I1205 21:46:14.579002 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="9db52630-7e7c-4b1c-a930-2761213f2a61" containerName="container-00" Dec 05 21:46:14 crc kubenswrapper[4904]: I1205 21:46:14.579686 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pspg6/crc-debug-lgtdv" Dec 05 21:46:14 crc kubenswrapper[4904]: I1205 21:46:14.642112 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d2d70883-224c-4420-bcac-d8fbb4888182-host\") pod \"crc-debug-lgtdv\" (UID: \"d2d70883-224c-4420-bcac-d8fbb4888182\") " pod="openshift-must-gather-pspg6/crc-debug-lgtdv" Dec 05 21:46:14 crc kubenswrapper[4904]: I1205 21:46:14.642250 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b25c\" (UniqueName: \"kubernetes.io/projected/d2d70883-224c-4420-bcac-d8fbb4888182-kube-api-access-9b25c\") pod \"crc-debug-lgtdv\" (UID: \"d2d70883-224c-4420-bcac-d8fbb4888182\") " pod="openshift-must-gather-pspg6/crc-debug-lgtdv" Dec 05 21:46:14 crc kubenswrapper[4904]: I1205 21:46:14.743591 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d2d70883-224c-4420-bcac-d8fbb4888182-host\") pod \"crc-debug-lgtdv\" (UID: \"d2d70883-224c-4420-bcac-d8fbb4888182\") " pod="openshift-must-gather-pspg6/crc-debug-lgtdv" Dec 05 21:46:14 crc kubenswrapper[4904]: I1205 21:46:14.743804 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9b25c\" (UniqueName: \"kubernetes.io/projected/d2d70883-224c-4420-bcac-d8fbb4888182-kube-api-access-9b25c\") pod \"crc-debug-lgtdv\" (UID: \"d2d70883-224c-4420-bcac-d8fbb4888182\") " pod="openshift-must-gather-pspg6/crc-debug-lgtdv" Dec 05 21:46:14 crc kubenswrapper[4904]: I1205 21:46:14.744610 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d2d70883-224c-4420-bcac-d8fbb4888182-host\") pod \"crc-debug-lgtdv\" (UID: \"d2d70883-224c-4420-bcac-d8fbb4888182\") " pod="openshift-must-gather-pspg6/crc-debug-lgtdv" Dec 05 21:46:14 crc kubenswrapper[4904]: I1205 21:46:14.780925 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9b25c\" (UniqueName: \"kubernetes.io/projected/d2d70883-224c-4420-bcac-d8fbb4888182-kube-api-access-9b25c\") pod \"crc-debug-lgtdv\" (UID: \"d2d70883-224c-4420-bcac-d8fbb4888182\") " pod="openshift-must-gather-pspg6/crc-debug-lgtdv" Dec 05 21:46:14 crc kubenswrapper[4904]: I1205 21:46:14.896044 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pspg6/crc-debug-lgtdv" Dec 05 21:46:14 crc kubenswrapper[4904]: W1205 21:46:14.940457 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2d70883_224c_4420_bcac_d8fbb4888182.slice/crio-97ac14b28828d576aaa4d17e77379acee28dc9f9566cdc3e30c1cda866999ee0 WatchSource:0}: Error finding container 97ac14b28828d576aaa4d17e77379acee28dc9f9566cdc3e30c1cda866999ee0: Status 404 returned error can't find the container with id 97ac14b28828d576aaa4d17e77379acee28dc9f9566cdc3e30c1cda866999ee0 Dec 05 21:46:15 crc kubenswrapper[4904]: I1205 21:46:15.969925 4904 generic.go:334] "Generic (PLEG): container finished" podID="d2d70883-224c-4420-bcac-d8fbb4888182" containerID="e02a7b562e55101760605adb1dffb334e3ac50af5e1ce32487dc087da4198e0e" exitCode=0 Dec 05 21:46:15 crc kubenswrapper[4904]: I1205 21:46:15.970015 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pspg6/crc-debug-lgtdv" event={"ID":"d2d70883-224c-4420-bcac-d8fbb4888182","Type":"ContainerDied","Data":"e02a7b562e55101760605adb1dffb334e3ac50af5e1ce32487dc087da4198e0e"} Dec 05 21:46:15 crc kubenswrapper[4904]: I1205 21:46:15.971490 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pspg6/crc-debug-lgtdv" event={"ID":"d2d70883-224c-4420-bcac-d8fbb4888182","Type":"ContainerStarted","Data":"97ac14b28828d576aaa4d17e77379acee28dc9f9566cdc3e30c1cda866999ee0"} Dec 05 21:46:16 crc kubenswrapper[4904]: I1205 21:46:16.025029 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pspg6/crc-debug-lgtdv"] Dec 05 21:46:16 crc kubenswrapper[4904]: I1205 21:46:16.036495 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pspg6/crc-debug-lgtdv"] Dec 05 21:46:17 crc kubenswrapper[4904]: I1205 21:46:17.102675 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pspg6/crc-debug-lgtdv" Dec 05 21:46:17 crc kubenswrapper[4904]: I1205 21:46:17.196660 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d2d70883-224c-4420-bcac-d8fbb4888182-host\") pod \"d2d70883-224c-4420-bcac-d8fbb4888182\" (UID: \"d2d70883-224c-4420-bcac-d8fbb4888182\") " Dec 05 21:46:17 crc kubenswrapper[4904]: I1205 21:46:17.196773 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9b25c\" (UniqueName: \"kubernetes.io/projected/d2d70883-224c-4420-bcac-d8fbb4888182-kube-api-access-9b25c\") pod \"d2d70883-224c-4420-bcac-d8fbb4888182\" (UID: \"d2d70883-224c-4420-bcac-d8fbb4888182\") " Dec 05 21:46:17 crc kubenswrapper[4904]: I1205 21:46:17.196781 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d2d70883-224c-4420-bcac-d8fbb4888182-host" (OuterVolumeSpecName: "host") pod "d2d70883-224c-4420-bcac-d8fbb4888182" (UID: "d2d70883-224c-4420-bcac-d8fbb4888182"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 21:46:17 crc kubenswrapper[4904]: I1205 21:46:17.197256 4904 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d2d70883-224c-4420-bcac-d8fbb4888182-host\") on node \"crc\" DevicePath \"\"" Dec 05 21:46:17 crc kubenswrapper[4904]: I1205 21:46:17.203510 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2d70883-224c-4420-bcac-d8fbb4888182-kube-api-access-9b25c" (OuterVolumeSpecName: "kube-api-access-9b25c") pod "d2d70883-224c-4420-bcac-d8fbb4888182" (UID: "d2d70883-224c-4420-bcac-d8fbb4888182"). InnerVolumeSpecName "kube-api-access-9b25c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:46:17 crc kubenswrapper[4904]: I1205 21:46:17.299252 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9b25c\" (UniqueName: \"kubernetes.io/projected/d2d70883-224c-4420-bcac-d8fbb4888182-kube-api-access-9b25c\") on node \"crc\" DevicePath \"\"" Dec 05 21:46:17 crc kubenswrapper[4904]: I1205 21:46:17.692327 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2d70883-224c-4420-bcac-d8fbb4888182" path="/var/lib/kubelet/pods/d2d70883-224c-4420-bcac-d8fbb4888182/volumes" Dec 05 21:46:17 crc kubenswrapper[4904]: I1205 21:46:17.993073 4904 scope.go:117] "RemoveContainer" containerID="e02a7b562e55101760605adb1dffb334e3ac50af5e1ce32487dc087da4198e0e" Dec 05 21:46:17 crc kubenswrapper[4904]: I1205 21:46:17.993183 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pspg6/crc-debug-lgtdv" Dec 05 21:46:41 crc kubenswrapper[4904]: I1205 21:46:41.743981 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-66d5cfdcdd-d59d6_f786b88c-ea37-4a02-bdd1-1f9feca9993a/barbican-api/0.log" Dec 05 21:46:41 crc kubenswrapper[4904]: I1205 21:46:41.869568 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-66d5cfdcdd-d59d6_f786b88c-ea37-4a02-bdd1-1f9feca9993a/barbican-api-log/0.log" Dec 05 21:46:41 crc kubenswrapper[4904]: I1205 21:46:41.955476 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-8fb8b568-c7bl2_83ab1170-b41e-4a13-b227-8b67b86587cc/barbican-keystone-listener/0.log" Dec 05 21:46:42 crc kubenswrapper[4904]: I1205 21:46:42.106963 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-8fb8b568-c7bl2_83ab1170-b41e-4a13-b227-8b67b86587cc/barbican-keystone-listener-log/0.log" Dec 05 21:46:42 crc kubenswrapper[4904]: I1205 21:46:42.176137 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6d8548b4f5-cmq7k_7af9740e-f05b-4d6d-9075-b7038018de84/barbican-worker/0.log" Dec 05 21:46:42 crc kubenswrapper[4904]: I1205 21:46:42.222374 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6d8548b4f5-cmq7k_7af9740e-f05b-4d6d-9075-b7038018de84/barbican-worker-log/0.log" Dec 05 21:46:42 crc kubenswrapper[4904]: I1205 21:46:42.427319 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-4xhb8_2277bc86-3475-44fd-a77d-9a2f552bb457/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:46:42 crc kubenswrapper[4904]: I1205 21:46:42.764353 4904 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_6fc4fd93-36e9-448b-88ec-b4c4227c941c/ceilometer-central-agent/0.log" Dec 05 21:46:42 crc kubenswrapper[4904]: I1205 21:46:42.793592 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6fc4fd93-36e9-448b-88ec-b4c4227c941c/proxy-httpd/0.log" Dec 05 21:46:42 crc kubenswrapper[4904]: I1205 21:46:42.851354 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6fc4fd93-36e9-448b-88ec-b4c4227c941c/sg-core/0.log" Dec 05 21:46:42 crc kubenswrapper[4904]: I1205 21:46:42.909773 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6fc4fd93-36e9-448b-88ec-b4c4227c941c/ceilometer-notification-agent/0.log" Dec 05 21:46:43 crc kubenswrapper[4904]: I1205 21:46:43.164428 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_dadf2169-1b54-4e50-adff-71504b526259/cinder-api-log/0.log" Dec 05 21:46:43 crc kubenswrapper[4904]: I1205 21:46:43.393935 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_dadf2169-1b54-4e50-adff-71504b526259/cinder-api/0.log" Dec 05 21:46:43 crc kubenswrapper[4904]: I1205 21:46:43.469988 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_cdca5ac3-4ef8-43e3-8244-438e43e029c4/probe/0.log" Dec 05 21:46:43 crc kubenswrapper[4904]: I1205 21:46:43.587792 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_cdca5ac3-4ef8-43e3-8244-438e43e029c4/cinder-backup/0.log" Dec 05 21:46:43 crc kubenswrapper[4904]: I1205 21:46:43.688694 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_60174921-7963-4d63-83e7-8b702d3e9dd2/cinder-scheduler/0.log" Dec 05 21:46:43 crc kubenswrapper[4904]: I1205 21:46:43.767775 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_60174921-7963-4d63-83e7-8b702d3e9dd2/probe/0.log" Dec 05 21:46:43 crc kubenswrapper[4904]: I1205 21:46:43.981673 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_b2237f58-91be-4eae-9feb-94feacffd4a6/cinder-volume/0.log" Dec 05 21:46:43 crc kubenswrapper[4904]: I1205 21:46:43.983792 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_b2237f58-91be-4eae-9feb-94feacffd4a6/probe/0.log" Dec 05 21:46:44 crc kubenswrapper[4904]: I1205 21:46:44.303663 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_b3d4e158-45e3-4448-ad27-36e4aa3cb002/probe/0.log" Dec 05 21:46:44 crc kubenswrapper[4904]: I1205 21:46:44.308889 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_b3d4e158-45e3-4448-ad27-36e4aa3cb002/cinder-volume/0.log" Dec 05 21:46:44 crc kubenswrapper[4904]: I1205 21:46:44.436827 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-jvqvx_676e3b5b-34d1-47bc-a1db-3bb15a83282b/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:46:44 crc kubenswrapper[4904]: I1205 21:46:44.520558 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-ddmgb_d1b27fa3-93b0-4c28-9158-8de58adc4799/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:46:44 crc kubenswrapper[4904]: I1205 21:46:44.645699 4904 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-b7c6b4c7-wk76m_0aa075ff-9799-456e-b08e-5146d8e11c06/init/0.log" Dec 05 21:46:44 crc kubenswrapper[4904]: I1205 21:46:44.902437 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-s8jb4_2b8cd007-58c8-4fd2-924d-a8d24608ff6c/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:46:44 crc kubenswrapper[4904]: I1205 21:46:44.925998 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-b7c6b4c7-wk76m_0aa075ff-9799-456e-b08e-5146d8e11c06/init/0.log" Dec 05 21:46:45 crc kubenswrapper[4904]: I1205 21:46:45.054381 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-b7c6b4c7-wk76m_0aa075ff-9799-456e-b08e-5146d8e11c06/dnsmasq-dns/0.log" Dec 05 21:46:45 crc kubenswrapper[4904]: I1205 21:46:45.141514 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e/glance-log/0.log" Dec 05 21:46:45 crc kubenswrapper[4904]: I1205 21:46:45.182946 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e/glance-httpd/0.log" Dec 05 21:46:45 crc kubenswrapper[4904]: I1205 21:46:45.380087 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c1b2cda4-e597-4027-b9b1-cf52ec98dcb8/glance-log/0.log" Dec 05 21:46:45 crc kubenswrapper[4904]: I1205 21:46:45.388323 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c1b2cda4-e597-4027-b9b1-cf52ec98dcb8/glance-httpd/0.log" Dec 05 21:46:45 crc kubenswrapper[4904]: I1205 21:46:45.705019 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-d867d46cb-9mdx2_ffe3f1f4-8d49-4bf8-a088-e3a930ddc614/horizon/0.log" Dec 05 21:46:45 crc kubenswrapper[4904]: I1205 21:46:45.778734 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-28gf8_9edc6269-cd10-4724-9cf3-9b65c80ab8d9/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:46:45 crc kubenswrapper[4904]: I1205 21:46:45.988004 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-ww8dd_1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:46:46 crc kubenswrapper[4904]: I1205 21:46:46.226340 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29416141-5f2p7_c392ddc8-695b-4543-a7f4-05ad75ff272b/keystone-cron/0.log" Dec 05 21:46:46 crc kubenswrapper[4904]: I1205 21:46:46.379211 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_cd4e564c-e066-405d-92e5-1d312bfd1f57/kube-state-metrics/0.log" Dec 05 21:46:46 crc kubenswrapper[4904]: I1205 21:46:46.424542 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-d867d46cb-9mdx2_ffe3f1f4-8d49-4bf8-a088-e3a930ddc614/horizon-log/0.log" Dec 05 21:46:46 crc kubenswrapper[4904]: I1205 21:46:46.621256 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-64fdbdb744-gdc8l_2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a/keystone-api/0.log" Dec 05 21:46:46 crc kubenswrapper[4904]: I1205 21:46:46.700525 4904 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-5548g_11479a0b-d4c6-4770-bdf1-dbb8a417384d/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:46:47 crc kubenswrapper[4904]: I1205 21:46:47.119014 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-78cf6bb7c7-zmmjl_9f1be3e8-cbd0-45cb-acb6-cf56217cec07/neutron-httpd/0.log" Dec 05 21:46:47 crc kubenswrapper[4904]: I1205 21:46:47.230578 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-78cf6bb7c7-zmmjl_9f1be3e8-cbd0-45cb-acb6-cf56217cec07/neutron-api/0.log" Dec 05 21:46:47 crc kubenswrapper[4904]: I1205 21:46:47.348661 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr_aee65db4-60e1-4a83-80d0-81e90f6f4f07/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:46:47 crc kubenswrapper[4904]: I1205 21:46:47.951715 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_8538bb66-3a1f-40a2-bf64-b04b49318d34/nova-cell0-conductor-conductor/0.log" Dec 05 21:46:48 crc kubenswrapper[4904]: I1205 21:46:48.344089 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_a3f6c470-722e-4a05-a450-e65791498b79/nova-cell1-conductor-conductor/0.log" Dec 05 21:46:48 crc kubenswrapper[4904]: I1205 21:46:48.515314 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_8da8babf-f15a-4f70-bf70-23218ca0628a/nova-cell1-novncproxy-novncproxy/0.log" Dec 05 21:46:48 crc kubenswrapper[4904]: I1205 21:46:48.811193 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-6jljb_d9ee9825-e991-495c-bbd8-30ae1e7b0780/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:46:48 crc kubenswrapper[4904]: I1205 21:46:48.985984 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_be6a8cc9-1fdc-4dea-a3b3-e0a13cd882b0/nova-api-log/0.log" Dec 05 21:46:49 crc kubenswrapper[4904]: I1205 21:46:49.183612 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_eeb7ce1d-7b77-4a2f-9e87-99569f95995d/nova-metadata-log/0.log" Dec 05 21:46:49 crc kubenswrapper[4904]: I1205 21:46:49.378131 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_be6a8cc9-1fdc-4dea-a3b3-e0a13cd882b0/nova-api-api/0.log" Dec 05 21:46:49 crc kubenswrapper[4904]: I1205 21:46:49.710927 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_5ce87dae-66a4-4d60-bd13-3ac6a44abeed/nova-scheduler-scheduler/0.log" Dec 05 21:46:50 crc kubenswrapper[4904]: I1205 21:46:50.193740 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_3437a493-1ffe-49dc-a789-3451b2f87204/mysql-bootstrap/0.log" Dec 05 21:46:50 crc kubenswrapper[4904]: I1205 21:46:50.193947 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_3437a493-1ffe-49dc-a789-3451b2f87204/mysql-bootstrap/0.log" Dec 05 21:46:50 crc kubenswrapper[4904]: I1205 21:46:50.428721 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e08506d4-1ca7-4932-b73f-21020cb20578/mysql-bootstrap/0.log" Dec 05 21:46:50 crc kubenswrapper[4904]: I1205 21:46:50.488484 4904 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-cell1-galera-0_3437a493-1ffe-49dc-a789-3451b2f87204/galera/0.log" Dec 05 21:46:50 crc kubenswrapper[4904]: I1205 21:46:50.785262 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e08506d4-1ca7-4932-b73f-21020cb20578/mysql-bootstrap/0.log" Dec 05 21:46:50 crc kubenswrapper[4904]: I1205 21:46:50.808716 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e08506d4-1ca7-4932-b73f-21020cb20578/galera/0.log" Dec 05 21:46:51 crc kubenswrapper[4904]: I1205 21:46:51.079299 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_c8a50daf-6f3c-405d-b047-12a11ac0b56b/openstackclient/0.log" Dec 05 21:46:51 crc kubenswrapper[4904]: I1205 21:46:51.083930 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-hlxgq_41ef8df6-e0e1-45a5-954d-10ce99fa26de/ovn-controller/0.log" Dec 05 21:46:51 crc kubenswrapper[4904]: I1205 21:46:51.520422 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-t974s_e7529566-97c3-42bf-a66c-1186aec23176/openstack-network-exporter/0.log" Dec 05 21:46:51 crc kubenswrapper[4904]: I1205 21:46:51.716100 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-h7djt_3db65f91-e650-49d5-b372-cabc44efff3f/ovsdb-server-init/0.log" Dec 05 21:46:51 crc kubenswrapper[4904]: I1205 21:46:51.851571 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-h7djt_3db65f91-e650-49d5-b372-cabc44efff3f/ovsdb-server-init/0.log" Dec 05 21:46:51 crc kubenswrapper[4904]: I1205 21:46:51.968479 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-h7djt_3db65f91-e650-49d5-b372-cabc44efff3f/ovsdb-server/0.log" Dec 05 21:46:51 crc kubenswrapper[4904]: I1205 21:46:51.988430 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_eeb7ce1d-7b77-4a2f-9e87-99569f95995d/nova-metadata-metadata/0.log" Dec 05 21:46:52 crc kubenswrapper[4904]: I1205 21:46:52.259094 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-4qvrt_e05ac5cc-a4c0-46c3-9beb-3f607156b962/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:46:52 crc kubenswrapper[4904]: I1205 21:46:52.266880 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-h7djt_3db65f91-e650-49d5-b372-cabc44efff3f/ovs-vswitchd/0.log" Dec 05 21:46:52 crc kubenswrapper[4904]: I1205 21:46:52.347000 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c15a10eb-9132-4ec6-8861-4c2320962cc3/openstack-network-exporter/0.log" Dec 05 21:46:52 crc kubenswrapper[4904]: I1205 21:46:52.441856 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c15a10eb-9132-4ec6-8861-4c2320962cc3/ovn-northd/0.log" Dec 05 21:46:52 crc kubenswrapper[4904]: I1205 21:46:52.519545 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0/openstack-network-exporter/0.log" Dec 05 21:46:52 crc kubenswrapper[4904]: I1205 21:46:52.546667 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0/ovsdbserver-nb/0.log" Dec 05 21:46:52 crc kubenswrapper[4904]: I1205 21:46:52.782135 4904 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-0_c9f105e3-0b5a-435f-bc00-fcfd7eceaafd/openstack-network-exporter/0.log" Dec 05 21:46:52 crc kubenswrapper[4904]: I1205 21:46:52.873259 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_c9f105e3-0b5a-435f-bc00-fcfd7eceaafd/ovsdbserver-sb/0.log" Dec 05 21:46:53 crc kubenswrapper[4904]: I1205 21:46:53.093283 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0/init-config-reloader/0.log" Dec 05 21:46:53 crc kubenswrapper[4904]: I1205 21:46:53.286898 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-c6d7bc49b-2f688_4879cccb-c711-458b-8ae5-895ec70f6536/placement-api/0.log" Dec 05 21:46:53 crc kubenswrapper[4904]: I1205 21:46:53.303224 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0/init-config-reloader/0.log" Dec 05 21:46:53 crc kubenswrapper[4904]: I1205 21:46:53.387251 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-c6d7bc49b-2f688_4879cccb-c711-458b-8ae5-895ec70f6536/placement-log/0.log" Dec 05 21:46:53 crc kubenswrapper[4904]: I1205 21:46:53.407011 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0/config-reloader/0.log" Dec 05 21:46:53 crc kubenswrapper[4904]: I1205 21:46:53.495250 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0/thanos-sidecar/0.log" Dec 05 21:46:53 crc kubenswrapper[4904]: I1205 21:46:53.584710 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0/prometheus/0.log" Dec 05 21:46:53 crc kubenswrapper[4904]: I1205 21:46:53.783920 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_69046049-50b5-4ced-8afa-5ef3405aad24/setup-container/0.log" Dec 05 21:46:54 crc kubenswrapper[4904]: I1205 21:46:54.000273 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_69046049-50b5-4ced-8afa-5ef3405aad24/rabbitmq/0.log" Dec 05 21:46:54 crc kubenswrapper[4904]: I1205 21:46:54.059179 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_69046049-50b5-4ced-8afa-5ef3405aad24/setup-container/0.log" Dec 05 21:46:54 crc kubenswrapper[4904]: I1205 21:46:54.067120 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_0caaad94-d02e-43da-bf3b-087a5ec8d2f8/setup-container/0.log" Dec 05 21:46:54 crc kubenswrapper[4904]: I1205 21:46:54.232774 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_0caaad94-d02e-43da-bf3b-087a5ec8d2f8/setup-container/0.log" Dec 05 21:46:54 crc kubenswrapper[4904]: I1205 21:46:54.319097 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1e45678d-877d-4c34-a8f5-913d31a8b79d/setup-container/0.log" Dec 05 21:46:54 crc kubenswrapper[4904]: I1205 21:46:54.333527 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_0caaad94-d02e-43da-bf3b-087a5ec8d2f8/rabbitmq/0.log" Dec 05 21:46:54 crc kubenswrapper[4904]: I1205 21:46:54.631507 4904 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_rabbitmq-server-0_1e45678d-877d-4c34-a8f5-913d31a8b79d/rabbitmq/0.log" Dec 05 21:46:54 crc kubenswrapper[4904]: I1205 21:46:54.638908 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1e45678d-877d-4c34-a8f5-913d31a8b79d/setup-container/0.log" Dec 05 21:46:54 crc kubenswrapper[4904]: I1205 21:46:54.835863 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-6tp6w_222edc46-4ddd-4236-8635-45b365513214/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:46:54 crc kubenswrapper[4904]: I1205 21:46:54.867034 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-tc59d_cc55dc7a-28f4-47eb-8c57-58e949a98dcc/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:46:55 crc kubenswrapper[4904]: I1205 21:46:55.107910 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-qlps8_c12b173c-79c1-4bcf-a76a-b3bc84b9b556/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:46:55 crc kubenswrapper[4904]: I1205 21:46:55.188711 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-7dbn4_88dfb504-1f6c-41bc-860e-84eeb0a7fff9/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:46:55 crc kubenswrapper[4904]: I1205 21:46:55.630577 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-g44s6_78aea92b-9deb-44c8-b5ac-f9224b038591/ssh-known-hosts-edpm-deployment/0.log" Dec 05 21:46:55 crc kubenswrapper[4904]: I1205 21:46:55.824778 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-76bc85d8c5-xr857_26484d4c-3765-4214-81e6-af49ebfde502/proxy-httpd/0.log" Dec 05 21:46:55 crc kubenswrapper[4904]: I1205 21:46:55.921167 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-9xq46_4b80a797-4212-4242-81fd-928045b629cd/swift-ring-rebalance/0.log" Dec 05 21:46:55 crc kubenswrapper[4904]: I1205 21:46:55.980076 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-76bc85d8c5-xr857_26484d4c-3765-4214-81e6-af49ebfde502/proxy-server/0.log" Dec 05 21:46:56 crc kubenswrapper[4904]: I1205 21:46:56.119530 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be5ed3b2-bc48-4865-ade5-f7c2e379a1ea/account-auditor/0.log" Dec 05 21:46:56 crc kubenswrapper[4904]: I1205 21:46:56.210305 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be5ed3b2-bc48-4865-ade5-f7c2e379a1ea/account-reaper/0.log" Dec 05 21:46:56 crc kubenswrapper[4904]: I1205 21:46:56.275791 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be5ed3b2-bc48-4865-ade5-f7c2e379a1ea/account-replicator/0.log" Dec 05 21:46:56 crc kubenswrapper[4904]: I1205 21:46:56.403922 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be5ed3b2-bc48-4865-ade5-f7c2e379a1ea/container-auditor/0.log" Dec 05 21:46:56 crc kubenswrapper[4904]: I1205 21:46:56.413982 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be5ed3b2-bc48-4865-ade5-f7c2e379a1ea/account-server/0.log" Dec 05 21:46:56 crc kubenswrapper[4904]: I1205 21:46:56.485172 4904 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_be5ed3b2-bc48-4865-ade5-f7c2e379a1ea/container-replicator/0.log" Dec 05 21:46:56 crc kubenswrapper[4904]: I1205 21:46:56.553646 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be5ed3b2-bc48-4865-ade5-f7c2e379a1ea/container-server/0.log" Dec 05 21:46:56 crc kubenswrapper[4904]: I1205 21:46:56.612516 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be5ed3b2-bc48-4865-ade5-f7c2e379a1ea/container-updater/0.log" Dec 05 21:46:56 crc kubenswrapper[4904]: I1205 21:46:56.719847 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be5ed3b2-bc48-4865-ade5-f7c2e379a1ea/object-auditor/0.log" Dec 05 21:46:56 crc kubenswrapper[4904]: I1205 21:46:56.794664 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be5ed3b2-bc48-4865-ade5-f7c2e379a1ea/object-expirer/0.log" Dec 05 21:46:56 crc kubenswrapper[4904]: I1205 21:46:56.818116 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be5ed3b2-bc48-4865-ade5-f7c2e379a1ea/object-replicator/0.log" Dec 05 21:46:56 crc kubenswrapper[4904]: I1205 21:46:56.877195 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be5ed3b2-bc48-4865-ade5-f7c2e379a1ea/object-server/0.log" Dec 05 21:46:56 crc kubenswrapper[4904]: I1205 21:46:56.940296 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be5ed3b2-bc48-4865-ade5-f7c2e379a1ea/object-updater/0.log" Dec 05 21:46:57 crc kubenswrapper[4904]: I1205 21:46:57.080817 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be5ed3b2-bc48-4865-ade5-f7c2e379a1ea/swift-recon-cron/0.log" Dec 05 21:46:57 crc kubenswrapper[4904]: I1205 21:46:57.089233 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be5ed3b2-bc48-4865-ade5-f7c2e379a1ea/rsync/0.log" Dec 05 21:46:57 crc kubenswrapper[4904]: I1205 21:46:57.208537 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k_adfcfa36-4dfd-422c-b1a5-3a2e342ea208/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:46:57 crc kubenswrapper[4904]: I1205 21:46:57.368409 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_45f624ec-9d5e-41f1-ba5b-e81c2b84c532/tempest-tests-tempest-tests-runner/0.log" Dec 05 21:46:57 crc kubenswrapper[4904]: I1205 21:46:57.571619 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_31c70021-e6df-4ac6-b79e-402f24a13112/test-operator-logs-container/0.log" Dec 05 21:46:57 crc kubenswrapper[4904]: I1205 21:46:57.575376 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_980ec67d-9dc9-4cae-8169-9890a40d65c3/memcached/0.log" Dec 05 21:46:57 crc kubenswrapper[4904]: I1205 21:46:57.655618 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-qg8xq_192437e7-c4fd-4142-94fb-e3f2a9c75841/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:46:58 crc kubenswrapper[4904]: I1205 21:46:58.316897 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_48f600c5-7f88-4c27-a152-d79354212532/watcher-applier/0.log" Dec 05 21:46:59 crc kubenswrapper[4904]: 
I1205 21:46:59.029707 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_f7be275a-3638-4279-a90b-8ad43e931ee6/watcher-api-log/0.log" Dec 05 21:47:01 crc kubenswrapper[4904]: I1205 21:47:01.157323 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_e610af6c-e57f-4676-95ec-b8accd64aea9/watcher-decision-engine/0.log" Dec 05 21:47:01 crc kubenswrapper[4904]: I1205 21:47:01.998924 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_f7be275a-3638-4279-a90b-8ad43e931ee6/watcher-api/0.log" Dec 05 21:47:06 crc kubenswrapper[4904]: I1205 21:47:06.630336 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-n7glz"] Dec 05 21:47:06 crc kubenswrapper[4904]: E1205 21:47:06.631264 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2d70883-224c-4420-bcac-d8fbb4888182" containerName="container-00" Dec 05 21:47:06 crc kubenswrapper[4904]: I1205 21:47:06.631278 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2d70883-224c-4420-bcac-d8fbb4888182" containerName="container-00" Dec 05 21:47:06 crc kubenswrapper[4904]: I1205 21:47:06.631478 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2d70883-224c-4420-bcac-d8fbb4888182" containerName="container-00" Dec 05 21:47:06 crc kubenswrapper[4904]: I1205 21:47:06.632792 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n7glz" Dec 05 21:47:06 crc kubenswrapper[4904]: I1205 21:47:06.648259 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n7glz"] Dec 05 21:47:06 crc kubenswrapper[4904]: I1205 21:47:06.780210 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a6e9ad1-9a61-4a2a-b103-a160d20a92d1-catalog-content\") pod \"certified-operators-n7glz\" (UID: \"6a6e9ad1-9a61-4a2a-b103-a160d20a92d1\") " pod="openshift-marketplace/certified-operators-n7glz" Dec 05 21:47:06 crc kubenswrapper[4904]: I1205 21:47:06.780342 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl7sn\" (UniqueName: \"kubernetes.io/projected/6a6e9ad1-9a61-4a2a-b103-a160d20a92d1-kube-api-access-zl7sn\") pod \"certified-operators-n7glz\" (UID: \"6a6e9ad1-9a61-4a2a-b103-a160d20a92d1\") " pod="openshift-marketplace/certified-operators-n7glz" Dec 05 21:47:06 crc kubenswrapper[4904]: I1205 21:47:06.780492 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a6e9ad1-9a61-4a2a-b103-a160d20a92d1-utilities\") pod \"certified-operators-n7glz\" (UID: \"6a6e9ad1-9a61-4a2a-b103-a160d20a92d1\") " pod="openshift-marketplace/certified-operators-n7glz" Dec 05 21:47:06 crc kubenswrapper[4904]: I1205 21:47:06.882132 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl7sn\" (UniqueName: \"kubernetes.io/projected/6a6e9ad1-9a61-4a2a-b103-a160d20a92d1-kube-api-access-zl7sn\") pod \"certified-operators-n7glz\" (UID: \"6a6e9ad1-9a61-4a2a-b103-a160d20a92d1\") " pod="openshift-marketplace/certified-operators-n7glz" Dec 05 21:47:06 crc kubenswrapper[4904]: I1205 21:47:06.882324 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/6a6e9ad1-9a61-4a2a-b103-a160d20a92d1-utilities\") pod \"certified-operators-n7glz\" (UID: \"6a6e9ad1-9a61-4a2a-b103-a160d20a92d1\") " pod="openshift-marketplace/certified-operators-n7glz" Dec 05 21:47:06 crc kubenswrapper[4904]: I1205 21:47:06.882414 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a6e9ad1-9a61-4a2a-b103-a160d20a92d1-catalog-content\") pod \"certified-operators-n7glz\" (UID: \"6a6e9ad1-9a61-4a2a-b103-a160d20a92d1\") " pod="openshift-marketplace/certified-operators-n7glz" Dec 05 21:47:06 crc kubenswrapper[4904]: I1205 21:47:06.882812 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a6e9ad1-9a61-4a2a-b103-a160d20a92d1-utilities\") pod \"certified-operators-n7glz\" (UID: \"6a6e9ad1-9a61-4a2a-b103-a160d20a92d1\") " pod="openshift-marketplace/certified-operators-n7glz" Dec 05 21:47:06 crc kubenswrapper[4904]: I1205 21:47:06.882919 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a6e9ad1-9a61-4a2a-b103-a160d20a92d1-catalog-content\") pod \"certified-operators-n7glz\" (UID: \"6a6e9ad1-9a61-4a2a-b103-a160d20a92d1\") " pod="openshift-marketplace/certified-operators-n7glz" Dec 05 21:47:06 crc kubenswrapper[4904]: I1205 21:47:06.906033 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl7sn\" (UniqueName: \"kubernetes.io/projected/6a6e9ad1-9a61-4a2a-b103-a160d20a92d1-kube-api-access-zl7sn\") pod \"certified-operators-n7glz\" (UID: \"6a6e9ad1-9a61-4a2a-b103-a160d20a92d1\") " pod="openshift-marketplace/certified-operators-n7glz" Dec 05 21:47:06 crc kubenswrapper[4904]: I1205 21:47:06.958837 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n7glz" Dec 05 21:47:07 crc kubenswrapper[4904]: I1205 21:47:07.482638 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n7glz"] Dec 05 21:47:08 crc kubenswrapper[4904]: I1205 21:47:08.488586 4904 generic.go:334] "Generic (PLEG): container finished" podID="6a6e9ad1-9a61-4a2a-b103-a160d20a92d1" containerID="8db10414d5854c10dc28d7fa251cc649118ac15046d044f40b850edce8f34887" exitCode=0 Dec 05 21:47:08 crc kubenswrapper[4904]: I1205 21:47:08.488778 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n7glz" event={"ID":"6a6e9ad1-9a61-4a2a-b103-a160d20a92d1","Type":"ContainerDied","Data":"8db10414d5854c10dc28d7fa251cc649118ac15046d044f40b850edce8f34887"} Dec 05 21:47:08 crc kubenswrapper[4904]: I1205 21:47:08.488916 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n7glz" event={"ID":"6a6e9ad1-9a61-4a2a-b103-a160d20a92d1","Type":"ContainerStarted","Data":"e6188d478c267206b4fc455c6aeea8354eeb154692bc0d5bc4caa15392e4b04d"} Dec 05 21:47:10 crc kubenswrapper[4904]: I1205 21:47:10.510383 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n7glz" event={"ID":"6a6e9ad1-9a61-4a2a-b103-a160d20a92d1","Type":"ContainerStarted","Data":"26d80a6644d1c55368d4829cb07c80ce1a95dbace0f7b3e1d67c3d2c20920ef5"} Dec 05 21:47:11 crc kubenswrapper[4904]: I1205 21:47:11.522735 4904 generic.go:334] "Generic (PLEG): container finished" podID="6a6e9ad1-9a61-4a2a-b103-a160d20a92d1" containerID="26d80a6644d1c55368d4829cb07c80ce1a95dbace0f7b3e1d67c3d2c20920ef5" exitCode=0 Dec 05 21:47:11 crc kubenswrapper[4904]: I1205 21:47:11.522792 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n7glz" event={"ID":"6a6e9ad1-9a61-4a2a-b103-a160d20a92d1","Type":"ContainerDied","Data":"26d80a6644d1c55368d4829cb07c80ce1a95dbace0f7b3e1d67c3d2c20920ef5"} Dec 05 21:47:13 crc kubenswrapper[4904]: I1205 21:47:13.565752 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n7glz" event={"ID":"6a6e9ad1-9a61-4a2a-b103-a160d20a92d1","Type":"ContainerStarted","Data":"4be5153d53c0a3563423e63353170de6cba4e1bb4e47f990b1804acd9e1c0c84"} Dec 05 21:47:13 crc kubenswrapper[4904]: I1205 21:47:13.604306 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-n7glz" podStartSLOduration=3.70051685 podStartE2EDuration="7.603034343s" podCreationTimestamp="2025-12-05 21:47:06 +0000 UTC" firstStartedPulling="2025-12-05 21:47:08.490676186 +0000 UTC m=+5727.301892295" lastFinishedPulling="2025-12-05 21:47:12.393193679 +0000 UTC m=+5731.204409788" observedRunningTime="2025-12-05 21:47:13.598205982 +0000 UTC m=+5732.409422111" watchObservedRunningTime="2025-12-05 21:47:13.603034343 +0000 UTC m=+5732.414250442" Dec 05 21:47:16 crc kubenswrapper[4904]: I1205 21:47:16.959847 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-n7glz" Dec 05 21:47:16 crc kubenswrapper[4904]: I1205 21:47:16.960489 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-n7glz" Dec 05 21:47:17 crc kubenswrapper[4904]: I1205 21:47:17.022296 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-n7glz" Dec 05 21:47:17 crc kubenswrapper[4904]: I1205 21:47:17.644777 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-n7glz" Dec 05 21:47:17 crc kubenswrapper[4904]: I1205 21:47:17.697103 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n7glz"] Dec 05 21:47:19 crc kubenswrapper[4904]: I1205 21:47:19.615341 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-n7glz" podUID="6a6e9ad1-9a61-4a2a-b103-a160d20a92d1" containerName="registry-server" containerID="cri-o://4be5153d53c0a3563423e63353170de6cba4e1bb4e47f990b1804acd9e1c0c84" gracePeriod=2 Dec 05 21:47:20 crc kubenswrapper[4904]: I1205 21:47:20.101254 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n7glz" Dec 05 21:47:20 crc kubenswrapper[4904]: I1205 21:47:20.149425 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a6e9ad1-9a61-4a2a-b103-a160d20a92d1-catalog-content\") pod \"6a6e9ad1-9a61-4a2a-b103-a160d20a92d1\" (UID: \"6a6e9ad1-9a61-4a2a-b103-a160d20a92d1\") " Dec 05 21:47:20 crc kubenswrapper[4904]: I1205 21:47:20.149499 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zl7sn\" (UniqueName: \"kubernetes.io/projected/6a6e9ad1-9a61-4a2a-b103-a160d20a92d1-kube-api-access-zl7sn\") pod \"6a6e9ad1-9a61-4a2a-b103-a160d20a92d1\" (UID: \"6a6e9ad1-9a61-4a2a-b103-a160d20a92d1\") " Dec 05 21:47:20 crc kubenswrapper[4904]: I1205 21:47:20.166588 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a6e9ad1-9a61-4a2a-b103-a160d20a92d1-kube-api-access-zl7sn" (OuterVolumeSpecName: "kube-api-access-zl7sn") pod "6a6e9ad1-9a61-4a2a-b103-a160d20a92d1" (UID: "6a6e9ad1-9a61-4a2a-b103-a160d20a92d1"). InnerVolumeSpecName "kube-api-access-zl7sn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:47:20 crc kubenswrapper[4904]: I1205 21:47:20.207586 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a6e9ad1-9a61-4a2a-b103-a160d20a92d1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6a6e9ad1-9a61-4a2a-b103-a160d20a92d1" (UID: "6a6e9ad1-9a61-4a2a-b103-a160d20a92d1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:47:20 crc kubenswrapper[4904]: I1205 21:47:20.251922 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a6e9ad1-9a61-4a2a-b103-a160d20a92d1-utilities\") pod \"6a6e9ad1-9a61-4a2a-b103-a160d20a92d1\" (UID: \"6a6e9ad1-9a61-4a2a-b103-a160d20a92d1\") " Dec 05 21:47:20 crc kubenswrapper[4904]: I1205 21:47:20.252887 4904 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a6e9ad1-9a61-4a2a-b103-a160d20a92d1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 21:47:20 crc kubenswrapper[4904]: I1205 21:47:20.252975 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zl7sn\" (UniqueName: \"kubernetes.io/projected/6a6e9ad1-9a61-4a2a-b103-a160d20a92d1-kube-api-access-zl7sn\") on node \"crc\" DevicePath \"\"" Dec 05 21:47:20 crc kubenswrapper[4904]: I1205 21:47:20.252895 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a6e9ad1-9a61-4a2a-b103-a160d20a92d1-utilities" (OuterVolumeSpecName: "utilities") pod "6a6e9ad1-9a61-4a2a-b103-a160d20a92d1" (UID: "6a6e9ad1-9a61-4a2a-b103-a160d20a92d1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:47:20 crc kubenswrapper[4904]: I1205 21:47:20.355340 4904 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a6e9ad1-9a61-4a2a-b103-a160d20a92d1-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 21:47:20 crc kubenswrapper[4904]: I1205 21:47:20.627078 4904 generic.go:334] "Generic (PLEG): container finished" podID="6a6e9ad1-9a61-4a2a-b103-a160d20a92d1" containerID="4be5153d53c0a3563423e63353170de6cba4e1bb4e47f990b1804acd9e1c0c84" exitCode=0 Dec 05 21:47:20 crc kubenswrapper[4904]: I1205 21:47:20.627133 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n7glz" Dec 05 21:47:20 crc kubenswrapper[4904]: I1205 21:47:20.627125 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n7glz" event={"ID":"6a6e9ad1-9a61-4a2a-b103-a160d20a92d1","Type":"ContainerDied","Data":"4be5153d53c0a3563423e63353170de6cba4e1bb4e47f990b1804acd9e1c0c84"} Dec 05 21:47:20 crc kubenswrapper[4904]: I1205 21:47:20.627185 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n7glz" event={"ID":"6a6e9ad1-9a61-4a2a-b103-a160d20a92d1","Type":"ContainerDied","Data":"e6188d478c267206b4fc455c6aeea8354eeb154692bc0d5bc4caa15392e4b04d"} Dec 05 21:47:20 crc kubenswrapper[4904]: I1205 21:47:20.627206 4904 scope.go:117] "RemoveContainer" containerID="4be5153d53c0a3563423e63353170de6cba4e1bb4e47f990b1804acd9e1c0c84" Dec 05 21:47:20 crc kubenswrapper[4904]: I1205 21:47:20.660569 4904 scope.go:117] "RemoveContainer" containerID="26d80a6644d1c55368d4829cb07c80ce1a95dbace0f7b3e1d67c3d2c20920ef5" Dec 05 21:47:20 crc kubenswrapper[4904]: I1205 21:47:20.671020 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n7glz"] Dec 05 21:47:20 crc kubenswrapper[4904]: I1205 21:47:20.683136 4904 scope.go:117] "RemoveContainer" containerID="8db10414d5854c10dc28d7fa251cc649118ac15046d044f40b850edce8f34887" Dec 05 21:47:20 crc kubenswrapper[4904]: I1205 21:47:20.687301 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-n7glz"] Dec 05 21:47:20 crc kubenswrapper[4904]: I1205 21:47:20.736875 4904 scope.go:117] "RemoveContainer" containerID="4be5153d53c0a3563423e63353170de6cba4e1bb4e47f990b1804acd9e1c0c84" Dec 05 21:47:20 crc kubenswrapper[4904]: E1205 21:47:20.738292 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4be5153d53c0a3563423e63353170de6cba4e1bb4e47f990b1804acd9e1c0c84\": container with ID starting with 4be5153d53c0a3563423e63353170de6cba4e1bb4e47f990b1804acd9e1c0c84 not found: ID does not exist" containerID="4be5153d53c0a3563423e63353170de6cba4e1bb4e47f990b1804acd9e1c0c84" Dec 05 21:47:20 crc kubenswrapper[4904]: I1205 21:47:20.738334 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4be5153d53c0a3563423e63353170de6cba4e1bb4e47f990b1804acd9e1c0c84"} err="failed to get container status \"4be5153d53c0a3563423e63353170de6cba4e1bb4e47f990b1804acd9e1c0c84\": rpc error: code = NotFound desc = could not find container \"4be5153d53c0a3563423e63353170de6cba4e1bb4e47f990b1804acd9e1c0c84\": container with ID starting with 4be5153d53c0a3563423e63353170de6cba4e1bb4e47f990b1804acd9e1c0c84 not found: ID does not exist" Dec 05 21:47:20 crc kubenswrapper[4904]: I1205 21:47:20.738359 4904 scope.go:117] "RemoveContainer" containerID="26d80a6644d1c55368d4829cb07c80ce1a95dbace0f7b3e1d67c3d2c20920ef5" Dec 05 21:47:20 crc kubenswrapper[4904]: E1205 21:47:20.738926 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26d80a6644d1c55368d4829cb07c80ce1a95dbace0f7b3e1d67c3d2c20920ef5\": container with ID starting with 26d80a6644d1c55368d4829cb07c80ce1a95dbace0f7b3e1d67c3d2c20920ef5 not found: ID does not exist" containerID="26d80a6644d1c55368d4829cb07c80ce1a95dbace0f7b3e1d67c3d2c20920ef5" Dec 05 21:47:20 crc kubenswrapper[4904]: I1205 21:47:20.738959 4904 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26d80a6644d1c55368d4829cb07c80ce1a95dbace0f7b3e1d67c3d2c20920ef5"} err="failed to get container status \"26d80a6644d1c55368d4829cb07c80ce1a95dbace0f7b3e1d67c3d2c20920ef5\": rpc error: code = NotFound desc = could not find container \"26d80a6644d1c55368d4829cb07c80ce1a95dbace0f7b3e1d67c3d2c20920ef5\": container with ID starting with 26d80a6644d1c55368d4829cb07c80ce1a95dbace0f7b3e1d67c3d2c20920ef5 not found: ID does not exist" Dec 05 21:47:20 crc kubenswrapper[4904]: I1205 21:47:20.738978 4904 scope.go:117] "RemoveContainer" containerID="8db10414d5854c10dc28d7fa251cc649118ac15046d044f40b850edce8f34887" Dec 05 21:47:20 crc kubenswrapper[4904]: E1205 21:47:20.739289 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8db10414d5854c10dc28d7fa251cc649118ac15046d044f40b850edce8f34887\": container with ID starting with 8db10414d5854c10dc28d7fa251cc649118ac15046d044f40b850edce8f34887 not found: ID does not exist" containerID="8db10414d5854c10dc28d7fa251cc649118ac15046d044f40b850edce8f34887" Dec 05 21:47:20 crc kubenswrapper[4904]: I1205 21:47:20.739319 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8db10414d5854c10dc28d7fa251cc649118ac15046d044f40b850edce8f34887"} err="failed to get container status \"8db10414d5854c10dc28d7fa251cc649118ac15046d044f40b850edce8f34887\": rpc error: code = NotFound desc = could not find container \"8db10414d5854c10dc28d7fa251cc649118ac15046d044f40b850edce8f34887\": container with ID starting with 8db10414d5854c10dc28d7fa251cc649118ac15046d044f40b850edce8f34887 not found: ID does not exist" Dec 05 21:47:21 crc kubenswrapper[4904]: I1205 21:47:21.693817 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a6e9ad1-9a61-4a2a-b103-a160d20a92d1" path="/var/lib/kubelet/pods/6a6e9ad1-9a61-4a2a-b103-a160d20a92d1/volumes" Dec 05 21:47:27 crc kubenswrapper[4904]: I1205 21:47:27.269865 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-jrrnh_16525ebe-6de2-4974-a4ee-ed99a0e4ea1f/kube-rbac-proxy/0.log" Dec 05 21:47:27 crc kubenswrapper[4904]: I1205 21:47:27.426437 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-jrrnh_16525ebe-6de2-4974-a4ee-ed99a0e4ea1f/manager/0.log" Dec 05 21:47:27 crc kubenswrapper[4904]: I1205 21:47:27.524937 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a6678k27_6cab1a63-a096-409d-9ca0-3308b4d6b434/util/0.log" Dec 05 21:47:27 crc kubenswrapper[4904]: I1205 21:47:27.727232 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a6678k27_6cab1a63-a096-409d-9ca0-3308b4d6b434/util/0.log" Dec 05 21:47:27 crc kubenswrapper[4904]: I1205 21:47:27.739826 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a6678k27_6cab1a63-a096-409d-9ca0-3308b4d6b434/pull/0.log" Dec 05 21:47:27 crc kubenswrapper[4904]: I1205 21:47:27.754656 4904 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a6678k27_6cab1a63-a096-409d-9ca0-3308b4d6b434/pull/0.log" Dec 05 21:47:27 crc kubenswrapper[4904]: I1205 21:47:27.896307 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a6678k27_6cab1a63-a096-409d-9ca0-3308b4d6b434/pull/0.log" Dec 05 21:47:27 crc kubenswrapper[4904]: I1205 21:47:27.932433 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a6678k27_6cab1a63-a096-409d-9ca0-3308b4d6b434/util/0.log" Dec 05 21:47:27 crc kubenswrapper[4904]: I1205 21:47:27.934052 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a6678k27_6cab1a63-a096-409d-9ca0-3308b4d6b434/extract/0.log" Dec 05 21:47:28 crc kubenswrapper[4904]: I1205 21:47:28.097711 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-xd4c5_0fefb560-28a2-4316-9448-8361111d4837/kube-rbac-proxy/0.log" Dec 05 21:47:28 crc kubenswrapper[4904]: I1205 21:47:28.229859 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-xd4c5_0fefb560-28a2-4316-9448-8361111d4837/manager/0.log" Dec 05 21:47:28 crc kubenswrapper[4904]: I1205 21:47:28.242745 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-7f864_8ee93c89-5d32-4114-b134-c084359d11ec/kube-rbac-proxy/0.log" Dec 05 21:47:28 crc kubenswrapper[4904]: I1205 21:47:28.331670 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-7f864_8ee93c89-5d32-4114-b134-c084359d11ec/manager/0.log" Dec 05 21:47:28 crc kubenswrapper[4904]: I1205 21:47:28.453364 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-z4chx_799e9803-03c5-4406-8b50-e59dedc0918d/kube-rbac-proxy/0.log" Dec 05 21:47:28 crc kubenswrapper[4904]: I1205 21:47:28.525624 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-z4chx_799e9803-03c5-4406-8b50-e59dedc0918d/manager/0.log" Dec 05 21:47:28 crc kubenswrapper[4904]: I1205 21:47:28.672000 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-lvn6q_17322e61-a1f2-4228-8784-ea6869288aaa/kube-rbac-proxy/0.log" Dec 05 21:47:28 crc kubenswrapper[4904]: I1205 21:47:28.745926 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-lvn6q_17322e61-a1f2-4228-8784-ea6869288aaa/manager/0.log" Dec 05 21:47:28 crc kubenswrapper[4904]: I1205 21:47:28.827527 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-lc48g_7ae1ca6f-894c-4c12-ab0a-459e97fa442e/kube-rbac-proxy/0.log" Dec 05 21:47:28 crc kubenswrapper[4904]: I1205 21:47:28.928824 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-lc48g_7ae1ca6f-894c-4c12-ab0a-459e97fa442e/manager/0.log" Dec 05 21:47:29 crc kubenswrapper[4904]: I1205 21:47:29.027977 
4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-4zb86_4453c00a-291d-4edb-ab56-9b0fdf3b1ea5/kube-rbac-proxy/0.log" Dec 05 21:47:29 crc kubenswrapper[4904]: I1205 21:47:29.216604 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-9zhph_60eb443e-807c-41f8-8935-7bfacc9dc89b/kube-rbac-proxy/0.log" Dec 05 21:47:29 crc kubenswrapper[4904]: I1205 21:47:29.274654 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-4zb86_4453c00a-291d-4edb-ab56-9b0fdf3b1ea5/manager/0.log" Dec 05 21:47:29 crc kubenswrapper[4904]: I1205 21:47:29.494947 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-9zhph_60eb443e-807c-41f8-8935-7bfacc9dc89b/manager/0.log" Dec 05 21:47:29 crc kubenswrapper[4904]: I1205 21:47:29.623631 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-h9wsr_a2b7747b-01b7-4f12-8748-2661f53078f0/kube-rbac-proxy/0.log" Dec 05 21:47:29 crc kubenswrapper[4904]: I1205 21:47:29.710958 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-h9wsr_a2b7747b-01b7-4f12-8748-2661f53078f0/manager/0.log" Dec 05 21:47:29 crc kubenswrapper[4904]: I1205 21:47:29.832579 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-xt4ph_e7d93aaa-3c37-4793-8535-6dcce7bb79b0/kube-rbac-proxy/0.log" Dec 05 21:47:29 crc kubenswrapper[4904]: I1205 21:47:29.859931 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-xt4ph_e7d93aaa-3c37-4793-8535-6dcce7bb79b0/manager/0.log" Dec 05 21:47:29 crc kubenswrapper[4904]: I1205 21:47:29.955503 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:47:29 crc kubenswrapper[4904]: I1205 21:47:29.955776 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:47:30 crc kubenswrapper[4904]: I1205 21:47:30.007326 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-wq6hj_8c31d222-d349-476a-9aa2-cc57ec51d926/kube-rbac-proxy/0.log" Dec 05 21:47:30 crc kubenswrapper[4904]: I1205 21:47:30.115042 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-wq6hj_8c31d222-d349-476a-9aa2-cc57ec51d926/manager/0.log" Dec 05 21:47:30 crc kubenswrapper[4904]: I1205 21:47:30.204679 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-bzms8_7cbe4ebd-943a-4ffe-8cf1-5ade3e005ab1/kube-rbac-proxy/0.log" Dec 05 21:47:30 crc kubenswrapper[4904]: I1205 
21:47:30.291872 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-bzms8_7cbe4ebd-943a-4ffe-8cf1-5ade3e005ab1/manager/0.log" Dec 05 21:47:30 crc kubenswrapper[4904]: I1205 21:47:30.388417 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-xv5gb_2035ff23-1caf-4e9e-bc39-46caef0eb07d/kube-rbac-proxy/0.log" Dec 05 21:47:30 crc kubenswrapper[4904]: I1205 21:47:30.468299 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-xv5gb_2035ff23-1caf-4e9e-bc39-46caef0eb07d/manager/0.log" Dec 05 21:47:30 crc kubenswrapper[4904]: I1205 21:47:30.591649 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-l65z8_03744ad0-4115-4d2f-bf2d-5acc45a6d05a/kube-rbac-proxy/0.log" Dec 05 21:47:30 crc kubenswrapper[4904]: I1205 21:47:30.641943 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-l65z8_03744ad0-4115-4d2f-bf2d-5acc45a6d05a/manager/0.log" Dec 05 21:47:30 crc kubenswrapper[4904]: I1205 21:47:30.819989 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd49rzr9_50c0879e-d30b-4a8a-972b-10f3188ff06a/manager/0.log" Dec 05 21:47:30 crc kubenswrapper[4904]: I1205 21:47:30.825790 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd49rzr9_50c0879e-d30b-4a8a-972b-10f3188ff06a/kube-rbac-proxy/0.log" Dec 05 21:47:31 crc kubenswrapper[4904]: I1205 21:47:31.290023 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-kc5m7_6b8dcb6d-00ed-4247-b4c8-5e5964bc0513/registry-server/0.log" Dec 05 21:47:31 crc kubenswrapper[4904]: I1205 21:47:31.328680 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-56699b584c-242nn_3c454cc5-18c2-420a-ac01-657bedda4fa7/operator/0.log" Dec 05 21:47:31 crc kubenswrapper[4904]: I1205 21:47:31.643962 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-f2bbb_116b1af4-b71b-46ab-9977-12342c13594e/kube-rbac-proxy/0.log" Dec 05 21:47:31 crc kubenswrapper[4904]: I1205 21:47:31.664826 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-f2bbb_116b1af4-b71b-46ab-9977-12342c13594e/manager/0.log" Dec 05 21:47:31 crc kubenswrapper[4904]: I1205 21:47:31.731102 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-gv6jv_53b1878e-6de9-4961-9cf6-4673c09c0412/kube-rbac-proxy/0.log" Dec 05 21:47:31 crc kubenswrapper[4904]: I1205 21:47:31.937056 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-4ddwg_51c461fe-c535-4a35-8409-f319c4549ecf/operator/0.log" Dec 05 21:47:31 crc kubenswrapper[4904]: I1205 21:47:31.938803 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-gv6jv_53b1878e-6de9-4961-9cf6-4673c09c0412/manager/0.log" Dec 05 21:47:32 crc 
kubenswrapper[4904]: I1205 21:47:32.222003 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-xqdzf_2f0e1e69-f795-4166-b6bd-946050c1524e/manager/0.log" Dec 05 21:47:32 crc kubenswrapper[4904]: I1205 21:47:32.232506 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-xqdzf_2f0e1e69-f795-4166-b6bd-946050c1524e/kube-rbac-proxy/0.log" Dec 05 21:47:32 crc kubenswrapper[4904]: I1205 21:47:32.232716 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-n6d4m_086dd7ac-a5cc-4433-a5a6-0cfd88a69d72/kube-rbac-proxy/0.log" Dec 05 21:47:32 crc kubenswrapper[4904]: I1205 21:47:32.253701 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7dc867b75-jxjt2_0964d0c8-bed0-4c26-969d-c8e895793312/manager/0.log" Dec 05 21:47:32 crc kubenswrapper[4904]: I1205 21:47:32.458226 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-kr7mw_54f8b0f1-58e6-445e-a7ea-ba4eb6e298a9/kube-rbac-proxy/0.log" Dec 05 21:47:32 crc kubenswrapper[4904]: I1205 21:47:32.466514 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-kr7mw_54f8b0f1-58e6-445e-a7ea-ba4eb6e298a9/manager/0.log" Dec 05 21:47:32 crc kubenswrapper[4904]: I1205 21:47:32.589629 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-n6d4m_086dd7ac-a5cc-4433-a5a6-0cfd88a69d72/manager/0.log" Dec 05 21:47:32 crc kubenswrapper[4904]: I1205 21:47:32.656020 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-7f6cb9b975-pjwn4_e0155f33-dd5a-4c6f-b261-2b7026149e9c/kube-rbac-proxy/0.log" Dec 05 21:47:32 crc kubenswrapper[4904]: I1205 21:47:32.762966 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-7f6cb9b975-pjwn4_e0155f33-dd5a-4c6f-b261-2b7026149e9c/manager/0.log" Dec 05 21:47:49 crc kubenswrapper[4904]: I1205 21:47:49.469033 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-ptsz7" podUID="fe343c2a-87f6-45d4-a91d-3f86b9b5029b" containerName="hostpath-provisioner" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 21:47:54 crc kubenswrapper[4904]: I1205 21:47:54.796188 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-888hj_efb530a0-f68c-4664-81bf-32871d3b8259/control-plane-machine-set-operator/0.log" Dec 05 21:47:54 crc kubenswrapper[4904]: I1205 21:47:54.954381 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-l928g_11ca5edb-7664-4e63-a9e8-46f270623ad2/kube-rbac-proxy/0.log" Dec 05 21:47:55 crc kubenswrapper[4904]: I1205 21:47:55.039188 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-l928g_11ca5edb-7664-4e63-a9e8-46f270623ad2/machine-api-operator/0.log" Dec 05 21:47:59 crc kubenswrapper[4904]: I1205 21:47:59.955579 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:47:59 crc kubenswrapper[4904]: I1205 21:47:59.956100 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:48:08 crc kubenswrapper[4904]: I1205 21:48:08.527109 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-5gj92_c33ef1b8-c426-4845-ace3-476e7e7c842e/cert-manager-controller/0.log" Dec 05 21:48:08 crc kubenswrapper[4904]: I1205 21:48:08.682648 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-tfxc5_45ace5ad-d0b7-469f-b03c-62e935ba67dd/cert-manager-cainjector/0.log" Dec 05 21:48:08 crc kubenswrapper[4904]: I1205 21:48:08.779486 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-2d8zg_cdc9c672-2a99-47b6-8457-fd6ec79db49b/cert-manager-webhook/0.log" Dec 05 21:48:21 crc kubenswrapper[4904]: I1205 21:48:21.429291 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-lqpqd_ffe1cf37-52b4-4493-bf5b-f0318a5015a9/nmstate-console-plugin/0.log" Dec 05 21:48:21 crc kubenswrapper[4904]: I1205 21:48:21.646912 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-8qf2w_b9bbd6ea-7829-4dfe-b4e0-1dcda3fe0903/nmstate-handler/0.log" Dec 05 21:48:21 crc kubenswrapper[4904]: I1205 21:48:21.756515 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-94lfj_b05e6bce-1fed-411b-9c7d-ea32260cb8dc/nmstate-metrics/0.log" Dec 05 21:48:21 crc kubenswrapper[4904]: I1205 21:48:21.802384 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-94lfj_b05e6bce-1fed-411b-9c7d-ea32260cb8dc/kube-rbac-proxy/0.log" Dec 05 21:48:21 crc kubenswrapper[4904]: I1205 21:48:21.988322 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-xg947_a66ea824-d482-41b2-8ddc-5ee70d24db5a/nmstate-operator/0.log" Dec 05 21:48:22 crc kubenswrapper[4904]: I1205 21:48:22.084032 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-7dl4s_1f407364-e4d6-4506-abaa-f4e3ae5ab29f/nmstate-webhook/0.log" Dec 05 21:48:27 crc kubenswrapper[4904]: I1205 21:48:27.630724 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-669j5"] Dec 05 21:48:27 crc kubenswrapper[4904]: E1205 21:48:27.631846 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a6e9ad1-9a61-4a2a-b103-a160d20a92d1" containerName="extract-utilities" Dec 05 21:48:27 crc kubenswrapper[4904]: I1205 21:48:27.631860 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a6e9ad1-9a61-4a2a-b103-a160d20a92d1" containerName="extract-utilities" Dec 05 21:48:27 crc kubenswrapper[4904]: E1205 21:48:27.631875 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a6e9ad1-9a61-4a2a-b103-a160d20a92d1" containerName="extract-content" Dec 05 21:48:27 crc 
kubenswrapper[4904]: I1205 21:48:27.631880 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a6e9ad1-9a61-4a2a-b103-a160d20a92d1" containerName="extract-content" Dec 05 21:48:27 crc kubenswrapper[4904]: E1205 21:48:27.631894 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a6e9ad1-9a61-4a2a-b103-a160d20a92d1" containerName="registry-server" Dec 05 21:48:27 crc kubenswrapper[4904]: I1205 21:48:27.631900 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a6e9ad1-9a61-4a2a-b103-a160d20a92d1" containerName="registry-server" Dec 05 21:48:27 crc kubenswrapper[4904]: I1205 21:48:27.632128 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a6e9ad1-9a61-4a2a-b103-a160d20a92d1" containerName="registry-server" Dec 05 21:48:27 crc kubenswrapper[4904]: I1205 21:48:27.633835 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-669j5" Dec 05 21:48:27 crc kubenswrapper[4904]: I1205 21:48:27.649772 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-669j5"] Dec 05 21:48:27 crc kubenswrapper[4904]: I1205 21:48:27.800678 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce5512e8-d105-44d3-bd4b-01a180b954c2-catalog-content\") pod \"community-operators-669j5\" (UID: \"ce5512e8-d105-44d3-bd4b-01a180b954c2\") " pod="openshift-marketplace/community-operators-669j5" Dec 05 21:48:27 crc kubenswrapper[4904]: I1205 21:48:27.800918 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vh5n\" (UniqueName: \"kubernetes.io/projected/ce5512e8-d105-44d3-bd4b-01a180b954c2-kube-api-access-2vh5n\") pod \"community-operators-669j5\" (UID: \"ce5512e8-d105-44d3-bd4b-01a180b954c2\") " pod="openshift-marketplace/community-operators-669j5" Dec 05 21:48:27 crc kubenswrapper[4904]: I1205 21:48:27.801501 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce5512e8-d105-44d3-bd4b-01a180b954c2-utilities\") pod \"community-operators-669j5\" (UID: \"ce5512e8-d105-44d3-bd4b-01a180b954c2\") " pod="openshift-marketplace/community-operators-669j5" Dec 05 21:48:27 crc kubenswrapper[4904]: I1205 21:48:27.903944 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce5512e8-d105-44d3-bd4b-01a180b954c2-catalog-content\") pod \"community-operators-669j5\" (UID: \"ce5512e8-d105-44d3-bd4b-01a180b954c2\") " pod="openshift-marketplace/community-operators-669j5" Dec 05 21:48:27 crc kubenswrapper[4904]: I1205 21:48:27.904051 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vh5n\" (UniqueName: \"kubernetes.io/projected/ce5512e8-d105-44d3-bd4b-01a180b954c2-kube-api-access-2vh5n\") pod \"community-operators-669j5\" (UID: \"ce5512e8-d105-44d3-bd4b-01a180b954c2\") " pod="openshift-marketplace/community-operators-669j5" Dec 05 21:48:27 crc kubenswrapper[4904]: I1205 21:48:27.904147 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce5512e8-d105-44d3-bd4b-01a180b954c2-utilities\") pod \"community-operators-669j5\" (UID: \"ce5512e8-d105-44d3-bd4b-01a180b954c2\") " 
pod="openshift-marketplace/community-operators-669j5" Dec 05 21:48:27 crc kubenswrapper[4904]: I1205 21:48:27.904519 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce5512e8-d105-44d3-bd4b-01a180b954c2-catalog-content\") pod \"community-operators-669j5\" (UID: \"ce5512e8-d105-44d3-bd4b-01a180b954c2\") " pod="openshift-marketplace/community-operators-669j5" Dec 05 21:48:27 crc kubenswrapper[4904]: I1205 21:48:27.904562 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce5512e8-d105-44d3-bd4b-01a180b954c2-utilities\") pod \"community-operators-669j5\" (UID: \"ce5512e8-d105-44d3-bd4b-01a180b954c2\") " pod="openshift-marketplace/community-operators-669j5" Dec 05 21:48:27 crc kubenswrapper[4904]: I1205 21:48:27.934973 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vh5n\" (UniqueName: \"kubernetes.io/projected/ce5512e8-d105-44d3-bd4b-01a180b954c2-kube-api-access-2vh5n\") pod \"community-operators-669j5\" (UID: \"ce5512e8-d105-44d3-bd4b-01a180b954c2\") " pod="openshift-marketplace/community-operators-669j5" Dec 05 21:48:27 crc kubenswrapper[4904]: I1205 21:48:27.956075 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-669j5" Dec 05 21:48:28 crc kubenswrapper[4904]: I1205 21:48:28.549495 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-669j5"] Dec 05 21:48:29 crc kubenswrapper[4904]: I1205 21:48:29.172783 4904 generic.go:334] "Generic (PLEG): container finished" podID="ce5512e8-d105-44d3-bd4b-01a180b954c2" containerID="81eebff00fe59c874e031d8b8228d1c0152a37ad0dc5c7ec9a02bb35f6b42579" exitCode=0 Dec 05 21:48:29 crc kubenswrapper[4904]: I1205 21:48:29.172887 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-669j5" event={"ID":"ce5512e8-d105-44d3-bd4b-01a180b954c2","Type":"ContainerDied","Data":"81eebff00fe59c874e031d8b8228d1c0152a37ad0dc5c7ec9a02bb35f6b42579"} Dec 05 21:48:29 crc kubenswrapper[4904]: I1205 21:48:29.173120 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-669j5" event={"ID":"ce5512e8-d105-44d3-bd4b-01a180b954c2","Type":"ContainerStarted","Data":"bca4a17dd9a162bd0118d787fca8d9ebcd1149d82b64c24daf58c966595ca3b7"} Dec 05 21:48:29 crc kubenswrapper[4904]: I1205 21:48:29.956187 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:48:29 crc kubenswrapper[4904]: I1205 21:48:29.956534 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:48:29 crc kubenswrapper[4904]: I1205 21:48:29.956584 4904 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" Dec 05 21:48:29 crc kubenswrapper[4904]: I1205 21:48:29.957363 4904 kuberuntime_manager.go:1027] "Message for 
Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f4149179d4736f80eac61f13fa0487b54479026079b2ff33c6d0c328533eef98"} pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 21:48:29 crc kubenswrapper[4904]: I1205 21:48:29.957414 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" containerID="cri-o://f4149179d4736f80eac61f13fa0487b54479026079b2ff33c6d0c328533eef98" gracePeriod=600 Dec 05 21:48:30 crc kubenswrapper[4904]: I1205 21:48:30.187242 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-669j5" event={"ID":"ce5512e8-d105-44d3-bd4b-01a180b954c2","Type":"ContainerStarted","Data":"7426f987bc58ef6e27d7d023476a84de574701f0f81da09b4f3bab6400911d4a"} Dec 05 21:48:30 crc kubenswrapper[4904]: I1205 21:48:30.190795 4904 generic.go:334] "Generic (PLEG): container finished" podID="1cc24b64-e25f-4b55-9123-295388685e7a" containerID="f4149179d4736f80eac61f13fa0487b54479026079b2ff33c6d0c328533eef98" exitCode=0 Dec 05 21:48:30 crc kubenswrapper[4904]: I1205 21:48:30.190829 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" event={"ID":"1cc24b64-e25f-4b55-9123-295388685e7a","Type":"ContainerDied","Data":"f4149179d4736f80eac61f13fa0487b54479026079b2ff33c6d0c328533eef98"} Dec 05 21:48:30 crc kubenswrapper[4904]: I1205 21:48:30.190861 4904 scope.go:117] "RemoveContainer" containerID="d64969e50828f264edfbf9501a74b46640a7872d24c01a106535bc35fc4380f8" Dec 05 21:48:31 crc kubenswrapper[4904]: I1205 21:48:31.203301 4904 generic.go:334] "Generic (PLEG): container finished" podID="ce5512e8-d105-44d3-bd4b-01a180b954c2" containerID="7426f987bc58ef6e27d7d023476a84de574701f0f81da09b4f3bab6400911d4a" exitCode=0 Dec 05 21:48:31 crc kubenswrapper[4904]: I1205 21:48:31.203385 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-669j5" event={"ID":"ce5512e8-d105-44d3-bd4b-01a180b954c2","Type":"ContainerDied","Data":"7426f987bc58ef6e27d7d023476a84de574701f0f81da09b4f3bab6400911d4a"} Dec 05 21:48:31 crc kubenswrapper[4904]: I1205 21:48:31.208443 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" event={"ID":"1cc24b64-e25f-4b55-9123-295388685e7a","Type":"ContainerStarted","Data":"afe55cb29b17c79ab9c5d0540b6c9a9a3a478a257da4a8d2c668853c76a15473"} Dec 05 21:48:32 crc kubenswrapper[4904]: I1205 21:48:32.222214 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-669j5" event={"ID":"ce5512e8-d105-44d3-bd4b-01a180b954c2","Type":"ContainerStarted","Data":"1a49be5b723bbcaa25c732b4a3d8a7994771f7266637eb0d7e0f64c656c33e9e"} Dec 05 21:48:32 crc kubenswrapper[4904]: I1205 21:48:32.244983 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-669j5" podStartSLOduration=2.790974855 podStartE2EDuration="5.244943247s" podCreationTimestamp="2025-12-05 21:48:27 +0000 UTC" firstStartedPulling="2025-12-05 21:48:29.17515116 +0000 UTC m=+5807.986367269" lastFinishedPulling="2025-12-05 21:48:31.629119552 +0000 UTC m=+5810.440335661" observedRunningTime="2025-12-05 
21:48:32.240914278 +0000 UTC m=+5811.052130407" watchObservedRunningTime="2025-12-05 21:48:32.244943247 +0000 UTC m=+5811.056159356" Dec 05 21:48:37 crc kubenswrapper[4904]: I1205 21:48:37.784840 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-v26v8_2125e0a4-9809-42b4-911f-08c6d2e74879/kube-rbac-proxy/0.log" Dec 05 21:48:37 crc kubenswrapper[4904]: I1205 21:48:37.810481 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-v26v8_2125e0a4-9809-42b4-911f-08c6d2e74879/controller/0.log" Dec 05 21:48:37 crc kubenswrapper[4904]: I1205 21:48:37.952213 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hz9g9_1f78e03d-db28-4ba2-830f-189c97051d36/cp-frr-files/0.log" Dec 05 21:48:37 crc kubenswrapper[4904]: I1205 21:48:37.956301 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-669j5" Dec 05 21:48:37 crc kubenswrapper[4904]: I1205 21:48:37.956338 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-669j5" Dec 05 21:48:38 crc kubenswrapper[4904]: I1205 21:48:38.011285 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-669j5" Dec 05 21:48:38 crc kubenswrapper[4904]: I1205 21:48:38.163018 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hz9g9_1f78e03d-db28-4ba2-830f-189c97051d36/cp-reloader/0.log" Dec 05 21:48:38 crc kubenswrapper[4904]: I1205 21:48:38.186007 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hz9g9_1f78e03d-db28-4ba2-830f-189c97051d36/cp-frr-files/0.log" Dec 05 21:48:38 crc kubenswrapper[4904]: I1205 21:48:38.198855 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hz9g9_1f78e03d-db28-4ba2-830f-189c97051d36/cp-reloader/0.log" Dec 05 21:48:38 crc kubenswrapper[4904]: I1205 21:48:38.218990 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hz9g9_1f78e03d-db28-4ba2-830f-189c97051d36/cp-metrics/0.log" Dec 05 21:48:38 crc kubenswrapper[4904]: I1205 21:48:38.333127 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-669j5" Dec 05 21:48:38 crc kubenswrapper[4904]: I1205 21:48:38.383828 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-669j5"] Dec 05 21:48:38 crc kubenswrapper[4904]: I1205 21:48:38.437264 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hz9g9_1f78e03d-db28-4ba2-830f-189c97051d36/cp-frr-files/0.log" Dec 05 21:48:38 crc kubenswrapper[4904]: I1205 21:48:38.439820 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hz9g9_1f78e03d-db28-4ba2-830f-189c97051d36/cp-metrics/0.log" Dec 05 21:48:38 crc kubenswrapper[4904]: I1205 21:48:38.450789 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hz9g9_1f78e03d-db28-4ba2-830f-189c97051d36/cp-reloader/0.log" Dec 05 21:48:38 crc kubenswrapper[4904]: I1205 21:48:38.474847 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hz9g9_1f78e03d-db28-4ba2-830f-189c97051d36/cp-metrics/0.log" Dec 05 21:48:38 crc kubenswrapper[4904]: I1205 21:48:38.656096 4904 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-hz9g9_1f78e03d-db28-4ba2-830f-189c97051d36/cp-reloader/0.log" Dec 05 21:48:38 crc kubenswrapper[4904]: I1205 21:48:38.658904 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hz9g9_1f78e03d-db28-4ba2-830f-189c97051d36/cp-frr-files/0.log" Dec 05 21:48:38 crc kubenswrapper[4904]: I1205 21:48:38.667519 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hz9g9_1f78e03d-db28-4ba2-830f-189c97051d36/cp-metrics/0.log" Dec 05 21:48:38 crc kubenswrapper[4904]: I1205 21:48:38.688745 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hz9g9_1f78e03d-db28-4ba2-830f-189c97051d36/controller/0.log" Dec 05 21:48:38 crc kubenswrapper[4904]: I1205 21:48:38.877915 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hz9g9_1f78e03d-db28-4ba2-830f-189c97051d36/kube-rbac-proxy/0.log" Dec 05 21:48:38 crc kubenswrapper[4904]: I1205 21:48:38.894750 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hz9g9_1f78e03d-db28-4ba2-830f-189c97051d36/frr-metrics/0.log" Dec 05 21:48:38 crc kubenswrapper[4904]: I1205 21:48:38.941548 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hz9g9_1f78e03d-db28-4ba2-830f-189c97051d36/kube-rbac-proxy-frr/0.log" Dec 05 21:48:39 crc kubenswrapper[4904]: I1205 21:48:39.101531 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hz9g9_1f78e03d-db28-4ba2-830f-189c97051d36/reloader/0.log" Dec 05 21:48:39 crc kubenswrapper[4904]: I1205 21:48:39.147971 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-dpnqq_3b7fd88f-360a-4c89-8740-069bc371b65b/frr-k8s-webhook-server/0.log" Dec 05 21:48:39 crc kubenswrapper[4904]: I1205 21:48:39.350161 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-58cc54b6b6-b77qk_54d3f8f4-a4d9-4f3c-b923-f64d6c4f1b28/manager/0.log" Dec 05 21:48:39 crc kubenswrapper[4904]: I1205 21:48:39.541834 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-mvkbj_080b347a-f590-47cf-909f-578330838c1d/kube-rbac-proxy/0.log" Dec 05 21:48:39 crc kubenswrapper[4904]: I1205 21:48:39.572005 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5595c7bb55-zxmp8_2230230c-8a27-44a4-a63a-219e0e40f288/webhook-server/0.log" Dec 05 21:48:40 crc kubenswrapper[4904]: I1205 21:48:40.217333 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-mvkbj_080b347a-f590-47cf-909f-578330838c1d/speaker/0.log" Dec 05 21:48:40 crc kubenswrapper[4904]: I1205 21:48:40.302140 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-669j5" podUID="ce5512e8-d105-44d3-bd4b-01a180b954c2" containerName="registry-server" containerID="cri-o://1a49be5b723bbcaa25c732b4a3d8a7994771f7266637eb0d7e0f64c656c33e9e" gracePeriod=2 Dec 05 21:48:40 crc kubenswrapper[4904]: I1205 21:48:40.633330 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hz9g9_1f78e03d-db28-4ba2-830f-189c97051d36/frr/0.log" Dec 05 21:48:40 crc kubenswrapper[4904]: I1205 21:48:40.848728 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-669j5" Dec 05 21:48:40 crc kubenswrapper[4904]: I1205 21:48:40.988291 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce5512e8-d105-44d3-bd4b-01a180b954c2-utilities\") pod \"ce5512e8-d105-44d3-bd4b-01a180b954c2\" (UID: \"ce5512e8-d105-44d3-bd4b-01a180b954c2\") " Dec 05 21:48:40 crc kubenswrapper[4904]: I1205 21:48:40.988502 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vh5n\" (UniqueName: \"kubernetes.io/projected/ce5512e8-d105-44d3-bd4b-01a180b954c2-kube-api-access-2vh5n\") pod \"ce5512e8-d105-44d3-bd4b-01a180b954c2\" (UID: \"ce5512e8-d105-44d3-bd4b-01a180b954c2\") " Dec 05 21:48:40 crc kubenswrapper[4904]: I1205 21:48:40.988589 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce5512e8-d105-44d3-bd4b-01a180b954c2-catalog-content\") pod \"ce5512e8-d105-44d3-bd4b-01a180b954c2\" (UID: \"ce5512e8-d105-44d3-bd4b-01a180b954c2\") " Dec 05 21:48:40 crc kubenswrapper[4904]: I1205 21:48:40.992807 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce5512e8-d105-44d3-bd4b-01a180b954c2-utilities" (OuterVolumeSpecName: "utilities") pod "ce5512e8-d105-44d3-bd4b-01a180b954c2" (UID: "ce5512e8-d105-44d3-bd4b-01a180b954c2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:48:41 crc kubenswrapper[4904]: I1205 21:48:40.999437 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce5512e8-d105-44d3-bd4b-01a180b954c2-kube-api-access-2vh5n" (OuterVolumeSpecName: "kube-api-access-2vh5n") pod "ce5512e8-d105-44d3-bd4b-01a180b954c2" (UID: "ce5512e8-d105-44d3-bd4b-01a180b954c2"). InnerVolumeSpecName "kube-api-access-2vh5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:48:41 crc kubenswrapper[4904]: I1205 21:48:41.051485 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce5512e8-d105-44d3-bd4b-01a180b954c2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ce5512e8-d105-44d3-bd4b-01a180b954c2" (UID: "ce5512e8-d105-44d3-bd4b-01a180b954c2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:48:41 crc kubenswrapper[4904]: I1205 21:48:41.090873 4904 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce5512e8-d105-44d3-bd4b-01a180b954c2-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 21:48:41 crc kubenswrapper[4904]: I1205 21:48:41.090911 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vh5n\" (UniqueName: \"kubernetes.io/projected/ce5512e8-d105-44d3-bd4b-01a180b954c2-kube-api-access-2vh5n\") on node \"crc\" DevicePath \"\"" Dec 05 21:48:41 crc kubenswrapper[4904]: I1205 21:48:41.090921 4904 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce5512e8-d105-44d3-bd4b-01a180b954c2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 21:48:41 crc kubenswrapper[4904]: I1205 21:48:41.315291 4904 generic.go:334] "Generic (PLEG): container finished" podID="ce5512e8-d105-44d3-bd4b-01a180b954c2" containerID="1a49be5b723bbcaa25c732b4a3d8a7994771f7266637eb0d7e0f64c656c33e9e" exitCode=0 Dec 05 21:48:41 crc kubenswrapper[4904]: I1205 21:48:41.315382 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-669j5" Dec 05 21:48:41 crc kubenswrapper[4904]: I1205 21:48:41.315384 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-669j5" event={"ID":"ce5512e8-d105-44d3-bd4b-01a180b954c2","Type":"ContainerDied","Data":"1a49be5b723bbcaa25c732b4a3d8a7994771f7266637eb0d7e0f64c656c33e9e"} Dec 05 21:48:41 crc kubenswrapper[4904]: I1205 21:48:41.315686 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-669j5" event={"ID":"ce5512e8-d105-44d3-bd4b-01a180b954c2","Type":"ContainerDied","Data":"bca4a17dd9a162bd0118d787fca8d9ebcd1149d82b64c24daf58c966595ca3b7"} Dec 05 21:48:41 crc kubenswrapper[4904]: I1205 21:48:41.315706 4904 scope.go:117] "RemoveContainer" containerID="1a49be5b723bbcaa25c732b4a3d8a7994771f7266637eb0d7e0f64c656c33e9e" Dec 05 21:48:41 crc kubenswrapper[4904]: I1205 21:48:41.346186 4904 scope.go:117] "RemoveContainer" containerID="7426f987bc58ef6e27d7d023476a84de574701f0f81da09b4f3bab6400911d4a" Dec 05 21:48:41 crc kubenswrapper[4904]: I1205 21:48:41.367391 4904 scope.go:117] "RemoveContainer" containerID="81eebff00fe59c874e031d8b8228d1c0152a37ad0dc5c7ec9a02bb35f6b42579" Dec 05 21:48:41 crc kubenswrapper[4904]: I1205 21:48:41.367672 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-669j5"] Dec 05 21:48:41 crc kubenswrapper[4904]: I1205 21:48:41.380832 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-669j5"] Dec 05 21:48:41 crc kubenswrapper[4904]: I1205 21:48:41.420806 4904 scope.go:117] "RemoveContainer" containerID="1a49be5b723bbcaa25c732b4a3d8a7994771f7266637eb0d7e0f64c656c33e9e" Dec 05 21:48:41 crc kubenswrapper[4904]: E1205 21:48:41.421341 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a49be5b723bbcaa25c732b4a3d8a7994771f7266637eb0d7e0f64c656c33e9e\": container with ID starting with 1a49be5b723bbcaa25c732b4a3d8a7994771f7266637eb0d7e0f64c656c33e9e not found: ID does not exist" containerID="1a49be5b723bbcaa25c732b4a3d8a7994771f7266637eb0d7e0f64c656c33e9e" Dec 05 21:48:41 crc kubenswrapper[4904]: I1205 21:48:41.421378 
4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a49be5b723bbcaa25c732b4a3d8a7994771f7266637eb0d7e0f64c656c33e9e"} err="failed to get container status \"1a49be5b723bbcaa25c732b4a3d8a7994771f7266637eb0d7e0f64c656c33e9e\": rpc error: code = NotFound desc = could not find container \"1a49be5b723bbcaa25c732b4a3d8a7994771f7266637eb0d7e0f64c656c33e9e\": container with ID starting with 1a49be5b723bbcaa25c732b4a3d8a7994771f7266637eb0d7e0f64c656c33e9e not found: ID does not exist" Dec 05 21:48:41 crc kubenswrapper[4904]: I1205 21:48:41.421400 4904 scope.go:117] "RemoveContainer" containerID="7426f987bc58ef6e27d7d023476a84de574701f0f81da09b4f3bab6400911d4a" Dec 05 21:48:41 crc kubenswrapper[4904]: E1205 21:48:41.421747 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7426f987bc58ef6e27d7d023476a84de574701f0f81da09b4f3bab6400911d4a\": container with ID starting with 7426f987bc58ef6e27d7d023476a84de574701f0f81da09b4f3bab6400911d4a not found: ID does not exist" containerID="7426f987bc58ef6e27d7d023476a84de574701f0f81da09b4f3bab6400911d4a" Dec 05 21:48:41 crc kubenswrapper[4904]: I1205 21:48:41.421803 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7426f987bc58ef6e27d7d023476a84de574701f0f81da09b4f3bab6400911d4a"} err="failed to get container status \"7426f987bc58ef6e27d7d023476a84de574701f0f81da09b4f3bab6400911d4a\": rpc error: code = NotFound desc = could not find container \"7426f987bc58ef6e27d7d023476a84de574701f0f81da09b4f3bab6400911d4a\": container with ID starting with 7426f987bc58ef6e27d7d023476a84de574701f0f81da09b4f3bab6400911d4a not found: ID does not exist" Dec 05 21:48:41 crc kubenswrapper[4904]: I1205 21:48:41.421839 4904 scope.go:117] "RemoveContainer" containerID="81eebff00fe59c874e031d8b8228d1c0152a37ad0dc5c7ec9a02bb35f6b42579" Dec 05 21:48:41 crc kubenswrapper[4904]: E1205 21:48:41.422153 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81eebff00fe59c874e031d8b8228d1c0152a37ad0dc5c7ec9a02bb35f6b42579\": container with ID starting with 81eebff00fe59c874e031d8b8228d1c0152a37ad0dc5c7ec9a02bb35f6b42579 not found: ID does not exist" containerID="81eebff00fe59c874e031d8b8228d1c0152a37ad0dc5c7ec9a02bb35f6b42579" Dec 05 21:48:41 crc kubenswrapper[4904]: I1205 21:48:41.422177 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81eebff00fe59c874e031d8b8228d1c0152a37ad0dc5c7ec9a02bb35f6b42579"} err="failed to get container status \"81eebff00fe59c874e031d8b8228d1c0152a37ad0dc5c7ec9a02bb35f6b42579\": rpc error: code = NotFound desc = could not find container \"81eebff00fe59c874e031d8b8228d1c0152a37ad0dc5c7ec9a02bb35f6b42579\": container with ID starting with 81eebff00fe59c874e031d8b8228d1c0152a37ad0dc5c7ec9a02bb35f6b42579 not found: ID does not exist" Dec 05 21:48:41 crc kubenswrapper[4904]: I1205 21:48:41.701334 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce5512e8-d105-44d3-bd4b-01a180b954c2" path="/var/lib/kubelet/pods/ce5512e8-d105-44d3-bd4b-01a180b954c2/volumes" Dec 05 21:48:52 crc kubenswrapper[4904]: I1205 21:48:52.453112 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmmlzw_397a4321-b8b2-4041-9219-a5f837937346/util/0.log" Dec 05 21:48:52 crc 
kubenswrapper[4904]: I1205 21:48:52.667362 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmmlzw_397a4321-b8b2-4041-9219-a5f837937346/pull/0.log" Dec 05 21:48:52 crc kubenswrapper[4904]: I1205 21:48:52.676151 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmmlzw_397a4321-b8b2-4041-9219-a5f837937346/util/0.log" Dec 05 21:48:52 crc kubenswrapper[4904]: I1205 21:48:52.708636 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmmlzw_397a4321-b8b2-4041-9219-a5f837937346/pull/0.log" Dec 05 21:48:52 crc kubenswrapper[4904]: I1205 21:48:52.832309 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmmlzw_397a4321-b8b2-4041-9219-a5f837937346/pull/0.log" Dec 05 21:48:52 crc kubenswrapper[4904]: I1205 21:48:52.832703 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmmlzw_397a4321-b8b2-4041-9219-a5f837937346/util/0.log" Dec 05 21:48:52 crc kubenswrapper[4904]: I1205 21:48:52.833390 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmmlzw_397a4321-b8b2-4041-9219-a5f837937346/extract/0.log" Dec 05 21:48:52 crc kubenswrapper[4904]: I1205 21:48:52.994747 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d_5b90a283-7232-45a4-9326-e96a19d446fa/util/0.log" Dec 05 21:48:53 crc kubenswrapper[4904]: I1205 21:48:53.140184 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d_5b90a283-7232-45a4-9326-e96a19d446fa/util/0.log" Dec 05 21:48:53 crc kubenswrapper[4904]: I1205 21:48:53.157873 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d_5b90a283-7232-45a4-9326-e96a19d446fa/pull/0.log" Dec 05 21:48:53 crc kubenswrapper[4904]: I1205 21:48:53.184993 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d_5b90a283-7232-45a4-9326-e96a19d446fa/pull/0.log" Dec 05 21:48:53 crc kubenswrapper[4904]: I1205 21:48:53.308734 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d_5b90a283-7232-45a4-9326-e96a19d446fa/util/0.log" Dec 05 21:48:53 crc kubenswrapper[4904]: I1205 21:48:53.309463 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d_5b90a283-7232-45a4-9326-e96a19d446fa/pull/0.log" Dec 05 21:48:53 crc kubenswrapper[4904]: I1205 21:48:53.371417 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d_5b90a283-7232-45a4-9326-e96a19d446fa/extract/0.log" Dec 05 21:48:53 crc kubenswrapper[4904]: I1205 21:48:53.484283 4904 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83khk26_074d2553-b276-41f9-ae52-d37209d033e3/util/0.log" Dec 05 21:48:53 crc kubenswrapper[4904]: I1205 21:48:53.642730 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83khk26_074d2553-b276-41f9-ae52-d37209d033e3/pull/0.log" Dec 05 21:48:53 crc kubenswrapper[4904]: I1205 21:48:53.649531 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83khk26_074d2553-b276-41f9-ae52-d37209d033e3/util/0.log" Dec 05 21:48:53 crc kubenswrapper[4904]: I1205 21:48:53.704166 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83khk26_074d2553-b276-41f9-ae52-d37209d033e3/pull/0.log" Dec 05 21:48:53 crc kubenswrapper[4904]: I1205 21:48:53.834304 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83khk26_074d2553-b276-41f9-ae52-d37209d033e3/util/0.log" Dec 05 21:48:53 crc kubenswrapper[4904]: I1205 21:48:53.853982 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83khk26_074d2553-b276-41f9-ae52-d37209d033e3/extract/0.log" Dec 05 21:48:53 crc kubenswrapper[4904]: I1205 21:48:53.867722 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83khk26_074d2553-b276-41f9-ae52-d37209d033e3/pull/0.log" Dec 05 21:48:54 crc kubenswrapper[4904]: I1205 21:48:54.024559 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qf2xk_4912f2a7-ae28-4ec6-a674-813c38c327c0/extract-utilities/0.log" Dec 05 21:48:54 crc kubenswrapper[4904]: I1205 21:48:54.190486 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qf2xk_4912f2a7-ae28-4ec6-a674-813c38c327c0/extract-utilities/0.log" Dec 05 21:48:54 crc kubenswrapper[4904]: I1205 21:48:54.204557 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qf2xk_4912f2a7-ae28-4ec6-a674-813c38c327c0/extract-content/0.log" Dec 05 21:48:54 crc kubenswrapper[4904]: I1205 21:48:54.246605 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qf2xk_4912f2a7-ae28-4ec6-a674-813c38c327c0/extract-content/0.log" Dec 05 21:48:54 crc kubenswrapper[4904]: I1205 21:48:54.386549 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qf2xk_4912f2a7-ae28-4ec6-a674-813c38c327c0/extract-utilities/0.log" Dec 05 21:48:54 crc kubenswrapper[4904]: I1205 21:48:54.420342 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qf2xk_4912f2a7-ae28-4ec6-a674-813c38c327c0/extract-content/0.log" Dec 05 21:48:54 crc kubenswrapper[4904]: I1205 21:48:54.616404 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fhhfn_81188cfd-eef2-4e0b-b04a-fd189da456d2/extract-utilities/0.log" Dec 05 21:48:54 crc kubenswrapper[4904]: I1205 21:48:54.853008 4904 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-fhhfn_81188cfd-eef2-4e0b-b04a-fd189da456d2/extract-content/0.log" Dec 05 21:48:54 crc kubenswrapper[4904]: I1205 21:48:54.872118 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fhhfn_81188cfd-eef2-4e0b-b04a-fd189da456d2/extract-utilities/0.log" Dec 05 21:48:54 crc kubenswrapper[4904]: I1205 21:48:54.917111 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fhhfn_81188cfd-eef2-4e0b-b04a-fd189da456d2/extract-content/0.log" Dec 05 21:48:55 crc kubenswrapper[4904]: I1205 21:48:55.075044 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qf2xk_4912f2a7-ae28-4ec6-a674-813c38c327c0/registry-server/0.log" Dec 05 21:48:55 crc kubenswrapper[4904]: I1205 21:48:55.139035 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fhhfn_81188cfd-eef2-4e0b-b04a-fd189da456d2/extract-utilities/0.log" Dec 05 21:48:55 crc kubenswrapper[4904]: I1205 21:48:55.172872 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fhhfn_81188cfd-eef2-4e0b-b04a-fd189da456d2/extract-content/0.log" Dec 05 21:48:55 crc kubenswrapper[4904]: I1205 21:48:55.396699 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-7wzh2_425f2b0f-3e5c-4db8-95f2-e0ae6581a443/marketplace-operator/0.log" Dec 05 21:48:55 crc kubenswrapper[4904]: I1205 21:48:55.675636 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ffsws_3124ecfc-d81f-468a-89e6-1b76cdf4e61e/extract-utilities/0.log" Dec 05 21:48:55 crc kubenswrapper[4904]: I1205 21:48:55.799108 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ffsws_3124ecfc-d81f-468a-89e6-1b76cdf4e61e/extract-utilities/0.log" Dec 05 21:48:55 crc kubenswrapper[4904]: I1205 21:48:55.840918 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ffsws_3124ecfc-d81f-468a-89e6-1b76cdf4e61e/extract-content/0.log" Dec 05 21:48:55 crc kubenswrapper[4904]: I1205 21:48:55.919999 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ffsws_3124ecfc-d81f-468a-89e6-1b76cdf4e61e/extract-content/0.log" Dec 05 21:48:56 crc kubenswrapper[4904]: I1205 21:48:56.019990 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fhhfn_81188cfd-eef2-4e0b-b04a-fd189da456d2/registry-server/0.log" Dec 05 21:48:56 crc kubenswrapper[4904]: I1205 21:48:56.209932 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ffsws_3124ecfc-d81f-468a-89e6-1b76cdf4e61e/extract-content/0.log" Dec 05 21:48:56 crc kubenswrapper[4904]: I1205 21:48:56.220188 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ffsws_3124ecfc-d81f-468a-89e6-1b76cdf4e61e/extract-utilities/0.log" Dec 05 21:48:56 crc kubenswrapper[4904]: I1205 21:48:56.434012 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-957zc_f1995f9a-4194-4318-b9ab-b30b6c01ac51/extract-utilities/0.log" Dec 05 21:48:56 crc kubenswrapper[4904]: I1205 21:48:56.628544 4904 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-ffsws_3124ecfc-d81f-468a-89e6-1b76cdf4e61e/registry-server/0.log" Dec 05 21:48:56 crc kubenswrapper[4904]: I1205 21:48:56.655549 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-957zc_f1995f9a-4194-4318-b9ab-b30b6c01ac51/extract-utilities/0.log" Dec 05 21:48:56 crc kubenswrapper[4904]: I1205 21:48:56.660556 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-957zc_f1995f9a-4194-4318-b9ab-b30b6c01ac51/extract-content/0.log" Dec 05 21:48:56 crc kubenswrapper[4904]: I1205 21:48:56.712626 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-957zc_f1995f9a-4194-4318-b9ab-b30b6c01ac51/extract-content/0.log" Dec 05 21:48:56 crc kubenswrapper[4904]: I1205 21:48:56.864846 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-957zc_f1995f9a-4194-4318-b9ab-b30b6c01ac51/extract-content/0.log" Dec 05 21:48:56 crc kubenswrapper[4904]: I1205 21:48:56.865117 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-957zc_f1995f9a-4194-4318-b9ab-b30b6c01ac51/extract-utilities/0.log" Dec 05 21:48:57 crc kubenswrapper[4904]: I1205 21:48:57.523425 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-957zc_f1995f9a-4194-4318-b9ab-b30b6c01ac51/registry-server/0.log" Dec 05 21:49:08 crc kubenswrapper[4904]: I1205 21:49:08.772975 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-52c6d_c290c96d-3acd-4374-91d7-20efcef53eda/prometheus-operator/0.log" Dec 05 21:49:08 crc kubenswrapper[4904]: I1205 21:49:08.908105 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6bdfbd597-b7v44_a54452d0-1ba5-4b81-aab4-2e2f2293fa6b/prometheus-operator-admission-webhook/0.log" Dec 05 21:49:08 crc kubenswrapper[4904]: I1205 21:49:08.976452 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6bdfbd597-qxbbm_6b774de0-c1b7-43c1-86b5-b444cc0275d4/prometheus-operator-admission-webhook/0.log" Dec 05 21:49:09 crc kubenswrapper[4904]: I1205 21:49:09.161923 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-vg7c2_14ec1958-2889-4fef-90ee-e73296264291/operator/0.log" Dec 05 21:49:09 crc kubenswrapper[4904]: I1205 21:49:09.166026 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-8pj8n_7d6cdfc6-4b45-4cc3-98c1-943ee5e7cf3b/perses-operator/0.log" Dec 05 21:49:32 crc kubenswrapper[4904]: E1205 21:49:32.397020 4904 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.166:48784->38.102.83.166:45757: write tcp 38.102.83.166:48784->38.102.83.166:45757: write: broken pipe Dec 05 21:50:59 crc kubenswrapper[4904]: I1205 21:50:59.955705 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:50:59 crc kubenswrapper[4904]: I1205 21:50:59.956490 4904 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:51:07 crc kubenswrapper[4904]: I1205 21:51:07.931998 4904 generic.go:334] "Generic (PLEG): container finished" podID="c22d3c1c-09ee-4d2a-a15d-b98f74b5e906" containerID="5a9b3f913d1f59a839680fc798dc0802d9fd6342319192b91afca374b78ce88a" exitCode=0 Dec 05 21:51:07 crc kubenswrapper[4904]: I1205 21:51:07.932179 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pspg6/must-gather-hsd7g" event={"ID":"c22d3c1c-09ee-4d2a-a15d-b98f74b5e906","Type":"ContainerDied","Data":"5a9b3f913d1f59a839680fc798dc0802d9fd6342319192b91afca374b78ce88a"} Dec 05 21:51:07 crc kubenswrapper[4904]: I1205 21:51:07.933101 4904 scope.go:117] "RemoveContainer" containerID="5a9b3f913d1f59a839680fc798dc0802d9fd6342319192b91afca374b78ce88a" Dec 05 21:51:08 crc kubenswrapper[4904]: I1205 21:51:08.473396 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pspg6_must-gather-hsd7g_c22d3c1c-09ee-4d2a-a15d-b98f74b5e906/gather/0.log" Dec 05 21:51:16 crc kubenswrapper[4904]: I1205 21:51:16.432380 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pspg6/must-gather-hsd7g"] Dec 05 21:51:16 crc kubenswrapper[4904]: I1205 21:51:16.433496 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-pspg6/must-gather-hsd7g" podUID="c22d3c1c-09ee-4d2a-a15d-b98f74b5e906" containerName="copy" containerID="cri-o://ade9f40048314d1f8b5abfaa6e1f508b3982e654004fc564e20161e265c51aa9" gracePeriod=2 Dec 05 21:51:16 crc kubenswrapper[4904]: I1205 21:51:16.442430 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pspg6/must-gather-hsd7g"] Dec 05 21:51:16 crc kubenswrapper[4904]: I1205 21:51:16.911879 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pspg6_must-gather-hsd7g_c22d3c1c-09ee-4d2a-a15d-b98f74b5e906/copy/0.log" Dec 05 21:51:16 crc kubenswrapper[4904]: I1205 21:51:16.912740 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pspg6/must-gather-hsd7g" Dec 05 21:51:17 crc kubenswrapper[4904]: I1205 21:51:17.019925 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pspg6_must-gather-hsd7g_c22d3c1c-09ee-4d2a-a15d-b98f74b5e906/copy/0.log" Dec 05 21:51:17 crc kubenswrapper[4904]: I1205 21:51:17.020414 4904 generic.go:334] "Generic (PLEG): container finished" podID="c22d3c1c-09ee-4d2a-a15d-b98f74b5e906" containerID="ade9f40048314d1f8b5abfaa6e1f508b3982e654004fc564e20161e265c51aa9" exitCode=143 Dec 05 21:51:17 crc kubenswrapper[4904]: I1205 21:51:17.020468 4904 scope.go:117] "RemoveContainer" containerID="ade9f40048314d1f8b5abfaa6e1f508b3982e654004fc564e20161e265c51aa9" Dec 05 21:51:17 crc kubenswrapper[4904]: I1205 21:51:17.020539 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pspg6/must-gather-hsd7g" Dec 05 21:51:17 crc kubenswrapper[4904]: I1205 21:51:17.039873 4904 scope.go:117] "RemoveContainer" containerID="5a9b3f913d1f59a839680fc798dc0802d9fd6342319192b91afca374b78ce88a" Dec 05 21:51:17 crc kubenswrapper[4904]: I1205 21:51:17.064118 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c22d3c1c-09ee-4d2a-a15d-b98f74b5e906-must-gather-output\") pod \"c22d3c1c-09ee-4d2a-a15d-b98f74b5e906\" (UID: \"c22d3c1c-09ee-4d2a-a15d-b98f74b5e906\") " Dec 05 21:51:17 crc kubenswrapper[4904]: I1205 21:51:17.064271 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcbfs\" (UniqueName: \"kubernetes.io/projected/c22d3c1c-09ee-4d2a-a15d-b98f74b5e906-kube-api-access-fcbfs\") pod \"c22d3c1c-09ee-4d2a-a15d-b98f74b5e906\" (UID: \"c22d3c1c-09ee-4d2a-a15d-b98f74b5e906\") " Dec 05 21:51:17 crc kubenswrapper[4904]: I1205 21:51:17.071388 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c22d3c1c-09ee-4d2a-a15d-b98f74b5e906-kube-api-access-fcbfs" (OuterVolumeSpecName: "kube-api-access-fcbfs") pod "c22d3c1c-09ee-4d2a-a15d-b98f74b5e906" (UID: "c22d3c1c-09ee-4d2a-a15d-b98f74b5e906"). InnerVolumeSpecName "kube-api-access-fcbfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:51:17 crc kubenswrapper[4904]: I1205 21:51:17.087321 4904 scope.go:117] "RemoveContainer" containerID="ade9f40048314d1f8b5abfaa6e1f508b3982e654004fc564e20161e265c51aa9" Dec 05 21:51:17 crc kubenswrapper[4904]: E1205 21:51:17.087761 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ade9f40048314d1f8b5abfaa6e1f508b3982e654004fc564e20161e265c51aa9\": container with ID starting with ade9f40048314d1f8b5abfaa6e1f508b3982e654004fc564e20161e265c51aa9 not found: ID does not exist" containerID="ade9f40048314d1f8b5abfaa6e1f508b3982e654004fc564e20161e265c51aa9" Dec 05 21:51:17 crc kubenswrapper[4904]: I1205 21:51:17.087840 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ade9f40048314d1f8b5abfaa6e1f508b3982e654004fc564e20161e265c51aa9"} err="failed to get container status \"ade9f40048314d1f8b5abfaa6e1f508b3982e654004fc564e20161e265c51aa9\": rpc error: code = NotFound desc = could not find container \"ade9f40048314d1f8b5abfaa6e1f508b3982e654004fc564e20161e265c51aa9\": container with ID starting with ade9f40048314d1f8b5abfaa6e1f508b3982e654004fc564e20161e265c51aa9 not found: ID does not exist" Dec 05 21:51:17 crc kubenswrapper[4904]: I1205 21:51:17.087873 4904 scope.go:117] "RemoveContainer" containerID="5a9b3f913d1f59a839680fc798dc0802d9fd6342319192b91afca374b78ce88a" Dec 05 21:51:17 crc kubenswrapper[4904]: E1205 21:51:17.088341 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a9b3f913d1f59a839680fc798dc0802d9fd6342319192b91afca374b78ce88a\": container with ID starting with 5a9b3f913d1f59a839680fc798dc0802d9fd6342319192b91afca374b78ce88a not found: ID does not exist" containerID="5a9b3f913d1f59a839680fc798dc0802d9fd6342319192b91afca374b78ce88a" Dec 05 21:51:17 crc kubenswrapper[4904]: I1205 21:51:17.088370 4904 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5a9b3f913d1f59a839680fc798dc0802d9fd6342319192b91afca374b78ce88a"} err="failed to get container status \"5a9b3f913d1f59a839680fc798dc0802d9fd6342319192b91afca374b78ce88a\": rpc error: code = NotFound desc = could not find container \"5a9b3f913d1f59a839680fc798dc0802d9fd6342319192b91afca374b78ce88a\": container with ID starting with 5a9b3f913d1f59a839680fc798dc0802d9fd6342319192b91afca374b78ce88a not found: ID does not exist" Dec 05 21:51:17 crc kubenswrapper[4904]: I1205 21:51:17.166399 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcbfs\" (UniqueName: \"kubernetes.io/projected/c22d3c1c-09ee-4d2a-a15d-b98f74b5e906-kube-api-access-fcbfs\") on node \"crc\" DevicePath \"\"" Dec 05 21:51:17 crc kubenswrapper[4904]: I1205 21:51:17.273795 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c22d3c1c-09ee-4d2a-a15d-b98f74b5e906-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "c22d3c1c-09ee-4d2a-a15d-b98f74b5e906" (UID: "c22d3c1c-09ee-4d2a-a15d-b98f74b5e906"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:51:17 crc kubenswrapper[4904]: I1205 21:51:17.369686 4904 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c22d3c1c-09ee-4d2a-a15d-b98f74b5e906-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 05 21:51:17 crc kubenswrapper[4904]: I1205 21:51:17.693053 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c22d3c1c-09ee-4d2a-a15d-b98f74b5e906" path="/var/lib/kubelet/pods/c22d3c1c-09ee-4d2a-a15d-b98f74b5e906/volumes" Dec 05 21:51:29 crc kubenswrapper[4904]: I1205 21:51:29.303289 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-x2dmw"] Dec 05 21:51:29 crc kubenswrapper[4904]: E1205 21:51:29.304466 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c22d3c1c-09ee-4d2a-a15d-b98f74b5e906" containerName="gather" Dec 05 21:51:29 crc kubenswrapper[4904]: I1205 21:51:29.304485 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="c22d3c1c-09ee-4d2a-a15d-b98f74b5e906" containerName="gather" Dec 05 21:51:29 crc kubenswrapper[4904]: E1205 21:51:29.304513 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c22d3c1c-09ee-4d2a-a15d-b98f74b5e906" containerName="copy" Dec 05 21:51:29 crc kubenswrapper[4904]: I1205 21:51:29.304520 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="c22d3c1c-09ee-4d2a-a15d-b98f74b5e906" containerName="copy" Dec 05 21:51:29 crc kubenswrapper[4904]: E1205 21:51:29.304543 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce5512e8-d105-44d3-bd4b-01a180b954c2" containerName="extract-content" Dec 05 21:51:29 crc kubenswrapper[4904]: I1205 21:51:29.304551 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce5512e8-d105-44d3-bd4b-01a180b954c2" containerName="extract-content" Dec 05 21:51:29 crc kubenswrapper[4904]: E1205 21:51:29.304562 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce5512e8-d105-44d3-bd4b-01a180b954c2" containerName="registry-server" Dec 05 21:51:29 crc kubenswrapper[4904]: I1205 21:51:29.304572 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce5512e8-d105-44d3-bd4b-01a180b954c2" containerName="registry-server" Dec 05 21:51:29 crc kubenswrapper[4904]: E1205 21:51:29.304598 4904 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="ce5512e8-d105-44d3-bd4b-01a180b954c2" containerName="extract-utilities" Dec 05 21:51:29 crc kubenswrapper[4904]: I1205 21:51:29.304606 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce5512e8-d105-44d3-bd4b-01a180b954c2" containerName="extract-utilities" Dec 05 21:51:29 crc kubenswrapper[4904]: I1205 21:51:29.304835 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="c22d3c1c-09ee-4d2a-a15d-b98f74b5e906" containerName="gather" Dec 05 21:51:29 crc kubenswrapper[4904]: I1205 21:51:29.304854 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce5512e8-d105-44d3-bd4b-01a180b954c2" containerName="registry-server" Dec 05 21:51:29 crc kubenswrapper[4904]: I1205 21:51:29.304868 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="c22d3c1c-09ee-4d2a-a15d-b98f74b5e906" containerName="copy" Dec 05 21:51:29 crc kubenswrapper[4904]: I1205 21:51:29.306469 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x2dmw" Dec 05 21:51:29 crc kubenswrapper[4904]: I1205 21:51:29.318622 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x2dmw"] Dec 05 21:51:29 crc kubenswrapper[4904]: I1205 21:51:29.416813 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75329d67-f5de-4673-b313-d86b99800982-catalog-content\") pod \"redhat-operators-x2dmw\" (UID: \"75329d67-f5de-4673-b313-d86b99800982\") " pod="openshift-marketplace/redhat-operators-x2dmw" Dec 05 21:51:29 crc kubenswrapper[4904]: I1205 21:51:29.416940 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75329d67-f5de-4673-b313-d86b99800982-utilities\") pod \"redhat-operators-x2dmw\" (UID: \"75329d67-f5de-4673-b313-d86b99800982\") " pod="openshift-marketplace/redhat-operators-x2dmw" Dec 05 21:51:29 crc kubenswrapper[4904]: I1205 21:51:29.417342 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6ml6\" (UniqueName: \"kubernetes.io/projected/75329d67-f5de-4673-b313-d86b99800982-kube-api-access-v6ml6\") pod \"redhat-operators-x2dmw\" (UID: \"75329d67-f5de-4673-b313-d86b99800982\") " pod="openshift-marketplace/redhat-operators-x2dmw" Dec 05 21:51:29 crc kubenswrapper[4904]: I1205 21:51:29.519248 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75329d67-f5de-4673-b313-d86b99800982-catalog-content\") pod \"redhat-operators-x2dmw\" (UID: \"75329d67-f5de-4673-b313-d86b99800982\") " pod="openshift-marketplace/redhat-operators-x2dmw" Dec 05 21:51:29 crc kubenswrapper[4904]: I1205 21:51:29.519379 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75329d67-f5de-4673-b313-d86b99800982-utilities\") pod \"redhat-operators-x2dmw\" (UID: \"75329d67-f5de-4673-b313-d86b99800982\") " pod="openshift-marketplace/redhat-operators-x2dmw" Dec 05 21:51:29 crc kubenswrapper[4904]: I1205 21:51:29.519454 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6ml6\" (UniqueName: \"kubernetes.io/projected/75329d67-f5de-4673-b313-d86b99800982-kube-api-access-v6ml6\") pod \"redhat-operators-x2dmw\" (UID: 
\"75329d67-f5de-4673-b313-d86b99800982\") " pod="openshift-marketplace/redhat-operators-x2dmw" Dec 05 21:51:29 crc kubenswrapper[4904]: I1205 21:51:29.519781 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75329d67-f5de-4673-b313-d86b99800982-catalog-content\") pod \"redhat-operators-x2dmw\" (UID: \"75329d67-f5de-4673-b313-d86b99800982\") " pod="openshift-marketplace/redhat-operators-x2dmw" Dec 05 21:51:29 crc kubenswrapper[4904]: I1205 21:51:29.519853 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75329d67-f5de-4673-b313-d86b99800982-utilities\") pod \"redhat-operators-x2dmw\" (UID: \"75329d67-f5de-4673-b313-d86b99800982\") " pod="openshift-marketplace/redhat-operators-x2dmw" Dec 05 21:51:29 crc kubenswrapper[4904]: I1205 21:51:29.550354 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6ml6\" (UniqueName: \"kubernetes.io/projected/75329d67-f5de-4673-b313-d86b99800982-kube-api-access-v6ml6\") pod \"redhat-operators-x2dmw\" (UID: \"75329d67-f5de-4673-b313-d86b99800982\") " pod="openshift-marketplace/redhat-operators-x2dmw" Dec 05 21:51:29 crc kubenswrapper[4904]: I1205 21:51:29.681530 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x2dmw" Dec 05 21:51:29 crc kubenswrapper[4904]: I1205 21:51:29.955977 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:51:29 crc kubenswrapper[4904]: I1205 21:51:29.956328 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:51:30 crc kubenswrapper[4904]: I1205 21:51:30.207602 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x2dmw"] Dec 05 21:51:31 crc kubenswrapper[4904]: I1205 21:51:31.167286 4904 generic.go:334] "Generic (PLEG): container finished" podID="75329d67-f5de-4673-b313-d86b99800982" containerID="d1104e04592b41a217dca509e96e6b0fca475c66b339996c013ddbbe840f6f82" exitCode=0 Dec 05 21:51:31 crc kubenswrapper[4904]: I1205 21:51:31.167544 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x2dmw" event={"ID":"75329d67-f5de-4673-b313-d86b99800982","Type":"ContainerDied","Data":"d1104e04592b41a217dca509e96e6b0fca475c66b339996c013ddbbe840f6f82"} Dec 05 21:51:31 crc kubenswrapper[4904]: I1205 21:51:31.167575 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x2dmw" event={"ID":"75329d67-f5de-4673-b313-d86b99800982","Type":"ContainerStarted","Data":"04a6c8ce18a0dc1e1a03089a7c611d8d22df19eacff1941b5bd2887930ad34ed"} Dec 05 21:51:31 crc kubenswrapper[4904]: I1205 21:51:31.169588 4904 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 21:51:32 crc kubenswrapper[4904]: I1205 21:51:32.176822 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-x2dmw" event={"ID":"75329d67-f5de-4673-b313-d86b99800982","Type":"ContainerStarted","Data":"d22daedd4752af32d200847ce107fe7059202f2e68d94fe220347103a609d020"} Dec 05 21:51:34 crc kubenswrapper[4904]: I1205 21:51:34.204636 4904 generic.go:334] "Generic (PLEG): container finished" podID="75329d67-f5de-4673-b313-d86b99800982" containerID="d22daedd4752af32d200847ce107fe7059202f2e68d94fe220347103a609d020" exitCode=0 Dec 05 21:51:34 crc kubenswrapper[4904]: I1205 21:51:34.205475 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x2dmw" event={"ID":"75329d67-f5de-4673-b313-d86b99800982","Type":"ContainerDied","Data":"d22daedd4752af32d200847ce107fe7059202f2e68d94fe220347103a609d020"} Dec 05 21:51:35 crc kubenswrapper[4904]: I1205 21:51:35.218266 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x2dmw" event={"ID":"75329d67-f5de-4673-b313-d86b99800982","Type":"ContainerStarted","Data":"ed1a20443b549cd8a152ed5edb16bce645e5cc32bc1c7738eed0849cbb5b18cf"} Dec 05 21:51:35 crc kubenswrapper[4904]: I1205 21:51:35.238539 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-x2dmw" podStartSLOduration=2.565562882 podStartE2EDuration="6.238524118s" podCreationTimestamp="2025-12-05 21:51:29 +0000 UTC" firstStartedPulling="2025-12-05 21:51:31.169350541 +0000 UTC m=+5989.980566651" lastFinishedPulling="2025-12-05 21:51:34.842311778 +0000 UTC m=+5993.653527887" observedRunningTime="2025-12-05 21:51:35.233392548 +0000 UTC m=+5994.044608677" watchObservedRunningTime="2025-12-05 21:51:35.238524118 +0000 UTC m=+5994.049740227" Dec 05 21:51:39 crc kubenswrapper[4904]: I1205 21:51:39.693268 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-x2dmw" Dec 05 21:51:39 crc kubenswrapper[4904]: I1205 21:51:39.693766 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-x2dmw" Dec 05 21:51:40 crc kubenswrapper[4904]: I1205 21:51:40.736457 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-x2dmw" podUID="75329d67-f5de-4673-b313-d86b99800982" containerName="registry-server" probeResult="failure" output=< Dec 05 21:51:40 crc kubenswrapper[4904]: timeout: failed to connect service ":50051" within 1s Dec 05 21:51:40 crc kubenswrapper[4904]: > Dec 05 21:51:49 crc kubenswrapper[4904]: I1205 21:51:49.737690 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-x2dmw" Dec 05 21:51:49 crc kubenswrapper[4904]: I1205 21:51:49.802177 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-x2dmw" Dec 05 21:51:49 crc kubenswrapper[4904]: I1205 21:51:49.973994 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x2dmw"] Dec 05 21:51:51 crc kubenswrapper[4904]: I1205 21:51:51.418472 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-x2dmw" podUID="75329d67-f5de-4673-b313-d86b99800982" containerName="registry-server" containerID="cri-o://ed1a20443b549cd8a152ed5edb16bce645e5cc32bc1c7738eed0849cbb5b18cf" gracePeriod=2 Dec 05 21:51:51 crc kubenswrapper[4904]: I1205 21:51:51.935815 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-x2dmw" Dec 05 21:51:52 crc kubenswrapper[4904]: I1205 21:51:52.113096 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75329d67-f5de-4673-b313-d86b99800982-catalog-content\") pod \"75329d67-f5de-4673-b313-d86b99800982\" (UID: \"75329d67-f5de-4673-b313-d86b99800982\") " Dec 05 21:51:52 crc kubenswrapper[4904]: I1205 21:51:52.113429 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75329d67-f5de-4673-b313-d86b99800982-utilities\") pod \"75329d67-f5de-4673-b313-d86b99800982\" (UID: \"75329d67-f5de-4673-b313-d86b99800982\") " Dec 05 21:51:52 crc kubenswrapper[4904]: I1205 21:51:52.113469 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6ml6\" (UniqueName: \"kubernetes.io/projected/75329d67-f5de-4673-b313-d86b99800982-kube-api-access-v6ml6\") pod \"75329d67-f5de-4673-b313-d86b99800982\" (UID: \"75329d67-f5de-4673-b313-d86b99800982\") " Dec 05 21:51:52 crc kubenswrapper[4904]: I1205 21:51:52.114396 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75329d67-f5de-4673-b313-d86b99800982-utilities" (OuterVolumeSpecName: "utilities") pod "75329d67-f5de-4673-b313-d86b99800982" (UID: "75329d67-f5de-4673-b313-d86b99800982"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:51:52 crc kubenswrapper[4904]: I1205 21:51:52.125419 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75329d67-f5de-4673-b313-d86b99800982-kube-api-access-v6ml6" (OuterVolumeSpecName: "kube-api-access-v6ml6") pod "75329d67-f5de-4673-b313-d86b99800982" (UID: "75329d67-f5de-4673-b313-d86b99800982"). InnerVolumeSpecName "kube-api-access-v6ml6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:51:52 crc kubenswrapper[4904]: I1205 21:51:52.216356 4904 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75329d67-f5de-4673-b313-d86b99800982-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 21:51:52 crc kubenswrapper[4904]: I1205 21:51:52.216387 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6ml6\" (UniqueName: \"kubernetes.io/projected/75329d67-f5de-4673-b313-d86b99800982-kube-api-access-v6ml6\") on node \"crc\" DevicePath \"\"" Dec 05 21:51:52 crc kubenswrapper[4904]: I1205 21:51:52.220718 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75329d67-f5de-4673-b313-d86b99800982-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "75329d67-f5de-4673-b313-d86b99800982" (UID: "75329d67-f5de-4673-b313-d86b99800982"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:51:52 crc kubenswrapper[4904]: I1205 21:51:52.318488 4904 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75329d67-f5de-4673-b313-d86b99800982-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 21:51:52 crc kubenswrapper[4904]: I1205 21:51:52.431714 4904 generic.go:334] "Generic (PLEG): container finished" podID="75329d67-f5de-4673-b313-d86b99800982" containerID="ed1a20443b549cd8a152ed5edb16bce645e5cc32bc1c7738eed0849cbb5b18cf" exitCode=0 Dec 05 21:51:52 crc kubenswrapper[4904]: I1205 21:51:52.431759 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x2dmw" event={"ID":"75329d67-f5de-4673-b313-d86b99800982","Type":"ContainerDied","Data":"ed1a20443b549cd8a152ed5edb16bce645e5cc32bc1c7738eed0849cbb5b18cf"} Dec 05 21:51:52 crc kubenswrapper[4904]: I1205 21:51:52.431790 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x2dmw" event={"ID":"75329d67-f5de-4673-b313-d86b99800982","Type":"ContainerDied","Data":"04a6c8ce18a0dc1e1a03089a7c611d8d22df19eacff1941b5bd2887930ad34ed"} Dec 05 21:51:52 crc kubenswrapper[4904]: I1205 21:51:52.431811 4904 scope.go:117] "RemoveContainer" containerID="ed1a20443b549cd8a152ed5edb16bce645e5cc32bc1c7738eed0849cbb5b18cf" Dec 05 21:51:52 crc kubenswrapper[4904]: I1205 21:51:52.431817 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x2dmw" Dec 05 21:51:52 crc kubenswrapper[4904]: I1205 21:51:52.451590 4904 scope.go:117] "RemoveContainer" containerID="d22daedd4752af32d200847ce107fe7059202f2e68d94fe220347103a609d020" Dec 05 21:51:52 crc kubenswrapper[4904]: I1205 21:51:52.484647 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x2dmw"] Dec 05 21:51:52 crc kubenswrapper[4904]: I1205 21:51:52.497360 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-x2dmw"] Dec 05 21:51:52 crc kubenswrapper[4904]: I1205 21:51:52.501778 4904 scope.go:117] "RemoveContainer" containerID="d1104e04592b41a217dca509e96e6b0fca475c66b339996c013ddbbe840f6f82" Dec 05 21:51:52 crc kubenswrapper[4904]: I1205 21:51:52.528734 4904 scope.go:117] "RemoveContainer" containerID="ed1a20443b549cd8a152ed5edb16bce645e5cc32bc1c7738eed0849cbb5b18cf" Dec 05 21:51:52 crc kubenswrapper[4904]: E1205 21:51:52.529142 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed1a20443b549cd8a152ed5edb16bce645e5cc32bc1c7738eed0849cbb5b18cf\": container with ID starting with ed1a20443b549cd8a152ed5edb16bce645e5cc32bc1c7738eed0849cbb5b18cf not found: ID does not exist" containerID="ed1a20443b549cd8a152ed5edb16bce645e5cc32bc1c7738eed0849cbb5b18cf" Dec 05 21:51:52 crc kubenswrapper[4904]: I1205 21:51:52.529170 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed1a20443b549cd8a152ed5edb16bce645e5cc32bc1c7738eed0849cbb5b18cf"} err="failed to get container status \"ed1a20443b549cd8a152ed5edb16bce645e5cc32bc1c7738eed0849cbb5b18cf\": rpc error: code = NotFound desc = could not find container \"ed1a20443b549cd8a152ed5edb16bce645e5cc32bc1c7738eed0849cbb5b18cf\": container with ID starting with ed1a20443b549cd8a152ed5edb16bce645e5cc32bc1c7738eed0849cbb5b18cf not found: ID does not exist" Dec 05 21:51:52 crc 
kubenswrapper[4904]: I1205 21:51:52.529191 4904 scope.go:117] "RemoveContainer" containerID="d22daedd4752af32d200847ce107fe7059202f2e68d94fe220347103a609d020" Dec 05 21:51:52 crc kubenswrapper[4904]: E1205 21:51:52.531694 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d22daedd4752af32d200847ce107fe7059202f2e68d94fe220347103a609d020\": container with ID starting with d22daedd4752af32d200847ce107fe7059202f2e68d94fe220347103a609d020 not found: ID does not exist" containerID="d22daedd4752af32d200847ce107fe7059202f2e68d94fe220347103a609d020" Dec 05 21:51:52 crc kubenswrapper[4904]: I1205 21:51:52.532013 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d22daedd4752af32d200847ce107fe7059202f2e68d94fe220347103a609d020"} err="failed to get container status \"d22daedd4752af32d200847ce107fe7059202f2e68d94fe220347103a609d020\": rpc error: code = NotFound desc = could not find container \"d22daedd4752af32d200847ce107fe7059202f2e68d94fe220347103a609d020\": container with ID starting with d22daedd4752af32d200847ce107fe7059202f2e68d94fe220347103a609d020 not found: ID does not exist" Dec 05 21:51:52 crc kubenswrapper[4904]: I1205 21:51:52.532031 4904 scope.go:117] "RemoveContainer" containerID="d1104e04592b41a217dca509e96e6b0fca475c66b339996c013ddbbe840f6f82" Dec 05 21:51:52 crc kubenswrapper[4904]: E1205 21:51:52.532321 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1104e04592b41a217dca509e96e6b0fca475c66b339996c013ddbbe840f6f82\": container with ID starting with d1104e04592b41a217dca509e96e6b0fca475c66b339996c013ddbbe840f6f82 not found: ID does not exist" containerID="d1104e04592b41a217dca509e96e6b0fca475c66b339996c013ddbbe840f6f82" Dec 05 21:51:52 crc kubenswrapper[4904]: I1205 21:51:52.532362 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1104e04592b41a217dca509e96e6b0fca475c66b339996c013ddbbe840f6f82"} err="failed to get container status \"d1104e04592b41a217dca509e96e6b0fca475c66b339996c013ddbbe840f6f82\": rpc error: code = NotFound desc = could not find container \"d1104e04592b41a217dca509e96e6b0fca475c66b339996c013ddbbe840f6f82\": container with ID starting with d1104e04592b41a217dca509e96e6b0fca475c66b339996c013ddbbe840f6f82 not found: ID does not exist" Dec 05 21:51:53 crc kubenswrapper[4904]: I1205 21:51:53.695685 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75329d67-f5de-4673-b313-d86b99800982" path="/var/lib/kubelet/pods/75329d67-f5de-4673-b313-d86b99800982/volumes" Dec 05 21:51:59 crc kubenswrapper[4904]: I1205 21:51:59.955738 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:51:59 crc kubenswrapper[4904]: I1205 21:51:59.956255 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:51:59 crc kubenswrapper[4904]: I1205 21:51:59.956295 4904 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" Dec 05 21:51:59 crc kubenswrapper[4904]: I1205 21:51:59.956958 4904 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"afe55cb29b17c79ab9c5d0540b6c9a9a3a478a257da4a8d2c668853c76a15473"} pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 21:51:59 crc kubenswrapper[4904]: I1205 21:51:59.956998 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" containerID="cri-o://afe55cb29b17c79ab9c5d0540b6c9a9a3a478a257da4a8d2c668853c76a15473" gracePeriod=600 Dec 05 21:52:00 crc kubenswrapper[4904]: E1205 21:52:00.078045 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:52:00 crc kubenswrapper[4904]: I1205 21:52:00.596434 4904 generic.go:334] "Generic (PLEG): container finished" podID="1cc24b64-e25f-4b55-9123-295388685e7a" containerID="afe55cb29b17c79ab9c5d0540b6c9a9a3a478a257da4a8d2c668853c76a15473" exitCode=0 Dec 05 21:52:00 crc kubenswrapper[4904]: I1205 21:52:00.596973 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" event={"ID":"1cc24b64-e25f-4b55-9123-295388685e7a","Type":"ContainerDied","Data":"afe55cb29b17c79ab9c5d0540b6c9a9a3a478a257da4a8d2c668853c76a15473"} Dec 05 21:52:00 crc kubenswrapper[4904]: I1205 21:52:00.597027 4904 scope.go:117] "RemoveContainer" containerID="f4149179d4736f80eac61f13fa0487b54479026079b2ff33c6d0c328533eef98" Dec 05 21:52:00 crc kubenswrapper[4904]: I1205 21:52:00.597682 4904 scope.go:117] "RemoveContainer" containerID="afe55cb29b17c79ab9c5d0540b6c9a9a3a478a257da4a8d2c668853c76a15473" Dec 05 21:52:00 crc kubenswrapper[4904]: E1205 21:52:00.598101 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:52:11 crc kubenswrapper[4904]: I1205 21:52:11.688299 4904 scope.go:117] "RemoveContainer" containerID="afe55cb29b17c79ab9c5d0540b6c9a9a3a478a257da4a8d2c668853c76a15473" Dec 05 21:52:11 crc kubenswrapper[4904]: E1205 21:52:11.689731 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:52:26 
crc kubenswrapper[4904]: I1205 21:52:26.682435 4904 scope.go:117] "RemoveContainer" containerID="afe55cb29b17c79ab9c5d0540b6c9a9a3a478a257da4a8d2c668853c76a15473" Dec 05 21:52:26 crc kubenswrapper[4904]: E1205 21:52:26.683423 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:52:39 crc kubenswrapper[4904]: I1205 21:52:39.685088 4904 scope.go:117] "RemoveContainer" containerID="afe55cb29b17c79ab9c5d0540b6c9a9a3a478a257da4a8d2c668853c76a15473" Dec 05 21:52:39 crc kubenswrapper[4904]: E1205 21:52:39.686082 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:52:51 crc kubenswrapper[4904]: I1205 21:52:51.688734 4904 scope.go:117] "RemoveContainer" containerID="afe55cb29b17c79ab9c5d0540b6c9a9a3a478a257da4a8d2c668853c76a15473" Dec 05 21:52:51 crc kubenswrapper[4904]: E1205 21:52:51.689484 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:53:05 crc kubenswrapper[4904]: I1205 21:53:05.681790 4904 scope.go:117] "RemoveContainer" containerID="afe55cb29b17c79ab9c5d0540b6c9a9a3a478a257da4a8d2c668853c76a15473" Dec 05 21:53:05 crc kubenswrapper[4904]: E1205 21:53:05.683619 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:53:19 crc kubenswrapper[4904]: I1205 21:53:19.682782 4904 scope.go:117] "RemoveContainer" containerID="afe55cb29b17c79ab9c5d0540b6c9a9a3a478a257da4a8d2c668853c76a15473" Dec 05 21:53:19 crc kubenswrapper[4904]: E1205 21:53:19.683726 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:53:30 crc kubenswrapper[4904]: I1205 21:53:30.681542 4904 scope.go:117] "RemoveContainer" containerID="afe55cb29b17c79ab9c5d0540b6c9a9a3a478a257da4a8d2c668853c76a15473" Dec 05 21:53:30 crc 
kubenswrapper[4904]: E1205 21:53:30.683397 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:53:41 crc kubenswrapper[4904]: I1205 21:53:41.689565 4904 scope.go:117] "RemoveContainer" containerID="afe55cb29b17c79ab9c5d0540b6c9a9a3a478a257da4a8d2c668853c76a15473" Dec 05 21:53:41 crc kubenswrapper[4904]: E1205 21:53:41.690447 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:53:55 crc kubenswrapper[4904]: I1205 21:53:55.681833 4904 scope.go:117] "RemoveContainer" containerID="afe55cb29b17c79ab9c5d0540b6c9a9a3a478a257da4a8d2c668853c76a15473" Dec 05 21:53:55 crc kubenswrapper[4904]: E1205 21:53:55.682701 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:53:56 crc kubenswrapper[4904]: I1205 21:53:56.776303 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x699t"] Dec 05 21:53:56 crc kubenswrapper[4904]: E1205 21:53:56.778144 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75329d67-f5de-4673-b313-d86b99800982" containerName="registry-server" Dec 05 21:53:56 crc kubenswrapper[4904]: I1205 21:53:56.778301 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="75329d67-f5de-4673-b313-d86b99800982" containerName="registry-server" Dec 05 21:53:56 crc kubenswrapper[4904]: E1205 21:53:56.778442 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75329d67-f5de-4673-b313-d86b99800982" containerName="extract-utilities" Dec 05 21:53:56 crc kubenswrapper[4904]: I1205 21:53:56.778555 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="75329d67-f5de-4673-b313-d86b99800982" containerName="extract-utilities" Dec 05 21:53:56 crc kubenswrapper[4904]: E1205 21:53:56.778682 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75329d67-f5de-4673-b313-d86b99800982" containerName="extract-content" Dec 05 21:53:56 crc kubenswrapper[4904]: I1205 21:53:56.778795 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="75329d67-f5de-4673-b313-d86b99800982" containerName="extract-content" Dec 05 21:53:56 crc kubenswrapper[4904]: I1205 21:53:56.779294 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="75329d67-f5de-4673-b313-d86b99800982" containerName="registry-server" Dec 05 21:53:56 crc kubenswrapper[4904]: I1205 21:53:56.787077 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x699t" Dec 05 21:53:56 crc kubenswrapper[4904]: I1205 21:53:56.804272 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x699t"] Dec 05 21:53:56 crc kubenswrapper[4904]: I1205 21:53:56.853782 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bd4d102-dd63-4cbb-b375-ce798e3c87a1-catalog-content\") pod \"redhat-marketplace-x699t\" (UID: \"6bd4d102-dd63-4cbb-b375-ce798e3c87a1\") " pod="openshift-marketplace/redhat-marketplace-x699t" Dec 05 21:53:56 crc kubenswrapper[4904]: I1205 21:53:56.853865 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55cdd\" (UniqueName: \"kubernetes.io/projected/6bd4d102-dd63-4cbb-b375-ce798e3c87a1-kube-api-access-55cdd\") pod \"redhat-marketplace-x699t\" (UID: \"6bd4d102-dd63-4cbb-b375-ce798e3c87a1\") " pod="openshift-marketplace/redhat-marketplace-x699t" Dec 05 21:53:56 crc kubenswrapper[4904]: I1205 21:53:56.854001 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bd4d102-dd63-4cbb-b375-ce798e3c87a1-utilities\") pod \"redhat-marketplace-x699t\" (UID: \"6bd4d102-dd63-4cbb-b375-ce798e3c87a1\") " pod="openshift-marketplace/redhat-marketplace-x699t" Dec 05 21:53:56 crc kubenswrapper[4904]: I1205 21:53:56.955826 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bd4d102-dd63-4cbb-b375-ce798e3c87a1-catalog-content\") pod \"redhat-marketplace-x699t\" (UID: \"6bd4d102-dd63-4cbb-b375-ce798e3c87a1\") " pod="openshift-marketplace/redhat-marketplace-x699t" Dec 05 21:53:56 crc kubenswrapper[4904]: I1205 21:53:56.955913 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55cdd\" (UniqueName: \"kubernetes.io/projected/6bd4d102-dd63-4cbb-b375-ce798e3c87a1-kube-api-access-55cdd\") pod \"redhat-marketplace-x699t\" (UID: \"6bd4d102-dd63-4cbb-b375-ce798e3c87a1\") " pod="openshift-marketplace/redhat-marketplace-x699t" Dec 05 21:53:56 crc kubenswrapper[4904]: I1205 21:53:56.955950 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bd4d102-dd63-4cbb-b375-ce798e3c87a1-utilities\") pod \"redhat-marketplace-x699t\" (UID: \"6bd4d102-dd63-4cbb-b375-ce798e3c87a1\") " pod="openshift-marketplace/redhat-marketplace-x699t" Dec 05 21:53:56 crc kubenswrapper[4904]: I1205 21:53:56.956368 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bd4d102-dd63-4cbb-b375-ce798e3c87a1-catalog-content\") pod \"redhat-marketplace-x699t\" (UID: \"6bd4d102-dd63-4cbb-b375-ce798e3c87a1\") " pod="openshift-marketplace/redhat-marketplace-x699t" Dec 05 21:53:56 crc kubenswrapper[4904]: I1205 21:53:56.956443 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bd4d102-dd63-4cbb-b375-ce798e3c87a1-utilities\") pod \"redhat-marketplace-x699t\" (UID: \"6bd4d102-dd63-4cbb-b375-ce798e3c87a1\") " pod="openshift-marketplace/redhat-marketplace-x699t" Dec 05 21:53:56 crc kubenswrapper[4904]: I1205 21:53:56.975965 4904 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-55cdd\" (UniqueName: \"kubernetes.io/projected/6bd4d102-dd63-4cbb-b375-ce798e3c87a1-kube-api-access-55cdd\") pod \"redhat-marketplace-x699t\" (UID: \"6bd4d102-dd63-4cbb-b375-ce798e3c87a1\") " pod="openshift-marketplace/redhat-marketplace-x699t" Dec 05 21:53:57 crc kubenswrapper[4904]: I1205 21:53:57.115986 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x699t" Dec 05 21:53:57 crc kubenswrapper[4904]: I1205 21:53:57.645563 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x699t"] Dec 05 21:53:57 crc kubenswrapper[4904]: I1205 21:53:57.921141 4904 generic.go:334] "Generic (PLEG): container finished" podID="6bd4d102-dd63-4cbb-b375-ce798e3c87a1" containerID="2df669cbad8de8138f36a033a71e2f8375edfacd3bbe2460f1b56785a5d8669d" exitCode=0 Dec 05 21:53:57 crc kubenswrapper[4904]: I1205 21:53:57.921189 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x699t" event={"ID":"6bd4d102-dd63-4cbb-b375-ce798e3c87a1","Type":"ContainerDied","Data":"2df669cbad8de8138f36a033a71e2f8375edfacd3bbe2460f1b56785a5d8669d"} Dec 05 21:53:57 crc kubenswrapper[4904]: I1205 21:53:57.921502 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x699t" event={"ID":"6bd4d102-dd63-4cbb-b375-ce798e3c87a1","Type":"ContainerStarted","Data":"f0c077d317e2bf0268661283a1df4587aa1808b9dc1177b7fadba498b485fe6b"} Dec 05 21:53:58 crc kubenswrapper[4904]: I1205 21:53:58.930936 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x699t" event={"ID":"6bd4d102-dd63-4cbb-b375-ce798e3c87a1","Type":"ContainerStarted","Data":"9428f50cc8a96f9face0746673fc4f30a571467edad52b5328d3734df2681456"} Dec 05 21:53:59 crc kubenswrapper[4904]: I1205 21:53:59.944611 4904 generic.go:334] "Generic (PLEG): container finished" podID="6bd4d102-dd63-4cbb-b375-ce798e3c87a1" containerID="9428f50cc8a96f9face0746673fc4f30a571467edad52b5328d3734df2681456" exitCode=0 Dec 05 21:53:59 crc kubenswrapper[4904]: I1205 21:53:59.944699 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x699t" event={"ID":"6bd4d102-dd63-4cbb-b375-ce798e3c87a1","Type":"ContainerDied","Data":"9428f50cc8a96f9face0746673fc4f30a571467edad52b5328d3734df2681456"} Dec 05 21:54:01 crc kubenswrapper[4904]: I1205 21:54:01.969031 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x699t" event={"ID":"6bd4d102-dd63-4cbb-b375-ce798e3c87a1","Type":"ContainerStarted","Data":"536d4cfa405461c18a43f9b086652334450ebfcc605a1a4f65ddf474fdf22bae"} Dec 05 21:54:01 crc kubenswrapper[4904]: I1205 21:54:01.996273 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x699t" podStartSLOduration=2.553155398 podStartE2EDuration="5.99623171s" podCreationTimestamp="2025-12-05 21:53:56 +0000 UTC" firstStartedPulling="2025-12-05 21:53:57.922808478 +0000 UTC m=+6136.734024587" lastFinishedPulling="2025-12-05 21:54:01.36588474 +0000 UTC m=+6140.177100899" observedRunningTime="2025-12-05 21:54:01.993399613 +0000 UTC m=+6140.804615742" watchObservedRunningTime="2025-12-05 21:54:01.99623171 +0000 UTC m=+6140.807447829" Dec 05 21:54:07 crc kubenswrapper[4904]: I1205 21:54:07.116152 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-x699t" Dec 05 21:54:07 crc kubenswrapper[4904]: I1205 21:54:07.116535 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-x699t" Dec 05 21:54:07 crc kubenswrapper[4904]: I1205 21:54:07.182488 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-x699t" Dec 05 21:54:08 crc kubenswrapper[4904]: I1205 21:54:08.104675 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-x699t" Dec 05 21:54:08 crc kubenswrapper[4904]: I1205 21:54:08.682719 4904 scope.go:117] "RemoveContainer" containerID="afe55cb29b17c79ab9c5d0540b6c9a9a3a478a257da4a8d2c668853c76a15473" Dec 05 21:54:08 crc kubenswrapper[4904]: E1205 21:54:08.682997 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:54:10 crc kubenswrapper[4904]: I1205 21:54:10.324725 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x699t"] Dec 05 21:54:10 crc kubenswrapper[4904]: I1205 21:54:10.325210 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x699t" podUID="6bd4d102-dd63-4cbb-b375-ce798e3c87a1" containerName="registry-server" containerID="cri-o://536d4cfa405461c18a43f9b086652334450ebfcc605a1a4f65ddf474fdf22bae" gracePeriod=2 Dec 05 21:54:10 crc kubenswrapper[4904]: I1205 21:54:10.829569 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x699t" Dec 05 21:54:10 crc kubenswrapper[4904]: I1205 21:54:10.998144 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55cdd\" (UniqueName: \"kubernetes.io/projected/6bd4d102-dd63-4cbb-b375-ce798e3c87a1-kube-api-access-55cdd\") pod \"6bd4d102-dd63-4cbb-b375-ce798e3c87a1\" (UID: \"6bd4d102-dd63-4cbb-b375-ce798e3c87a1\") " Dec 05 21:54:10 crc kubenswrapper[4904]: I1205 21:54:10.998277 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bd4d102-dd63-4cbb-b375-ce798e3c87a1-utilities\") pod \"6bd4d102-dd63-4cbb-b375-ce798e3c87a1\" (UID: \"6bd4d102-dd63-4cbb-b375-ce798e3c87a1\") " Dec 05 21:54:10 crc kubenswrapper[4904]: I1205 21:54:10.998316 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bd4d102-dd63-4cbb-b375-ce798e3c87a1-catalog-content\") pod \"6bd4d102-dd63-4cbb-b375-ce798e3c87a1\" (UID: \"6bd4d102-dd63-4cbb-b375-ce798e3c87a1\") " Dec 05 21:54:10 crc kubenswrapper[4904]: I1205 21:54:10.999213 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bd4d102-dd63-4cbb-b375-ce798e3c87a1-utilities" (OuterVolumeSpecName: "utilities") pod "6bd4d102-dd63-4cbb-b375-ce798e3c87a1" (UID: "6bd4d102-dd63-4cbb-b375-ce798e3c87a1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:54:11 crc kubenswrapper[4904]: I1205 21:54:11.004074 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bd4d102-dd63-4cbb-b375-ce798e3c87a1-kube-api-access-55cdd" (OuterVolumeSpecName: "kube-api-access-55cdd") pod "6bd4d102-dd63-4cbb-b375-ce798e3c87a1" (UID: "6bd4d102-dd63-4cbb-b375-ce798e3c87a1"). InnerVolumeSpecName "kube-api-access-55cdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:54:11 crc kubenswrapper[4904]: I1205 21:54:11.028517 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bd4d102-dd63-4cbb-b375-ce798e3c87a1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6bd4d102-dd63-4cbb-b375-ce798e3c87a1" (UID: "6bd4d102-dd63-4cbb-b375-ce798e3c87a1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:54:11 crc kubenswrapper[4904]: I1205 21:54:11.064081 4904 generic.go:334] "Generic (PLEG): container finished" podID="6bd4d102-dd63-4cbb-b375-ce798e3c87a1" containerID="536d4cfa405461c18a43f9b086652334450ebfcc605a1a4f65ddf474fdf22bae" exitCode=0 Dec 05 21:54:11 crc kubenswrapper[4904]: I1205 21:54:11.064137 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x699t" event={"ID":"6bd4d102-dd63-4cbb-b375-ce798e3c87a1","Type":"ContainerDied","Data":"536d4cfa405461c18a43f9b086652334450ebfcc605a1a4f65ddf474fdf22bae"} Dec 05 21:54:11 crc kubenswrapper[4904]: I1205 21:54:11.064177 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x699t" event={"ID":"6bd4d102-dd63-4cbb-b375-ce798e3c87a1","Type":"ContainerDied","Data":"f0c077d317e2bf0268661283a1df4587aa1808b9dc1177b7fadba498b485fe6b"} Dec 05 21:54:11 crc kubenswrapper[4904]: I1205 21:54:11.064199 4904 scope.go:117] "RemoveContainer" containerID="536d4cfa405461c18a43f9b086652334450ebfcc605a1a4f65ddf474fdf22bae" Dec 05 21:54:11 crc kubenswrapper[4904]: I1205 21:54:11.064201 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x699t" Dec 05 21:54:11 crc kubenswrapper[4904]: I1205 21:54:11.081745 4904 scope.go:117] "RemoveContainer" containerID="9428f50cc8a96f9face0746673fc4f30a571467edad52b5328d3734df2681456" Dec 05 21:54:11 crc kubenswrapper[4904]: I1205 21:54:11.100741 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55cdd\" (UniqueName: \"kubernetes.io/projected/6bd4d102-dd63-4cbb-b375-ce798e3c87a1-kube-api-access-55cdd\") on node \"crc\" DevicePath \"\"" Dec 05 21:54:11 crc kubenswrapper[4904]: I1205 21:54:11.100782 4904 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bd4d102-dd63-4cbb-b375-ce798e3c87a1-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 21:54:11 crc kubenswrapper[4904]: I1205 21:54:11.100792 4904 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bd4d102-dd63-4cbb-b375-ce798e3c87a1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 21:54:11 crc kubenswrapper[4904]: I1205 21:54:11.102232 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x699t"] Dec 05 21:54:11 crc kubenswrapper[4904]: I1205 21:54:11.111975 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x699t"] Dec 05 21:54:11 crc kubenswrapper[4904]: I1205 21:54:11.123164 4904 scope.go:117] "RemoveContainer" containerID="2df669cbad8de8138f36a033a71e2f8375edfacd3bbe2460f1b56785a5d8669d" Dec 05 21:54:11 crc kubenswrapper[4904]: I1205 21:54:11.151535 4904 scope.go:117] "RemoveContainer" containerID="536d4cfa405461c18a43f9b086652334450ebfcc605a1a4f65ddf474fdf22bae" Dec 05 21:54:11 crc kubenswrapper[4904]: E1205 21:54:11.151980 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"536d4cfa405461c18a43f9b086652334450ebfcc605a1a4f65ddf474fdf22bae\": container with ID starting with 536d4cfa405461c18a43f9b086652334450ebfcc605a1a4f65ddf474fdf22bae not found: ID does not exist" containerID="536d4cfa405461c18a43f9b086652334450ebfcc605a1a4f65ddf474fdf22bae" Dec 05 21:54:11 crc kubenswrapper[4904]: I1205 21:54:11.152026 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"536d4cfa405461c18a43f9b086652334450ebfcc605a1a4f65ddf474fdf22bae"} err="failed to get container status \"536d4cfa405461c18a43f9b086652334450ebfcc605a1a4f65ddf474fdf22bae\": rpc error: code = NotFound desc = could not find container \"536d4cfa405461c18a43f9b086652334450ebfcc605a1a4f65ddf474fdf22bae\": container with ID starting with 536d4cfa405461c18a43f9b086652334450ebfcc605a1a4f65ddf474fdf22bae not found: ID does not exist" Dec 05 21:54:11 crc kubenswrapper[4904]: I1205 21:54:11.152070 4904 scope.go:117] "RemoveContainer" containerID="9428f50cc8a96f9face0746673fc4f30a571467edad52b5328d3734df2681456" Dec 05 21:54:11 crc kubenswrapper[4904]: E1205 21:54:11.152698 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9428f50cc8a96f9face0746673fc4f30a571467edad52b5328d3734df2681456\": container with ID starting with 9428f50cc8a96f9face0746673fc4f30a571467edad52b5328d3734df2681456 not found: ID does not exist" containerID="9428f50cc8a96f9face0746673fc4f30a571467edad52b5328d3734df2681456" Dec 05 21:54:11 crc kubenswrapper[4904]: I1205 21:54:11.152736 4904 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9428f50cc8a96f9face0746673fc4f30a571467edad52b5328d3734df2681456"} err="failed to get container status \"9428f50cc8a96f9face0746673fc4f30a571467edad52b5328d3734df2681456\": rpc error: code = NotFound desc = could not find container \"9428f50cc8a96f9face0746673fc4f30a571467edad52b5328d3734df2681456\": container with ID starting with 9428f50cc8a96f9face0746673fc4f30a571467edad52b5328d3734df2681456 not found: ID does not exist" Dec 05 21:54:11 crc kubenswrapper[4904]: I1205 21:54:11.152762 4904 scope.go:117] "RemoveContainer" containerID="2df669cbad8de8138f36a033a71e2f8375edfacd3bbe2460f1b56785a5d8669d" Dec 05 21:54:11 crc kubenswrapper[4904]: E1205 21:54:11.153037 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2df669cbad8de8138f36a033a71e2f8375edfacd3bbe2460f1b56785a5d8669d\": container with ID starting with 2df669cbad8de8138f36a033a71e2f8375edfacd3bbe2460f1b56785a5d8669d not found: ID does not exist" containerID="2df669cbad8de8138f36a033a71e2f8375edfacd3bbe2460f1b56785a5d8669d" Dec 05 21:54:11 crc kubenswrapper[4904]: I1205 21:54:11.153086 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2df669cbad8de8138f36a033a71e2f8375edfacd3bbe2460f1b56785a5d8669d"} err="failed to get container status \"2df669cbad8de8138f36a033a71e2f8375edfacd3bbe2460f1b56785a5d8669d\": rpc error: code = NotFound desc = could not find container \"2df669cbad8de8138f36a033a71e2f8375edfacd3bbe2460f1b56785a5d8669d\": container with ID starting with 2df669cbad8de8138f36a033a71e2f8375edfacd3bbe2460f1b56785a5d8669d not found: ID does not exist" Dec 05 21:54:11 crc kubenswrapper[4904]: I1205 21:54:11.694270 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bd4d102-dd63-4cbb-b375-ce798e3c87a1" path="/var/lib/kubelet/pods/6bd4d102-dd63-4cbb-b375-ce798e3c87a1/volumes" Dec 05 21:54:23 crc kubenswrapper[4904]: I1205 21:54:23.682566 4904 scope.go:117] "RemoveContainer" containerID="afe55cb29b17c79ab9c5d0540b6c9a9a3a478a257da4a8d2c668853c76a15473" Dec 05 21:54:23 crc kubenswrapper[4904]: E1205 21:54:23.683716 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:54:25 crc kubenswrapper[4904]: I1205 21:54:25.783010 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-72sr6/must-gather-c94bm"] Dec 05 21:54:25 crc kubenswrapper[4904]: E1205 21:54:25.783628 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bd4d102-dd63-4cbb-b375-ce798e3c87a1" containerName="extract-utilities" Dec 05 21:54:25 crc kubenswrapper[4904]: I1205 21:54:25.783640 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bd4d102-dd63-4cbb-b375-ce798e3c87a1" containerName="extract-utilities" Dec 05 21:54:25 crc kubenswrapper[4904]: E1205 21:54:25.783658 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bd4d102-dd63-4cbb-b375-ce798e3c87a1" containerName="extract-content" Dec 05 21:54:25 crc kubenswrapper[4904]: I1205 21:54:25.783664 4904 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="6bd4d102-dd63-4cbb-b375-ce798e3c87a1" containerName="extract-content" Dec 05 21:54:25 crc kubenswrapper[4904]: E1205 21:54:25.783691 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bd4d102-dd63-4cbb-b375-ce798e3c87a1" containerName="registry-server" Dec 05 21:54:25 crc kubenswrapper[4904]: I1205 21:54:25.783701 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bd4d102-dd63-4cbb-b375-ce798e3c87a1" containerName="registry-server" Dec 05 21:54:25 crc kubenswrapper[4904]: I1205 21:54:25.783888 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bd4d102-dd63-4cbb-b375-ce798e3c87a1" containerName="registry-server" Dec 05 21:54:25 crc kubenswrapper[4904]: I1205 21:54:25.784949 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-72sr6/must-gather-c94bm" Dec 05 21:54:25 crc kubenswrapper[4904]: I1205 21:54:25.787518 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-72sr6"/"kube-root-ca.crt" Dec 05 21:54:25 crc kubenswrapper[4904]: I1205 21:54:25.787764 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-72sr6"/"openshift-service-ca.crt" Dec 05 21:54:25 crc kubenswrapper[4904]: I1205 21:54:25.802425 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/90f2514c-edb1-47b0-8f0f-49952ab8686f-must-gather-output\") pod \"must-gather-c94bm\" (UID: \"90f2514c-edb1-47b0-8f0f-49952ab8686f\") " pod="openshift-must-gather-72sr6/must-gather-c94bm" Dec 05 21:54:25 crc kubenswrapper[4904]: I1205 21:54:25.853872 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-72sr6/must-gather-c94bm"] Dec 05 21:54:25 crc kubenswrapper[4904]: I1205 21:54:25.904765 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5q98\" (UniqueName: \"kubernetes.io/projected/90f2514c-edb1-47b0-8f0f-49952ab8686f-kube-api-access-q5q98\") pod \"must-gather-c94bm\" (UID: \"90f2514c-edb1-47b0-8f0f-49952ab8686f\") " pod="openshift-must-gather-72sr6/must-gather-c94bm" Dec 05 21:54:25 crc kubenswrapper[4904]: I1205 21:54:25.904943 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/90f2514c-edb1-47b0-8f0f-49952ab8686f-must-gather-output\") pod \"must-gather-c94bm\" (UID: \"90f2514c-edb1-47b0-8f0f-49952ab8686f\") " pod="openshift-must-gather-72sr6/must-gather-c94bm" Dec 05 21:54:25 crc kubenswrapper[4904]: I1205 21:54:25.905647 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/90f2514c-edb1-47b0-8f0f-49952ab8686f-must-gather-output\") pod \"must-gather-c94bm\" (UID: \"90f2514c-edb1-47b0-8f0f-49952ab8686f\") " pod="openshift-must-gather-72sr6/must-gather-c94bm" Dec 05 21:54:26 crc kubenswrapper[4904]: I1205 21:54:26.006925 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5q98\" (UniqueName: \"kubernetes.io/projected/90f2514c-edb1-47b0-8f0f-49952ab8686f-kube-api-access-q5q98\") pod \"must-gather-c94bm\" (UID: \"90f2514c-edb1-47b0-8f0f-49952ab8686f\") " pod="openshift-must-gather-72sr6/must-gather-c94bm" Dec 05 21:54:26 crc kubenswrapper[4904]: I1205 21:54:26.025821 4904 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-q5q98\" (UniqueName: \"kubernetes.io/projected/90f2514c-edb1-47b0-8f0f-49952ab8686f-kube-api-access-q5q98\") pod \"must-gather-c94bm\" (UID: \"90f2514c-edb1-47b0-8f0f-49952ab8686f\") " pod="openshift-must-gather-72sr6/must-gather-c94bm" Dec 05 21:54:26 crc kubenswrapper[4904]: I1205 21:54:26.119034 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-72sr6/must-gather-c94bm" Dec 05 21:54:26 crc kubenswrapper[4904]: I1205 21:54:26.590882 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-72sr6/must-gather-c94bm"] Dec 05 21:54:27 crc kubenswrapper[4904]: I1205 21:54:27.267903 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-72sr6/must-gather-c94bm" event={"ID":"90f2514c-edb1-47b0-8f0f-49952ab8686f","Type":"ContainerStarted","Data":"6d7af8b1bdaa890853ab86e00d49e071b0f5aa45ba12f3e19d70dfa692a55f12"} Dec 05 21:54:27 crc kubenswrapper[4904]: I1205 21:54:27.268280 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-72sr6/must-gather-c94bm" event={"ID":"90f2514c-edb1-47b0-8f0f-49952ab8686f","Type":"ContainerStarted","Data":"d3d29141d5c3a0b983e659808c69d1ed6bd099050aa09676e18fe95846f2436a"} Dec 05 21:54:27 crc kubenswrapper[4904]: I1205 21:54:27.268299 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-72sr6/must-gather-c94bm" event={"ID":"90f2514c-edb1-47b0-8f0f-49952ab8686f","Type":"ContainerStarted","Data":"25bf2949431dd03b30c0499784be0874f85799ecd3404139415dc0da9f60b1b5"} Dec 05 21:54:27 crc kubenswrapper[4904]: I1205 21:54:27.308403 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-72sr6/must-gather-c94bm" podStartSLOduration=2.308375617 podStartE2EDuration="2.308375617s" podCreationTimestamp="2025-12-05 21:54:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:54:27.301073958 +0000 UTC m=+6166.112290077" watchObservedRunningTime="2025-12-05 21:54:27.308375617 +0000 UTC m=+6166.119591736" Dec 05 21:54:30 crc kubenswrapper[4904]: I1205 21:54:30.601029 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-72sr6/crc-debug-gr7f6"] Dec 05 21:54:30 crc kubenswrapper[4904]: I1205 21:54:30.603332 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-72sr6/crc-debug-gr7f6" Dec 05 21:54:30 crc kubenswrapper[4904]: I1205 21:54:30.606030 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-72sr6"/"default-dockercfg-6wvgf" Dec 05 21:54:30 crc kubenswrapper[4904]: I1205 21:54:30.720036 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwc9v\" (UniqueName: \"kubernetes.io/projected/e8d2e168-1a98-4a7d-8fe2-35ccc338658f-kube-api-access-nwc9v\") pod \"crc-debug-gr7f6\" (UID: \"e8d2e168-1a98-4a7d-8fe2-35ccc338658f\") " pod="openshift-must-gather-72sr6/crc-debug-gr7f6" Dec 05 21:54:30 crc kubenswrapper[4904]: I1205 21:54:30.720289 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e8d2e168-1a98-4a7d-8fe2-35ccc338658f-host\") pod \"crc-debug-gr7f6\" (UID: \"e8d2e168-1a98-4a7d-8fe2-35ccc338658f\") " pod="openshift-must-gather-72sr6/crc-debug-gr7f6" Dec 05 21:54:30 crc kubenswrapper[4904]: I1205 21:54:30.822784 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e8d2e168-1a98-4a7d-8fe2-35ccc338658f-host\") pod \"crc-debug-gr7f6\" (UID: \"e8d2e168-1a98-4a7d-8fe2-35ccc338658f\") " pod="openshift-must-gather-72sr6/crc-debug-gr7f6" Dec 05 21:54:30 crc kubenswrapper[4904]: I1205 21:54:30.822909 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e8d2e168-1a98-4a7d-8fe2-35ccc338658f-host\") pod \"crc-debug-gr7f6\" (UID: \"e8d2e168-1a98-4a7d-8fe2-35ccc338658f\") " pod="openshift-must-gather-72sr6/crc-debug-gr7f6" Dec 05 21:54:30 crc kubenswrapper[4904]: I1205 21:54:30.823188 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwc9v\" (UniqueName: \"kubernetes.io/projected/e8d2e168-1a98-4a7d-8fe2-35ccc338658f-kube-api-access-nwc9v\") pod \"crc-debug-gr7f6\" (UID: \"e8d2e168-1a98-4a7d-8fe2-35ccc338658f\") " pod="openshift-must-gather-72sr6/crc-debug-gr7f6" Dec 05 21:54:30 crc kubenswrapper[4904]: I1205 21:54:30.856322 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwc9v\" (UniqueName: \"kubernetes.io/projected/e8d2e168-1a98-4a7d-8fe2-35ccc338658f-kube-api-access-nwc9v\") pod \"crc-debug-gr7f6\" (UID: \"e8d2e168-1a98-4a7d-8fe2-35ccc338658f\") " pod="openshift-must-gather-72sr6/crc-debug-gr7f6" Dec 05 21:54:30 crc kubenswrapper[4904]: I1205 21:54:30.930510 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-72sr6/crc-debug-gr7f6" Dec 05 21:54:30 crc kubenswrapper[4904]: W1205 21:54:30.965469 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8d2e168_1a98_4a7d_8fe2_35ccc338658f.slice/crio-10a446e66a37156faaa3b96aa81bcb1d102fc5aa5be76da4f50877d336b0f641 WatchSource:0}: Error finding container 10a446e66a37156faaa3b96aa81bcb1d102fc5aa5be76da4f50877d336b0f641: Status 404 returned error can't find the container with id 10a446e66a37156faaa3b96aa81bcb1d102fc5aa5be76da4f50877d336b0f641 Dec 05 21:54:31 crc kubenswrapper[4904]: I1205 21:54:31.307411 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-72sr6/crc-debug-gr7f6" event={"ID":"e8d2e168-1a98-4a7d-8fe2-35ccc338658f","Type":"ContainerStarted","Data":"b9a2b59e8e32bcc4c3a736e4b5fa7e6fb273fa27fa6e572d776a1334f9a07847"} Dec 05 21:54:31 crc kubenswrapper[4904]: I1205 21:54:31.307678 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-72sr6/crc-debug-gr7f6" event={"ID":"e8d2e168-1a98-4a7d-8fe2-35ccc338658f","Type":"ContainerStarted","Data":"10a446e66a37156faaa3b96aa81bcb1d102fc5aa5be76da4f50877d336b0f641"} Dec 05 21:54:31 crc kubenswrapper[4904]: I1205 21:54:31.327471 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-72sr6/crc-debug-gr7f6" podStartSLOduration=1.3274489 podStartE2EDuration="1.3274489s" podCreationTimestamp="2025-12-05 21:54:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:54:31.318920378 +0000 UTC m=+6170.130136507" watchObservedRunningTime="2025-12-05 21:54:31.3274489 +0000 UTC m=+6170.138665019" Dec 05 21:54:37 crc kubenswrapper[4904]: I1205 21:54:37.681356 4904 scope.go:117] "RemoveContainer" containerID="afe55cb29b17c79ab9c5d0540b6c9a9a3a478a257da4a8d2c668853c76a15473" Dec 05 21:54:37 crc kubenswrapper[4904]: E1205 21:54:37.681951 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:54:50 crc kubenswrapper[4904]: I1205 21:54:50.682166 4904 scope.go:117] "RemoveContainer" containerID="afe55cb29b17c79ab9c5d0540b6c9a9a3a478a257da4a8d2c668853c76a15473" Dec 05 21:54:50 crc kubenswrapper[4904]: E1205 21:54:50.682874 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:55:03 crc kubenswrapper[4904]: I1205 21:55:03.682143 4904 scope.go:117] "RemoveContainer" containerID="afe55cb29b17c79ab9c5d0540b6c9a9a3a478a257da4a8d2c668853c76a15473" Dec 05 21:55:03 crc kubenswrapper[4904]: E1205 21:55:03.682968 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:55:08 crc kubenswrapper[4904]: I1205 21:55:08.662333 4904 generic.go:334] "Generic (PLEG): container finished" podID="e8d2e168-1a98-4a7d-8fe2-35ccc338658f" containerID="b9a2b59e8e32bcc4c3a736e4b5fa7e6fb273fa27fa6e572d776a1334f9a07847" exitCode=0 Dec 05 21:55:08 crc kubenswrapper[4904]: I1205 21:55:08.662407 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-72sr6/crc-debug-gr7f6" event={"ID":"e8d2e168-1a98-4a7d-8fe2-35ccc338658f","Type":"ContainerDied","Data":"b9a2b59e8e32bcc4c3a736e4b5fa7e6fb273fa27fa6e572d776a1334f9a07847"} Dec 05 21:55:09 crc kubenswrapper[4904]: I1205 21:55:09.814250 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-72sr6/crc-debug-gr7f6" Dec 05 21:55:09 crc kubenswrapper[4904]: I1205 21:55:09.852624 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-72sr6/crc-debug-gr7f6"] Dec 05 21:55:09 crc kubenswrapper[4904]: I1205 21:55:09.863033 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-72sr6/crc-debug-gr7f6"] Dec 05 21:55:09 crc kubenswrapper[4904]: I1205 21:55:09.984151 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e8d2e168-1a98-4a7d-8fe2-35ccc338658f-host\") pod \"e8d2e168-1a98-4a7d-8fe2-35ccc338658f\" (UID: \"e8d2e168-1a98-4a7d-8fe2-35ccc338658f\") " Dec 05 21:55:09 crc kubenswrapper[4904]: I1205 21:55:09.984267 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8d2e168-1a98-4a7d-8fe2-35ccc338658f-host" (OuterVolumeSpecName: "host") pod "e8d2e168-1a98-4a7d-8fe2-35ccc338658f" (UID: "e8d2e168-1a98-4a7d-8fe2-35ccc338658f"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 21:55:09 crc kubenswrapper[4904]: I1205 21:55:09.984511 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwc9v\" (UniqueName: \"kubernetes.io/projected/e8d2e168-1a98-4a7d-8fe2-35ccc338658f-kube-api-access-nwc9v\") pod \"e8d2e168-1a98-4a7d-8fe2-35ccc338658f\" (UID: \"e8d2e168-1a98-4a7d-8fe2-35ccc338658f\") " Dec 05 21:55:09 crc kubenswrapper[4904]: I1205 21:55:09.985661 4904 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e8d2e168-1a98-4a7d-8fe2-35ccc338658f-host\") on node \"crc\" DevicePath \"\"" Dec 05 21:55:09 crc kubenswrapper[4904]: I1205 21:55:09.989556 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8d2e168-1a98-4a7d-8fe2-35ccc338658f-kube-api-access-nwc9v" (OuterVolumeSpecName: "kube-api-access-nwc9v") pod "e8d2e168-1a98-4a7d-8fe2-35ccc338658f" (UID: "e8d2e168-1a98-4a7d-8fe2-35ccc338658f"). InnerVolumeSpecName "kube-api-access-nwc9v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:55:10 crc kubenswrapper[4904]: I1205 21:55:10.087593 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwc9v\" (UniqueName: \"kubernetes.io/projected/e8d2e168-1a98-4a7d-8fe2-35ccc338658f-kube-api-access-nwc9v\") on node \"crc\" DevicePath \"\"" Dec 05 21:55:10 crc kubenswrapper[4904]: I1205 21:55:10.687008 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10a446e66a37156faaa3b96aa81bcb1d102fc5aa5be76da4f50877d336b0f641" Dec 05 21:55:10 crc kubenswrapper[4904]: I1205 21:55:10.687125 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-72sr6/crc-debug-gr7f6" Dec 05 21:55:11 crc kubenswrapper[4904]: I1205 21:55:11.112985 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-72sr6/crc-debug-c9fl9"] Dec 05 21:55:11 crc kubenswrapper[4904]: E1205 21:55:11.113409 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8d2e168-1a98-4a7d-8fe2-35ccc338658f" containerName="container-00" Dec 05 21:55:11 crc kubenswrapper[4904]: I1205 21:55:11.113424 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8d2e168-1a98-4a7d-8fe2-35ccc338658f" containerName="container-00" Dec 05 21:55:11 crc kubenswrapper[4904]: I1205 21:55:11.113659 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8d2e168-1a98-4a7d-8fe2-35ccc338658f" containerName="container-00" Dec 05 21:55:11 crc kubenswrapper[4904]: I1205 21:55:11.114404 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-72sr6/crc-debug-c9fl9" Dec 05 21:55:11 crc kubenswrapper[4904]: I1205 21:55:11.116488 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-72sr6"/"default-dockercfg-6wvgf" Dec 05 21:55:11 crc kubenswrapper[4904]: I1205 21:55:11.212307 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/412c87f0-ca77-4ee2-bb68-efff9bb6c749-host\") pod \"crc-debug-c9fl9\" (UID: \"412c87f0-ca77-4ee2-bb68-efff9bb6c749\") " pod="openshift-must-gather-72sr6/crc-debug-c9fl9" Dec 05 21:55:11 crc kubenswrapper[4904]: I1205 21:55:11.212754 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpjdh\" (UniqueName: \"kubernetes.io/projected/412c87f0-ca77-4ee2-bb68-efff9bb6c749-kube-api-access-gpjdh\") pod \"crc-debug-c9fl9\" (UID: \"412c87f0-ca77-4ee2-bb68-efff9bb6c749\") " pod="openshift-must-gather-72sr6/crc-debug-c9fl9" Dec 05 21:55:11 crc kubenswrapper[4904]: I1205 21:55:11.314493 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpjdh\" (UniqueName: \"kubernetes.io/projected/412c87f0-ca77-4ee2-bb68-efff9bb6c749-kube-api-access-gpjdh\") pod \"crc-debug-c9fl9\" (UID: \"412c87f0-ca77-4ee2-bb68-efff9bb6c749\") " pod="openshift-must-gather-72sr6/crc-debug-c9fl9" Dec 05 21:55:11 crc kubenswrapper[4904]: I1205 21:55:11.314622 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/412c87f0-ca77-4ee2-bb68-efff9bb6c749-host\") pod \"crc-debug-c9fl9\" (UID: \"412c87f0-ca77-4ee2-bb68-efff9bb6c749\") " pod="openshift-must-gather-72sr6/crc-debug-c9fl9" Dec 05 21:55:11 crc kubenswrapper[4904]: I1205 21:55:11.314713 4904 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/412c87f0-ca77-4ee2-bb68-efff9bb6c749-host\") pod \"crc-debug-c9fl9\" (UID: \"412c87f0-ca77-4ee2-bb68-efff9bb6c749\") " pod="openshift-must-gather-72sr6/crc-debug-c9fl9" Dec 05 21:55:11 crc kubenswrapper[4904]: I1205 21:55:11.334822 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpjdh\" (UniqueName: \"kubernetes.io/projected/412c87f0-ca77-4ee2-bb68-efff9bb6c749-kube-api-access-gpjdh\") pod \"crc-debug-c9fl9\" (UID: \"412c87f0-ca77-4ee2-bb68-efff9bb6c749\") " pod="openshift-must-gather-72sr6/crc-debug-c9fl9" Dec 05 21:55:11 crc kubenswrapper[4904]: I1205 21:55:11.438181 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-72sr6/crc-debug-c9fl9" Dec 05 21:55:11 crc kubenswrapper[4904]: W1205 21:55:11.481468 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod412c87f0_ca77_4ee2_bb68_efff9bb6c749.slice/crio-1c7c0460bb1460924e62020a9b8599a54549d1e84ec1361bd10dc6ece58fc27f WatchSource:0}: Error finding container 1c7c0460bb1460924e62020a9b8599a54549d1e84ec1361bd10dc6ece58fc27f: Status 404 returned error can't find the container with id 1c7c0460bb1460924e62020a9b8599a54549d1e84ec1361bd10dc6ece58fc27f Dec 05 21:55:11 crc kubenswrapper[4904]: I1205 21:55:11.697969 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8d2e168-1a98-4a7d-8fe2-35ccc338658f" path="/var/lib/kubelet/pods/e8d2e168-1a98-4a7d-8fe2-35ccc338658f/volumes" Dec 05 21:55:11 crc kubenswrapper[4904]: I1205 21:55:11.701538 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-72sr6/crc-debug-c9fl9" event={"ID":"412c87f0-ca77-4ee2-bb68-efff9bb6c749","Type":"ContainerStarted","Data":"1c7c0460bb1460924e62020a9b8599a54549d1e84ec1361bd10dc6ece58fc27f"} Dec 05 21:55:12 crc kubenswrapper[4904]: I1205 21:55:12.712698 4904 generic.go:334] "Generic (PLEG): container finished" podID="412c87f0-ca77-4ee2-bb68-efff9bb6c749" containerID="fd0bba27d1d1f13db9028dc5e3a91c66e0343695c595354b5627132bdfed5f9b" exitCode=0 Dec 05 21:55:12 crc kubenswrapper[4904]: I1205 21:55:12.712743 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-72sr6/crc-debug-c9fl9" event={"ID":"412c87f0-ca77-4ee2-bb68-efff9bb6c749","Type":"ContainerDied","Data":"fd0bba27d1d1f13db9028dc5e3a91c66e0343695c595354b5627132bdfed5f9b"} Dec 05 21:55:13 crc kubenswrapper[4904]: I1205 21:55:13.834018 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-72sr6/crc-debug-c9fl9" Dec 05 21:55:13 crc kubenswrapper[4904]: I1205 21:55:13.874210 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpjdh\" (UniqueName: \"kubernetes.io/projected/412c87f0-ca77-4ee2-bb68-efff9bb6c749-kube-api-access-gpjdh\") pod \"412c87f0-ca77-4ee2-bb68-efff9bb6c749\" (UID: \"412c87f0-ca77-4ee2-bb68-efff9bb6c749\") " Dec 05 21:55:13 crc kubenswrapper[4904]: I1205 21:55:13.897123 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/412c87f0-ca77-4ee2-bb68-efff9bb6c749-kube-api-access-gpjdh" (OuterVolumeSpecName: "kube-api-access-gpjdh") pod "412c87f0-ca77-4ee2-bb68-efff9bb6c749" (UID: "412c87f0-ca77-4ee2-bb68-efff9bb6c749"). InnerVolumeSpecName "kube-api-access-gpjdh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:55:13 crc kubenswrapper[4904]: I1205 21:55:13.975625 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/412c87f0-ca77-4ee2-bb68-efff9bb6c749-host\") pod \"412c87f0-ca77-4ee2-bb68-efff9bb6c749\" (UID: \"412c87f0-ca77-4ee2-bb68-efff9bb6c749\") " Dec 05 21:55:13 crc kubenswrapper[4904]: I1205 21:55:13.975751 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/412c87f0-ca77-4ee2-bb68-efff9bb6c749-host" (OuterVolumeSpecName: "host") pod "412c87f0-ca77-4ee2-bb68-efff9bb6c749" (UID: "412c87f0-ca77-4ee2-bb68-efff9bb6c749"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 21:55:13 crc kubenswrapper[4904]: I1205 21:55:13.976095 4904 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/412c87f0-ca77-4ee2-bb68-efff9bb6c749-host\") on node \"crc\" DevicePath \"\"" Dec 05 21:55:13 crc kubenswrapper[4904]: I1205 21:55:13.976113 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpjdh\" (UniqueName: \"kubernetes.io/projected/412c87f0-ca77-4ee2-bb68-efff9bb6c749-kube-api-access-gpjdh\") on node \"crc\" DevicePath \"\"" Dec 05 21:55:14 crc kubenswrapper[4904]: I1205 21:55:14.730752 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-72sr6/crc-debug-c9fl9" event={"ID":"412c87f0-ca77-4ee2-bb68-efff9bb6c749","Type":"ContainerDied","Data":"1c7c0460bb1460924e62020a9b8599a54549d1e84ec1361bd10dc6ece58fc27f"} Dec 05 21:55:14 crc kubenswrapper[4904]: I1205 21:55:14.731051 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c7c0460bb1460924e62020a9b8599a54549d1e84ec1361bd10dc6ece58fc27f" Dec 05 21:55:14 crc kubenswrapper[4904]: I1205 21:55:14.730796 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-72sr6/crc-debug-c9fl9" Dec 05 21:55:14 crc kubenswrapper[4904]: I1205 21:55:14.870950 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-72sr6/crc-debug-c9fl9"] Dec 05 21:55:14 crc kubenswrapper[4904]: I1205 21:55:14.881430 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-72sr6/crc-debug-c9fl9"] Dec 05 21:55:15 crc kubenswrapper[4904]: I1205 21:55:15.691887 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="412c87f0-ca77-4ee2-bb68-efff9bb6c749" path="/var/lib/kubelet/pods/412c87f0-ca77-4ee2-bb68-efff9bb6c749/volumes" Dec 05 21:55:16 crc kubenswrapper[4904]: I1205 21:55:16.090695 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-72sr6/crc-debug-9mqw7"] Dec 05 21:55:16 crc kubenswrapper[4904]: E1205 21:55:16.091114 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="412c87f0-ca77-4ee2-bb68-efff9bb6c749" containerName="container-00" Dec 05 21:55:16 crc kubenswrapper[4904]: I1205 21:55:16.091126 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="412c87f0-ca77-4ee2-bb68-efff9bb6c749" containerName="container-00" Dec 05 21:55:16 crc kubenswrapper[4904]: I1205 21:55:16.091349 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="412c87f0-ca77-4ee2-bb68-efff9bb6c749" containerName="container-00" Dec 05 21:55:16 crc kubenswrapper[4904]: I1205 21:55:16.091999 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-72sr6/crc-debug-9mqw7" Dec 05 21:55:16 crc kubenswrapper[4904]: I1205 21:55:16.094252 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-72sr6"/"default-dockercfg-6wvgf" Dec 05 21:55:16 crc kubenswrapper[4904]: I1205 21:55:16.218132 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4ba2355d-70dc-4868-b4e1-ad0bafcbdcc5-host\") pod \"crc-debug-9mqw7\" (UID: \"4ba2355d-70dc-4868-b4e1-ad0bafcbdcc5\") " pod="openshift-must-gather-72sr6/crc-debug-9mqw7" Dec 05 21:55:16 crc kubenswrapper[4904]: I1205 21:55:16.218403 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5gmc\" (UniqueName: \"kubernetes.io/projected/4ba2355d-70dc-4868-b4e1-ad0bafcbdcc5-kube-api-access-g5gmc\") pod \"crc-debug-9mqw7\" (UID: \"4ba2355d-70dc-4868-b4e1-ad0bafcbdcc5\") " pod="openshift-must-gather-72sr6/crc-debug-9mqw7" Dec 05 21:55:16 crc kubenswrapper[4904]: I1205 21:55:16.320789 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5gmc\" (UniqueName: \"kubernetes.io/projected/4ba2355d-70dc-4868-b4e1-ad0bafcbdcc5-kube-api-access-g5gmc\") pod \"crc-debug-9mqw7\" (UID: \"4ba2355d-70dc-4868-b4e1-ad0bafcbdcc5\") " pod="openshift-must-gather-72sr6/crc-debug-9mqw7" Dec 05 21:55:16 crc kubenswrapper[4904]: I1205 21:55:16.320851 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4ba2355d-70dc-4868-b4e1-ad0bafcbdcc5-host\") pod \"crc-debug-9mqw7\" (UID: \"4ba2355d-70dc-4868-b4e1-ad0bafcbdcc5\") " pod="openshift-must-gather-72sr6/crc-debug-9mqw7" Dec 05 21:55:16 crc kubenswrapper[4904]: I1205 21:55:16.320981 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4ba2355d-70dc-4868-b4e1-ad0bafcbdcc5-host\") pod \"crc-debug-9mqw7\" (UID: \"4ba2355d-70dc-4868-b4e1-ad0bafcbdcc5\") " pod="openshift-must-gather-72sr6/crc-debug-9mqw7" Dec 05 21:55:16 crc kubenswrapper[4904]: I1205 21:55:16.352189 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5gmc\" (UniqueName: \"kubernetes.io/projected/4ba2355d-70dc-4868-b4e1-ad0bafcbdcc5-kube-api-access-g5gmc\") pod \"crc-debug-9mqw7\" (UID: \"4ba2355d-70dc-4868-b4e1-ad0bafcbdcc5\") " pod="openshift-must-gather-72sr6/crc-debug-9mqw7" Dec 05 21:55:16 crc kubenswrapper[4904]: I1205 21:55:16.408602 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-72sr6/crc-debug-9mqw7" Dec 05 21:55:16 crc kubenswrapper[4904]: I1205 21:55:16.682445 4904 scope.go:117] "RemoveContainer" containerID="afe55cb29b17c79ab9c5d0540b6c9a9a3a478a257da4a8d2c668853c76a15473" Dec 05 21:55:16 crc kubenswrapper[4904]: E1205 21:55:16.682748 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:55:16 crc kubenswrapper[4904]: I1205 21:55:16.748881 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-72sr6/crc-debug-9mqw7" event={"ID":"4ba2355d-70dc-4868-b4e1-ad0bafcbdcc5","Type":"ContainerStarted","Data":"dbaa76aa5337ec794fdc01d0702815fa2c9959ef3b1fb4aa0b11ce0988d39bfa"} Dec 05 21:55:16 crc kubenswrapper[4904]: I1205 21:55:16.749288 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-72sr6/crc-debug-9mqw7" event={"ID":"4ba2355d-70dc-4868-b4e1-ad0bafcbdcc5","Type":"ContainerStarted","Data":"9e008377ea283a2cd8b561f866ba0d60288f603f2f0ca60f0ebf91e4f9da9dd6"} Dec 05 21:55:17 crc kubenswrapper[4904]: I1205 21:55:17.765269 4904 generic.go:334] "Generic (PLEG): container finished" podID="4ba2355d-70dc-4868-b4e1-ad0bafcbdcc5" containerID="dbaa76aa5337ec794fdc01d0702815fa2c9959ef3b1fb4aa0b11ce0988d39bfa" exitCode=0 Dec 05 21:55:17 crc kubenswrapper[4904]: I1205 21:55:17.765326 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-72sr6/crc-debug-9mqw7" event={"ID":"4ba2355d-70dc-4868-b4e1-ad0bafcbdcc5","Type":"ContainerDied","Data":"dbaa76aa5337ec794fdc01d0702815fa2c9959ef3b1fb4aa0b11ce0988d39bfa"} Dec 05 21:55:17 crc kubenswrapper[4904]: I1205 21:55:17.803788 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-72sr6/crc-debug-9mqw7"] Dec 05 21:55:17 crc kubenswrapper[4904]: I1205 21:55:17.815275 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-72sr6/crc-debug-9mqw7"] Dec 05 21:55:18 crc kubenswrapper[4904]: I1205 21:55:18.886339 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-72sr6/crc-debug-9mqw7" Dec 05 21:55:19 crc kubenswrapper[4904]: I1205 21:55:19.083095 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5gmc\" (UniqueName: \"kubernetes.io/projected/4ba2355d-70dc-4868-b4e1-ad0bafcbdcc5-kube-api-access-g5gmc\") pod \"4ba2355d-70dc-4868-b4e1-ad0bafcbdcc5\" (UID: \"4ba2355d-70dc-4868-b4e1-ad0bafcbdcc5\") " Dec 05 21:55:19 crc kubenswrapper[4904]: I1205 21:55:19.083329 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4ba2355d-70dc-4868-b4e1-ad0bafcbdcc5-host\") pod \"4ba2355d-70dc-4868-b4e1-ad0bafcbdcc5\" (UID: \"4ba2355d-70dc-4868-b4e1-ad0bafcbdcc5\") " Dec 05 21:55:19 crc kubenswrapper[4904]: I1205 21:55:19.083896 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4ba2355d-70dc-4868-b4e1-ad0bafcbdcc5-host" (OuterVolumeSpecName: "host") pod "4ba2355d-70dc-4868-b4e1-ad0bafcbdcc5" (UID: "4ba2355d-70dc-4868-b4e1-ad0bafcbdcc5"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 21:55:19 crc kubenswrapper[4904]: I1205 21:55:19.103197 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ba2355d-70dc-4868-b4e1-ad0bafcbdcc5-kube-api-access-g5gmc" (OuterVolumeSpecName: "kube-api-access-g5gmc") pod "4ba2355d-70dc-4868-b4e1-ad0bafcbdcc5" (UID: "4ba2355d-70dc-4868-b4e1-ad0bafcbdcc5"). InnerVolumeSpecName "kube-api-access-g5gmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:55:19 crc kubenswrapper[4904]: I1205 21:55:19.185325 4904 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4ba2355d-70dc-4868-b4e1-ad0bafcbdcc5-host\") on node \"crc\" DevicePath \"\"" Dec 05 21:55:19 crc kubenswrapper[4904]: I1205 21:55:19.185571 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5gmc\" (UniqueName: \"kubernetes.io/projected/4ba2355d-70dc-4868-b4e1-ad0bafcbdcc5-kube-api-access-g5gmc\") on node \"crc\" DevicePath \"\"" Dec 05 21:55:19 crc kubenswrapper[4904]: I1205 21:55:19.706437 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ba2355d-70dc-4868-b4e1-ad0bafcbdcc5" path="/var/lib/kubelet/pods/4ba2355d-70dc-4868-b4e1-ad0bafcbdcc5/volumes" Dec 05 21:55:19 crc kubenswrapper[4904]: I1205 21:55:19.796207 4904 scope.go:117] "RemoveContainer" containerID="dbaa76aa5337ec794fdc01d0702815fa2c9959ef3b1fb4aa0b11ce0988d39bfa" Dec 05 21:55:19 crc kubenswrapper[4904]: I1205 21:55:19.796298 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-72sr6/crc-debug-9mqw7" Dec 05 21:55:28 crc kubenswrapper[4904]: I1205 21:55:28.682284 4904 scope.go:117] "RemoveContainer" containerID="afe55cb29b17c79ab9c5d0540b6c9a9a3a478a257da4a8d2c668853c76a15473" Dec 05 21:55:28 crc kubenswrapper[4904]: E1205 21:55:28.683122 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:55:41 crc kubenswrapper[4904]: I1205 21:55:41.688602 4904 scope.go:117] "RemoveContainer" containerID="afe55cb29b17c79ab9c5d0540b6c9a9a3a478a257da4a8d2c668853c76a15473" Dec 05 21:55:41 crc kubenswrapper[4904]: E1205 21:55:41.689229 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:55:56 crc kubenswrapper[4904]: I1205 21:55:56.681925 4904 scope.go:117] "RemoveContainer" containerID="afe55cb29b17c79ab9c5d0540b6c9a9a3a478a257da4a8d2c668853c76a15473" Dec 05 21:55:56 crc kubenswrapper[4904]: E1205 21:55:56.682656 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:55:57 crc kubenswrapper[4904]: I1205 21:55:57.771950 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-66d5cfdcdd-d59d6_f786b88c-ea37-4a02-bdd1-1f9feca9993a/barbican-api/0.log" Dec 05 21:55:57 crc kubenswrapper[4904]: I1205 21:55:57.983167 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-66d5cfdcdd-d59d6_f786b88c-ea37-4a02-bdd1-1f9feca9993a/barbican-api-log/0.log" Dec 05 21:55:58 crc kubenswrapper[4904]: I1205 21:55:58.091311 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-8fb8b568-c7bl2_83ab1170-b41e-4a13-b227-8b67b86587cc/barbican-keystone-listener-log/0.log" Dec 05 21:55:58 crc kubenswrapper[4904]: I1205 21:55:58.094688 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-8fb8b568-c7bl2_83ab1170-b41e-4a13-b227-8b67b86587cc/barbican-keystone-listener/0.log" Dec 05 21:55:58 crc kubenswrapper[4904]: I1205 21:55:58.213876 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6d8548b4f5-cmq7k_7af9740e-f05b-4d6d-9075-b7038018de84/barbican-worker/0.log" Dec 05 21:55:58 crc kubenswrapper[4904]: I1205 21:55:58.313163 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6d8548b4f5-cmq7k_7af9740e-f05b-4d6d-9075-b7038018de84/barbican-worker-log/0.log" Dec 05 21:55:58 crc kubenswrapper[4904]: I1205 21:55:58.434253 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-4xhb8_2277bc86-3475-44fd-a77d-9a2f552bb457/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:55:58 crc kubenswrapper[4904]: I1205 21:55:58.639491 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6fc4fd93-36e9-448b-88ec-b4c4227c941c/ceilometer-notification-agent/0.log" Dec 05 21:55:58 crc kubenswrapper[4904]: I1205 21:55:58.643926 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6fc4fd93-36e9-448b-88ec-b4c4227c941c/ceilometer-central-agent/0.log" Dec 05 21:55:58 crc kubenswrapper[4904]: I1205 21:55:58.906132 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6fc4fd93-36e9-448b-88ec-b4c4227c941c/proxy-httpd/0.log" Dec 05 21:55:58 crc kubenswrapper[4904]: I1205 21:55:58.947915 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6fc4fd93-36e9-448b-88ec-b4c4227c941c/sg-core/0.log" Dec 05 21:55:59 crc kubenswrapper[4904]: I1205 21:55:59.147575 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_dadf2169-1b54-4e50-adff-71504b526259/cinder-api-log/0.log" Dec 05 21:55:59 crc kubenswrapper[4904]: I1205 21:55:59.370533 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_cdca5ac3-4ef8-43e3-8244-438e43e029c4/probe/0.log" Dec 05 21:55:59 crc kubenswrapper[4904]: I1205 21:55:59.628444 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_cdca5ac3-4ef8-43e3-8244-438e43e029c4/cinder-backup/0.log" Dec 05 21:55:59 crc kubenswrapper[4904]: I1205 21:55:59.654582 4904 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-scheduler-0_60174921-7963-4d63-83e7-8b702d3e9dd2/cinder-scheduler/0.log" Dec 05 21:55:59 crc kubenswrapper[4904]: I1205 21:55:59.722988 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_dadf2169-1b54-4e50-adff-71504b526259/cinder-api/0.log" Dec 05 21:55:59 crc kubenswrapper[4904]: I1205 21:55:59.817703 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_60174921-7963-4d63-83e7-8b702d3e9dd2/probe/0.log" Dec 05 21:55:59 crc kubenswrapper[4904]: I1205 21:55:59.907437 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_b2237f58-91be-4eae-9feb-94feacffd4a6/probe/0.log" Dec 05 21:56:00 crc kubenswrapper[4904]: I1205 21:56:00.098644 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_b2237f58-91be-4eae-9feb-94feacffd4a6/cinder-volume/0.log" Dec 05 21:56:00 crc kubenswrapper[4904]: I1205 21:56:00.153937 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_b3d4e158-45e3-4448-ad27-36e4aa3cb002/probe/0.log" Dec 05 21:56:00 crc kubenswrapper[4904]: I1205 21:56:00.313709 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_b3d4e158-45e3-4448-ad27-36e4aa3cb002/cinder-volume/0.log" Dec 05 21:56:00 crc kubenswrapper[4904]: I1205 21:56:00.354073 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-jvqvx_676e3b5b-34d1-47bc-a1db-3bb15a83282b/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:56:00 crc kubenswrapper[4904]: I1205 21:56:00.606868 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-ddmgb_d1b27fa3-93b0-4c28-9158-8de58adc4799/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:56:00 crc kubenswrapper[4904]: I1205 21:56:00.610332 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-b7c6b4c7-wk76m_0aa075ff-9799-456e-b08e-5146d8e11c06/init/0.log" Dec 05 21:56:00 crc kubenswrapper[4904]: I1205 21:56:00.875022 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-s8jb4_2b8cd007-58c8-4fd2-924d-a8d24608ff6c/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:56:00 crc kubenswrapper[4904]: I1205 21:56:00.877838 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-b7c6b4c7-wk76m_0aa075ff-9799-456e-b08e-5146d8e11c06/init/0.log" Dec 05 21:56:00 crc kubenswrapper[4904]: I1205 21:56:00.954223 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-b7c6b4c7-wk76m_0aa075ff-9799-456e-b08e-5146d8e11c06/dnsmasq-dns/0.log" Dec 05 21:56:01 crc kubenswrapper[4904]: I1205 21:56:01.099639 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e/glance-log/0.log" Dec 05 21:56:01 crc kubenswrapper[4904]: I1205 21:56:01.115150 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_fb80e7c5-d60b-408a-9c7a-78b6e47b4d7e/glance-httpd/0.log" Dec 05 21:56:01 crc kubenswrapper[4904]: I1205 21:56:01.330096 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c1b2cda4-e597-4027-b9b1-cf52ec98dcb8/glance-httpd/0.log" 
Dec 05 21:56:01 crc kubenswrapper[4904]: I1205 21:56:01.385102 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c1b2cda4-e597-4027-b9b1-cf52ec98dcb8/glance-log/0.log" Dec 05 21:56:01 crc kubenswrapper[4904]: I1205 21:56:01.487845 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-d867d46cb-9mdx2_ffe3f1f4-8d49-4bf8-a088-e3a930ddc614/horizon/0.log" Dec 05 21:56:01 crc kubenswrapper[4904]: I1205 21:56:01.700328 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-28gf8_9edc6269-cd10-4724-9cf3-9b65c80ab8d9/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:56:01 crc kubenswrapper[4904]: I1205 21:56:01.772637 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-ww8dd_1b6a0ad9-b447-49b5-b42a-6c1d6f5424f9/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:56:02 crc kubenswrapper[4904]: I1205 21:56:02.005463 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29416141-5f2p7_c392ddc8-695b-4543-a7f4-05ad75ff272b/keystone-cron/0.log" Dec 05 21:56:02 crc kubenswrapper[4904]: I1205 21:56:02.253888 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-d867d46cb-9mdx2_ffe3f1f4-8d49-4bf8-a088-e3a930ddc614/horizon-log/0.log" Dec 05 21:56:02 crc kubenswrapper[4904]: I1205 21:56:02.404999 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_cd4e564c-e066-405d-92e5-1d312bfd1f57/kube-state-metrics/0.log" Dec 05 21:56:02 crc kubenswrapper[4904]: I1205 21:56:02.456205 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-64fdbdb744-gdc8l_2f6ea7b0-d47e-4b91-bca9-7816ddfbc19a/keystone-api/0.log" Dec 05 21:56:02 crc kubenswrapper[4904]: I1205 21:56:02.532264 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-5548g_11479a0b-d4c6-4770-bdf1-dbb8a417384d/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:56:02 crc kubenswrapper[4904]: I1205 21:56:02.926858 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-vzhmr_aee65db4-60e1-4a83-80d0-81e90f6f4f07/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:56:03 crc kubenswrapper[4904]: I1205 21:56:03.086455 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-78cf6bb7c7-zmmjl_9f1be3e8-cbd0-45cb-acb6-cf56217cec07/neutron-httpd/0.log" Dec 05 21:56:03 crc kubenswrapper[4904]: I1205 21:56:03.112691 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-78cf6bb7c7-zmmjl_9f1be3e8-cbd0-45cb-acb6-cf56217cec07/neutron-api/0.log" Dec 05 21:56:03 crc kubenswrapper[4904]: I1205 21:56:03.870483 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_8538bb66-3a1f-40a2-bf64-b04b49318d34/nova-cell0-conductor-conductor/0.log" Dec 05 21:56:04 crc kubenswrapper[4904]: I1205 21:56:04.095282 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_a3f6c470-722e-4a05-a450-e65791498b79/nova-cell1-conductor-conductor/0.log" Dec 05 21:56:04 crc kubenswrapper[4904]: I1205 21:56:04.522250 4904 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-novncproxy-0_8da8babf-f15a-4f70-bf70-23218ca0628a/nova-cell1-novncproxy-novncproxy/0.log" Dec 05 21:56:04 crc kubenswrapper[4904]: I1205 21:56:04.668562 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-6jljb_d9ee9825-e991-495c-bbd8-30ae1e7b0780/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:56:04 crc kubenswrapper[4904]: I1205 21:56:04.711069 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_be6a8cc9-1fdc-4dea-a3b3-e0a13cd882b0/nova-api-log/0.log" Dec 05 21:56:04 crc kubenswrapper[4904]: I1205 21:56:04.960558 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_eeb7ce1d-7b77-4a2f-9e87-99569f95995d/nova-metadata-log/0.log" Dec 05 21:56:05 crc kubenswrapper[4904]: I1205 21:56:05.128320 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_be6a8cc9-1fdc-4dea-a3b3-e0a13cd882b0/nova-api-api/0.log" Dec 05 21:56:05 crc kubenswrapper[4904]: I1205 21:56:05.358019 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_3437a493-1ffe-49dc-a789-3451b2f87204/mysql-bootstrap/0.log" Dec 05 21:56:05 crc kubenswrapper[4904]: I1205 21:56:05.731153 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_5ce87dae-66a4-4d60-bd13-3ac6a44abeed/nova-scheduler-scheduler/0.log" Dec 05 21:56:05 crc kubenswrapper[4904]: I1205 21:56:05.735286 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_3437a493-1ffe-49dc-a789-3451b2f87204/mysql-bootstrap/0.log" Dec 05 21:56:05 crc kubenswrapper[4904]: I1205 21:56:05.778020 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_3437a493-1ffe-49dc-a789-3451b2f87204/galera/0.log" Dec 05 21:56:05 crc kubenswrapper[4904]: I1205 21:56:05.945085 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e08506d4-1ca7-4932-b73f-21020cb20578/mysql-bootstrap/0.log" Dec 05 21:56:06 crc kubenswrapper[4904]: I1205 21:56:06.104841 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e08506d4-1ca7-4932-b73f-21020cb20578/mysql-bootstrap/0.log" Dec 05 21:56:06 crc kubenswrapper[4904]: I1205 21:56:06.207429 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e08506d4-1ca7-4932-b73f-21020cb20578/galera/0.log" Dec 05 21:56:06 crc kubenswrapper[4904]: I1205 21:56:06.323134 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_c8a50daf-6f3c-405d-b047-12a11ac0b56b/openstackclient/0.log" Dec 05 21:56:06 crc kubenswrapper[4904]: I1205 21:56:06.452309 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-hlxgq_41ef8df6-e0e1-45a5-954d-10ce99fa26de/ovn-controller/0.log" Dec 05 21:56:06 crc kubenswrapper[4904]: I1205 21:56:06.699123 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-t974s_e7529566-97c3-42bf-a66c-1186aec23176/openstack-network-exporter/0.log" Dec 05 21:56:06 crc kubenswrapper[4904]: I1205 21:56:06.796046 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-h7djt_3db65f91-e650-49d5-b372-cabc44efff3f/ovsdb-server-init/0.log" Dec 05 21:56:07 crc kubenswrapper[4904]: I1205 21:56:07.016692 4904 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-h7djt_3db65f91-e650-49d5-b372-cabc44efff3f/ovsdb-server-init/0.log" Dec 05 21:56:07 crc kubenswrapper[4904]: I1205 21:56:07.071821 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-h7djt_3db65f91-e650-49d5-b372-cabc44efff3f/ovsdb-server/0.log" Dec 05 21:56:07 crc kubenswrapper[4904]: I1205 21:56:07.335704 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-4qvrt_e05ac5cc-a4c0-46c3-9beb-3f607156b962/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:56:07 crc kubenswrapper[4904]: I1205 21:56:07.487551 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-h7djt_3db65f91-e650-49d5-b372-cabc44efff3f/ovs-vswitchd/0.log" Dec 05 21:56:07 crc kubenswrapper[4904]: I1205 21:56:07.573533 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c15a10eb-9132-4ec6-8861-4c2320962cc3/openstack-network-exporter/0.log" Dec 05 21:56:07 crc kubenswrapper[4904]: I1205 21:56:07.659864 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_eeb7ce1d-7b77-4a2f-9e87-99569f95995d/nova-metadata-metadata/0.log" Dec 05 21:56:07 crc kubenswrapper[4904]: I1205 21:56:07.681237 4904 scope.go:117] "RemoveContainer" containerID="afe55cb29b17c79ab9c5d0540b6c9a9a3a478a257da4a8d2c668853c76a15473" Dec 05 21:56:07 crc kubenswrapper[4904]: E1205 21:56:07.681615 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:56:07 crc kubenswrapper[4904]: I1205 21:56:07.689642 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c15a10eb-9132-4ec6-8861-4c2320962cc3/ovn-northd/0.log" Dec 05 21:56:07 crc kubenswrapper[4904]: I1205 21:56:07.844618 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0/openstack-network-exporter/0.log" Dec 05 21:56:07 crc kubenswrapper[4904]: I1205 21:56:07.894608 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9c47c71a-aff6-4444-ba2a-f3ae27d6cbe0/ovsdbserver-nb/0.log" Dec 05 21:56:08 crc kubenswrapper[4904]: I1205 21:56:08.111417 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_c9f105e3-0b5a-435f-bc00-fcfd7eceaafd/openstack-network-exporter/0.log" Dec 05 21:56:08 crc kubenswrapper[4904]: I1205 21:56:08.140418 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_c9f105e3-0b5a-435f-bc00-fcfd7eceaafd/ovsdbserver-sb/0.log" Dec 05 21:56:08 crc kubenswrapper[4904]: I1205 21:56:08.399782 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0/init-config-reloader/0.log" Dec 05 21:56:08 crc kubenswrapper[4904]: I1205 21:56:08.484743 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-c6d7bc49b-2f688_4879cccb-c711-458b-8ae5-895ec70f6536/placement-api/0.log" Dec 05 21:56:08 crc kubenswrapper[4904]: I1205 21:56:08.608132 4904 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_placement-c6d7bc49b-2f688_4879cccb-c711-458b-8ae5-895ec70f6536/placement-log/0.log" Dec 05 21:56:08 crc kubenswrapper[4904]: I1205 21:56:08.670236 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0/init-config-reloader/0.log" Dec 05 21:56:08 crc kubenswrapper[4904]: I1205 21:56:08.682589 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0/config-reloader/0.log" Dec 05 21:56:08 crc kubenswrapper[4904]: I1205 21:56:08.695306 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0/prometheus/0.log" Dec 05 21:56:08 crc kubenswrapper[4904]: I1205 21:56:08.823791 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_3c7e88aa-9ffe-40cc-a50e-009baf1cc7a0/thanos-sidecar/0.log" Dec 05 21:56:08 crc kubenswrapper[4904]: I1205 21:56:08.885500 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_69046049-50b5-4ced-8afa-5ef3405aad24/setup-container/0.log" Dec 05 21:56:09 crc kubenswrapper[4904]: I1205 21:56:09.302965 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_69046049-50b5-4ced-8afa-5ef3405aad24/rabbitmq/0.log" Dec 05 21:56:09 crc kubenswrapper[4904]: I1205 21:56:09.337388 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_69046049-50b5-4ced-8afa-5ef3405aad24/setup-container/0.log" Dec 05 21:56:09 crc kubenswrapper[4904]: I1205 21:56:09.456487 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_0caaad94-d02e-43da-bf3b-087a5ec8d2f8/setup-container/0.log" Dec 05 21:56:09 crc kubenswrapper[4904]: I1205 21:56:09.656106 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_0caaad94-d02e-43da-bf3b-087a5ec8d2f8/setup-container/0.log" Dec 05 21:56:09 crc kubenswrapper[4904]: I1205 21:56:09.717549 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_0caaad94-d02e-43da-bf3b-087a5ec8d2f8/rabbitmq/0.log" Dec 05 21:56:09 crc kubenswrapper[4904]: I1205 21:56:09.751764 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1e45678d-877d-4c34-a8f5-913d31a8b79d/setup-container/0.log" Dec 05 21:56:09 crc kubenswrapper[4904]: I1205 21:56:09.932767 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1e45678d-877d-4c34-a8f5-913d31a8b79d/setup-container/0.log" Dec 05 21:56:10 crc kubenswrapper[4904]: I1205 21:56:10.020515 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-6tp6w_222edc46-4ddd-4236-8635-45b365513214/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:56:10 crc kubenswrapper[4904]: I1205 21:56:10.045839 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1e45678d-877d-4c34-a8f5-913d31a8b79d/rabbitmq/0.log" Dec 05 21:56:10 crc kubenswrapper[4904]: I1205 21:56:10.274958 4904 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-qlps8_c12b173c-79c1-4bcf-a76a-b3bc84b9b556/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:56:10 crc kubenswrapper[4904]: I1205 21:56:10.287626 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-tc59d_cc55dc7a-28f4-47eb-8c57-58e949a98dcc/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:56:10 crc kubenswrapper[4904]: I1205 21:56:10.569833 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-7dbn4_88dfb504-1f6c-41bc-860e-84eeb0a7fff9/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:56:10 crc kubenswrapper[4904]: I1205 21:56:10.588745 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-g44s6_78aea92b-9deb-44c8-b5ac-f9224b038591/ssh-known-hosts-edpm-deployment/0.log" Dec 05 21:56:10 crc kubenswrapper[4904]: I1205 21:56:10.879231 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-76bc85d8c5-xr857_26484d4c-3765-4214-81e6-af49ebfde502/proxy-server/0.log" Dec 05 21:56:10 crc kubenswrapper[4904]: I1205 21:56:10.994221 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-9xq46_4b80a797-4212-4242-81fd-928045b629cd/swift-ring-rebalance/0.log" Dec 05 21:56:11 crc kubenswrapper[4904]: I1205 21:56:11.062536 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-76bc85d8c5-xr857_26484d4c-3765-4214-81e6-af49ebfde502/proxy-httpd/0.log" Dec 05 21:56:11 crc kubenswrapper[4904]: I1205 21:56:11.131712 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be5ed3b2-bc48-4865-ade5-f7c2e379a1ea/account-auditor/0.log" Dec 05 21:56:11 crc kubenswrapper[4904]: I1205 21:56:11.309458 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be5ed3b2-bc48-4865-ade5-f7c2e379a1ea/account-reaper/0.log" Dec 05 21:56:11 crc kubenswrapper[4904]: I1205 21:56:11.330639 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be5ed3b2-bc48-4865-ade5-f7c2e379a1ea/container-auditor/0.log" Dec 05 21:56:11 crc kubenswrapper[4904]: I1205 21:56:11.338867 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be5ed3b2-bc48-4865-ade5-f7c2e379a1ea/account-replicator/0.log" Dec 05 21:56:11 crc kubenswrapper[4904]: I1205 21:56:11.353459 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be5ed3b2-bc48-4865-ade5-f7c2e379a1ea/account-server/0.log" Dec 05 21:56:11 crc kubenswrapper[4904]: I1205 21:56:11.520279 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be5ed3b2-bc48-4865-ade5-f7c2e379a1ea/container-updater/0.log" Dec 05 21:56:11 crc kubenswrapper[4904]: I1205 21:56:11.560951 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be5ed3b2-bc48-4865-ade5-f7c2e379a1ea/container-replicator/0.log" Dec 05 21:56:11 crc kubenswrapper[4904]: I1205 21:56:11.579484 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be5ed3b2-bc48-4865-ade5-f7c2e379a1ea/container-server/0.log" Dec 05 21:56:11 crc kubenswrapper[4904]: I1205 21:56:11.633215 4904 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_be5ed3b2-bc48-4865-ade5-f7c2e379a1ea/object-auditor/0.log" Dec 05 21:56:11 crc kubenswrapper[4904]: I1205 21:56:11.781395 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be5ed3b2-bc48-4865-ade5-f7c2e379a1ea/object-server/0.log" Dec 05 21:56:11 crc kubenswrapper[4904]: I1205 21:56:11.831174 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be5ed3b2-bc48-4865-ade5-f7c2e379a1ea/object-expirer/0.log" Dec 05 21:56:11 crc kubenswrapper[4904]: I1205 21:56:11.835075 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be5ed3b2-bc48-4865-ade5-f7c2e379a1ea/object-replicator/0.log" Dec 05 21:56:11 crc kubenswrapper[4904]: I1205 21:56:11.856316 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be5ed3b2-bc48-4865-ade5-f7c2e379a1ea/object-updater/0.log" Dec 05 21:56:12 crc kubenswrapper[4904]: I1205 21:56:12.051491 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be5ed3b2-bc48-4865-ade5-f7c2e379a1ea/rsync/0.log" Dec 05 21:56:12 crc kubenswrapper[4904]: I1205 21:56:12.121164 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be5ed3b2-bc48-4865-ade5-f7c2e379a1ea/swift-recon-cron/0.log" Dec 05 21:56:12 crc kubenswrapper[4904]: I1205 21:56:12.145677 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-jlk8k_adfcfa36-4dfd-422c-b1a5-3a2e342ea208/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:56:12 crc kubenswrapper[4904]: I1205 21:56:12.364687 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_45f624ec-9d5e-41f1-ba5b-e81c2b84c532/tempest-tests-tempest-tests-runner/0.log" Dec 05 21:56:12 crc kubenswrapper[4904]: I1205 21:56:12.395261 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_31c70021-e6df-4ac6-b79e-402f24a13112/test-operator-logs-container/0.log" Dec 05 21:56:12 crc kubenswrapper[4904]: I1205 21:56:12.637783 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-qg8xq_192437e7-c4fd-4142-94fb-e3f2a9c75841/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:56:13 crc kubenswrapper[4904]: I1205 21:56:13.266185 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_48f600c5-7f88-4c27-a152-d79354212532/watcher-applier/0.log" Dec 05 21:56:13 crc kubenswrapper[4904]: I1205 21:56:13.987346 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_f7be275a-3638-4279-a90b-8ad43e931ee6/watcher-api-log/0.log" Dec 05 21:56:17 crc kubenswrapper[4904]: I1205 21:56:17.102604 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_e610af6c-e57f-4676-95ec-b8accd64aea9/watcher-decision-engine/0.log" Dec 05 21:56:18 crc kubenswrapper[4904]: I1205 21:56:18.214535 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_f7be275a-3638-4279-a90b-8ad43e931ee6/watcher-api/0.log" Dec 05 21:56:18 crc kubenswrapper[4904]: I1205 21:56:18.682162 4904 scope.go:117] "RemoveContainer" containerID="afe55cb29b17c79ab9c5d0540b6c9a9a3a478a257da4a8d2c668853c76a15473" Dec 05 21:56:18 crc kubenswrapper[4904]: E1205 21:56:18.682418 
4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:56:19 crc kubenswrapper[4904]: I1205 21:56:19.007598 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_980ec67d-9dc9-4cae-8169-9890a40d65c3/memcached/0.log" Dec 05 21:56:29 crc kubenswrapper[4904]: I1205 21:56:29.682087 4904 scope.go:117] "RemoveContainer" containerID="afe55cb29b17c79ab9c5d0540b6c9a9a3a478a257da4a8d2c668853c76a15473" Dec 05 21:56:29 crc kubenswrapper[4904]: E1205 21:56:29.682786 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:56:41 crc kubenswrapper[4904]: I1205 21:56:41.691269 4904 scope.go:117] "RemoveContainer" containerID="afe55cb29b17c79ab9c5d0540b6c9a9a3a478a257da4a8d2c668853c76a15473" Dec 05 21:56:41 crc kubenswrapper[4904]: E1205 21:56:41.693139 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:56:42 crc kubenswrapper[4904]: I1205 21:56:42.284219 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-jrrnh_16525ebe-6de2-4974-a4ee-ed99a0e4ea1f/kube-rbac-proxy/0.log" Dec 05 21:56:42 crc kubenswrapper[4904]: I1205 21:56:42.426365 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-jrrnh_16525ebe-6de2-4974-a4ee-ed99a0e4ea1f/manager/0.log" Dec 05 21:56:42 crc kubenswrapper[4904]: I1205 21:56:42.517617 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a6678k27_6cab1a63-a096-409d-9ca0-3308b4d6b434/util/0.log" Dec 05 21:56:42 crc kubenswrapper[4904]: I1205 21:56:42.727862 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a6678k27_6cab1a63-a096-409d-9ca0-3308b4d6b434/pull/0.log" Dec 05 21:56:42 crc kubenswrapper[4904]: I1205 21:56:42.730577 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a6678k27_6cab1a63-a096-409d-9ca0-3308b4d6b434/util/0.log" Dec 05 21:56:42 crc kubenswrapper[4904]: I1205 21:56:42.746266 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a6678k27_6cab1a63-a096-409d-9ca0-3308b4d6b434/pull/0.log" Dec 05 21:56:42 crc 
kubenswrapper[4904]: I1205 21:56:42.923517 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a6678k27_6cab1a63-a096-409d-9ca0-3308b4d6b434/util/0.log" Dec 05 21:56:42 crc kubenswrapper[4904]: I1205 21:56:42.933686 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a6678k27_6cab1a63-a096-409d-9ca0-3308b4d6b434/extract/0.log" Dec 05 21:56:42 crc kubenswrapper[4904]: I1205 21:56:42.942853 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a6678k27_6cab1a63-a096-409d-9ca0-3308b4d6b434/pull/0.log" Dec 05 21:56:43 crc kubenswrapper[4904]: I1205 21:56:43.097768 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-xd4c5_0fefb560-28a2-4316-9448-8361111d4837/kube-rbac-proxy/0.log" Dec 05 21:56:43 crc kubenswrapper[4904]: I1205 21:56:43.152353 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-xd4c5_0fefb560-28a2-4316-9448-8361111d4837/manager/0.log" Dec 05 21:56:43 crc kubenswrapper[4904]: I1205 21:56:43.224714 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-7f864_8ee93c89-5d32-4114-b134-c084359d11ec/kube-rbac-proxy/0.log" Dec 05 21:56:43 crc kubenswrapper[4904]: I1205 21:56:43.297118 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-7f864_8ee93c89-5d32-4114-b134-c084359d11ec/manager/0.log" Dec 05 21:56:43 crc kubenswrapper[4904]: I1205 21:56:43.368491 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-z4chx_799e9803-03c5-4406-8b50-e59dedc0918d/kube-rbac-proxy/0.log" Dec 05 21:56:43 crc kubenswrapper[4904]: I1205 21:56:43.470983 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-z4chx_799e9803-03c5-4406-8b50-e59dedc0918d/manager/0.log" Dec 05 21:56:43 crc kubenswrapper[4904]: I1205 21:56:43.580930 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-lvn6q_17322e61-a1f2-4228-8784-ea6869288aaa/manager/0.log" Dec 05 21:56:43 crc kubenswrapper[4904]: I1205 21:56:43.609849 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-lvn6q_17322e61-a1f2-4228-8784-ea6869288aaa/kube-rbac-proxy/0.log" Dec 05 21:56:43 crc kubenswrapper[4904]: I1205 21:56:43.700043 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-lc48g_7ae1ca6f-894c-4c12-ab0a-459e97fa442e/kube-rbac-proxy/0.log" Dec 05 21:56:43 crc kubenswrapper[4904]: I1205 21:56:43.834586 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-lc48g_7ae1ca6f-894c-4c12-ab0a-459e97fa442e/manager/0.log" Dec 05 21:56:43 crc kubenswrapper[4904]: I1205 21:56:43.970926 4904 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-4zb86_4453c00a-291d-4edb-ab56-9b0fdf3b1ea5/kube-rbac-proxy/0.log" Dec 05 21:56:44 crc kubenswrapper[4904]: I1205 21:56:44.128567 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-4zb86_4453c00a-291d-4edb-ab56-9b0fdf3b1ea5/manager/0.log" Dec 05 21:56:44 crc kubenswrapper[4904]: I1205 21:56:44.178520 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-9zhph_60eb443e-807c-41f8-8935-7bfacc9dc89b/kube-rbac-proxy/0.log" Dec 05 21:56:44 crc kubenswrapper[4904]: I1205 21:56:44.384953 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-9zhph_60eb443e-807c-41f8-8935-7bfacc9dc89b/manager/0.log" Dec 05 21:56:44 crc kubenswrapper[4904]: I1205 21:56:44.466637 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-h9wsr_a2b7747b-01b7-4f12-8748-2661f53078f0/kube-rbac-proxy/0.log" Dec 05 21:56:44 crc kubenswrapper[4904]: I1205 21:56:44.591097 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-h9wsr_a2b7747b-01b7-4f12-8748-2661f53078f0/manager/0.log" Dec 05 21:56:44 crc kubenswrapper[4904]: I1205 21:56:44.712673 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-xt4ph_e7d93aaa-3c37-4793-8535-6dcce7bb79b0/kube-rbac-proxy/0.log" Dec 05 21:56:44 crc kubenswrapper[4904]: I1205 21:56:44.722555 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-xt4ph_e7d93aaa-3c37-4793-8535-6dcce7bb79b0/manager/0.log" Dec 05 21:56:44 crc kubenswrapper[4904]: I1205 21:56:44.837544 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-wq6hj_8c31d222-d349-476a-9aa2-cc57ec51d926/kube-rbac-proxy/0.log" Dec 05 21:56:44 crc kubenswrapper[4904]: I1205 21:56:44.932748 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-wq6hj_8c31d222-d349-476a-9aa2-cc57ec51d926/manager/0.log" Dec 05 21:56:45 crc kubenswrapper[4904]: I1205 21:56:45.036854 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-bzms8_7cbe4ebd-943a-4ffe-8cf1-5ade3e005ab1/kube-rbac-proxy/0.log" Dec 05 21:56:45 crc kubenswrapper[4904]: I1205 21:56:45.089019 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-bzms8_7cbe4ebd-943a-4ffe-8cf1-5ade3e005ab1/manager/0.log" Dec 05 21:56:45 crc kubenswrapper[4904]: I1205 21:56:45.171461 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-xv5gb_2035ff23-1caf-4e9e-bc39-46caef0eb07d/kube-rbac-proxy/0.log" Dec 05 21:56:45 crc kubenswrapper[4904]: I1205 21:56:45.343916 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-xv5gb_2035ff23-1caf-4e9e-bc39-46caef0eb07d/manager/0.log" Dec 05 21:56:45 crc kubenswrapper[4904]: I1205 21:56:45.428438 4904 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-l65z8_03744ad0-4115-4d2f-bf2d-5acc45a6d05a/manager/0.log" Dec 05 21:56:45 crc kubenswrapper[4904]: I1205 21:56:45.442180 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-l65z8_03744ad0-4115-4d2f-bf2d-5acc45a6d05a/kube-rbac-proxy/0.log" Dec 05 21:56:45 crc kubenswrapper[4904]: I1205 21:56:45.702226 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd49rzr9_50c0879e-d30b-4a8a-972b-10f3188ff06a/kube-rbac-proxy/0.log" Dec 05 21:56:45 crc kubenswrapper[4904]: I1205 21:56:45.744460 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd49rzr9_50c0879e-d30b-4a8a-972b-10f3188ff06a/manager/0.log" Dec 05 21:56:46 crc kubenswrapper[4904]: I1205 21:56:46.162025 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-56699b584c-242nn_3c454cc5-18c2-420a-ac01-657bedda4fa7/operator/0.log" Dec 05 21:56:46 crc kubenswrapper[4904]: I1205 21:56:46.211773 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-kc5m7_6b8dcb6d-00ed-4247-b4c8-5e5964bc0513/registry-server/0.log" Dec 05 21:56:46 crc kubenswrapper[4904]: I1205 21:56:46.219088 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-f2bbb_116b1af4-b71b-46ab-9977-12342c13594e/kube-rbac-proxy/0.log" Dec 05 21:56:46 crc kubenswrapper[4904]: I1205 21:56:46.454172 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-f2bbb_116b1af4-b71b-46ab-9977-12342c13594e/manager/0.log" Dec 05 21:56:46 crc kubenswrapper[4904]: I1205 21:56:46.465361 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-gv6jv_53b1878e-6de9-4961-9cf6-4673c09c0412/manager/0.log" Dec 05 21:56:46 crc kubenswrapper[4904]: I1205 21:56:46.466803 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-gv6jv_53b1878e-6de9-4961-9cf6-4673c09c0412/kube-rbac-proxy/0.log" Dec 05 21:56:46 crc kubenswrapper[4904]: I1205 21:56:46.686545 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-xqdzf_2f0e1e69-f795-4166-b6bd-946050c1524e/kube-rbac-proxy/0.log" Dec 05 21:56:46 crc kubenswrapper[4904]: I1205 21:56:46.755433 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-4ddwg_51c461fe-c535-4a35-8409-f319c4549ecf/operator/0.log" Dec 05 21:56:46 crc kubenswrapper[4904]: I1205 21:56:46.952824 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-n6d4m_086dd7ac-a5cc-4433-a5a6-0cfd88a69d72/kube-rbac-proxy/0.log" Dec 05 21:56:46 crc kubenswrapper[4904]: I1205 21:56:46.970137 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-xqdzf_2f0e1e69-f795-4166-b6bd-946050c1524e/manager/0.log" Dec 05 21:56:47 crc kubenswrapper[4904]: I1205 21:56:47.175942 
4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-kr7mw_54f8b0f1-58e6-445e-a7ea-ba4eb6e298a9/kube-rbac-proxy/0.log" Dec 05 21:56:47 crc kubenswrapper[4904]: I1205 21:56:47.224399 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-n6d4m_086dd7ac-a5cc-4433-a5a6-0cfd88a69d72/manager/0.log" Dec 05 21:56:47 crc kubenswrapper[4904]: I1205 21:56:47.278172 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7dc867b75-jxjt2_0964d0c8-bed0-4c26-969d-c8e895793312/manager/0.log" Dec 05 21:56:47 crc kubenswrapper[4904]: I1205 21:56:47.279215 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-kr7mw_54f8b0f1-58e6-445e-a7ea-ba4eb6e298a9/manager/0.log" Dec 05 21:56:47 crc kubenswrapper[4904]: I1205 21:56:47.414330 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-7f6cb9b975-pjwn4_e0155f33-dd5a-4c6f-b261-2b7026149e9c/kube-rbac-proxy/0.log" Dec 05 21:56:47 crc kubenswrapper[4904]: I1205 21:56:47.556635 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-7f6cb9b975-pjwn4_e0155f33-dd5a-4c6f-b261-2b7026149e9c/manager/0.log" Dec 05 21:56:55 crc kubenswrapper[4904]: I1205 21:56:55.682919 4904 scope.go:117] "RemoveContainer" containerID="afe55cb29b17c79ab9c5d0540b6c9a9a3a478a257da4a8d2c668853c76a15473" Dec 05 21:56:55 crc kubenswrapper[4904]: E1205 21:56:55.683910 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 21:57:06 crc kubenswrapper[4904]: I1205 21:57:06.681197 4904 scope.go:117] "RemoveContainer" containerID="afe55cb29b17c79ab9c5d0540b6c9a9a3a478a257da4a8d2c668853c76a15473" Dec 05 21:57:06 crc kubenswrapper[4904]: I1205 21:57:06.908923 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" event={"ID":"1cc24b64-e25f-4b55-9123-295388685e7a","Type":"ContainerStarted","Data":"d2e9673559b3b94a8cab133eb4edbf277b9d48ecf8c2bd1511c1604c1ad703c4"} Dec 05 21:57:07 crc kubenswrapper[4904]: I1205 21:57:07.480446 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-888hj_efb530a0-f68c-4664-81bf-32871d3b8259/control-plane-machine-set-operator/0.log" Dec 05 21:57:07 crc kubenswrapper[4904]: I1205 21:57:07.627100 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-l928g_11ca5edb-7664-4e63-a9e8-46f270623ad2/machine-api-operator/0.log" Dec 05 21:57:07 crc kubenswrapper[4904]: I1205 21:57:07.631084 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-l928g_11ca5edb-7664-4e63-a9e8-46f270623ad2/kube-rbac-proxy/0.log" Dec 05 21:57:21 crc kubenswrapper[4904]: I1205 21:57:21.773358 4904 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-5gj92_c33ef1b8-c426-4845-ace3-476e7e7c842e/cert-manager-controller/0.log" Dec 05 21:57:21 crc kubenswrapper[4904]: I1205 21:57:21.900500 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-tfxc5_45ace5ad-d0b7-469f-b03c-62e935ba67dd/cert-manager-cainjector/0.log" Dec 05 21:57:21 crc kubenswrapper[4904]: I1205 21:57:21.978713 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-2d8zg_cdc9c672-2a99-47b6-8457-fd6ec79db49b/cert-manager-webhook/0.log" Dec 05 21:57:25 crc kubenswrapper[4904]: I1205 21:57:25.599963 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sskrd"] Dec 05 21:57:25 crc kubenswrapper[4904]: E1205 21:57:25.600970 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ba2355d-70dc-4868-b4e1-ad0bafcbdcc5" containerName="container-00" Dec 05 21:57:25 crc kubenswrapper[4904]: I1205 21:57:25.600983 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ba2355d-70dc-4868-b4e1-ad0bafcbdcc5" containerName="container-00" Dec 05 21:57:25 crc kubenswrapper[4904]: I1205 21:57:25.601194 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ba2355d-70dc-4868-b4e1-ad0bafcbdcc5" containerName="container-00" Dec 05 21:57:25 crc kubenswrapper[4904]: I1205 21:57:25.603044 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sskrd" Dec 05 21:57:25 crc kubenswrapper[4904]: I1205 21:57:25.615555 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sskrd"] Dec 05 21:57:25 crc kubenswrapper[4904]: I1205 21:57:25.674712 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ef5b5eb-ff66-493c-a72e-d52a3b463dfe-catalog-content\") pod \"certified-operators-sskrd\" (UID: \"1ef5b5eb-ff66-493c-a72e-d52a3b463dfe\") " pod="openshift-marketplace/certified-operators-sskrd" Dec 05 21:57:25 crc kubenswrapper[4904]: I1205 21:57:25.674752 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d4ws\" (UniqueName: \"kubernetes.io/projected/1ef5b5eb-ff66-493c-a72e-d52a3b463dfe-kube-api-access-4d4ws\") pod \"certified-operators-sskrd\" (UID: \"1ef5b5eb-ff66-493c-a72e-d52a3b463dfe\") " pod="openshift-marketplace/certified-operators-sskrd" Dec 05 21:57:25 crc kubenswrapper[4904]: I1205 21:57:25.674856 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ef5b5eb-ff66-493c-a72e-d52a3b463dfe-utilities\") pod \"certified-operators-sskrd\" (UID: \"1ef5b5eb-ff66-493c-a72e-d52a3b463dfe\") " pod="openshift-marketplace/certified-operators-sskrd" Dec 05 21:57:25 crc kubenswrapper[4904]: I1205 21:57:25.776586 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ef5b5eb-ff66-493c-a72e-d52a3b463dfe-catalog-content\") pod \"certified-operators-sskrd\" (UID: \"1ef5b5eb-ff66-493c-a72e-d52a3b463dfe\") " pod="openshift-marketplace/certified-operators-sskrd" Dec 05 21:57:25 crc kubenswrapper[4904]: I1205 21:57:25.776660 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-4d4ws\" (UniqueName: \"kubernetes.io/projected/1ef5b5eb-ff66-493c-a72e-d52a3b463dfe-kube-api-access-4d4ws\") pod \"certified-operators-sskrd\" (UID: \"1ef5b5eb-ff66-493c-a72e-d52a3b463dfe\") " pod="openshift-marketplace/certified-operators-sskrd" Dec 05 21:57:25 crc kubenswrapper[4904]: I1205 21:57:25.776775 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ef5b5eb-ff66-493c-a72e-d52a3b463dfe-utilities\") pod \"certified-operators-sskrd\" (UID: \"1ef5b5eb-ff66-493c-a72e-d52a3b463dfe\") " pod="openshift-marketplace/certified-operators-sskrd" Dec 05 21:57:25 crc kubenswrapper[4904]: I1205 21:57:25.777040 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ef5b5eb-ff66-493c-a72e-d52a3b463dfe-catalog-content\") pod \"certified-operators-sskrd\" (UID: \"1ef5b5eb-ff66-493c-a72e-d52a3b463dfe\") " pod="openshift-marketplace/certified-operators-sskrd" Dec 05 21:57:25 crc kubenswrapper[4904]: I1205 21:57:25.777074 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ef5b5eb-ff66-493c-a72e-d52a3b463dfe-utilities\") pod \"certified-operators-sskrd\" (UID: \"1ef5b5eb-ff66-493c-a72e-d52a3b463dfe\") " pod="openshift-marketplace/certified-operators-sskrd" Dec 05 21:57:25 crc kubenswrapper[4904]: I1205 21:57:25.795852 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d4ws\" (UniqueName: \"kubernetes.io/projected/1ef5b5eb-ff66-493c-a72e-d52a3b463dfe-kube-api-access-4d4ws\") pod \"certified-operators-sskrd\" (UID: \"1ef5b5eb-ff66-493c-a72e-d52a3b463dfe\") " pod="openshift-marketplace/certified-operators-sskrd" Dec 05 21:57:25 crc kubenswrapper[4904]: I1205 21:57:25.981601 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sskrd" Dec 05 21:57:26 crc kubenswrapper[4904]: I1205 21:57:26.553244 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sskrd"] Dec 05 21:57:27 crc kubenswrapper[4904]: I1205 21:57:27.134381 4904 generic.go:334] "Generic (PLEG): container finished" podID="1ef5b5eb-ff66-493c-a72e-d52a3b463dfe" containerID="8a60a16e7462833762ce46006343a2c06689821ce35d5cb356f99d9660776fa7" exitCode=0 Dec 05 21:57:27 crc kubenswrapper[4904]: I1205 21:57:27.134489 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sskrd" event={"ID":"1ef5b5eb-ff66-493c-a72e-d52a3b463dfe","Type":"ContainerDied","Data":"8a60a16e7462833762ce46006343a2c06689821ce35d5cb356f99d9660776fa7"} Dec 05 21:57:27 crc kubenswrapper[4904]: I1205 21:57:27.134771 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sskrd" event={"ID":"1ef5b5eb-ff66-493c-a72e-d52a3b463dfe","Type":"ContainerStarted","Data":"f5846593492413417ef206890068af8259bbeade4901031eef2528cebe372a69"} Dec 05 21:57:27 crc kubenswrapper[4904]: I1205 21:57:27.137992 4904 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 21:57:28 crc kubenswrapper[4904]: I1205 21:57:28.161467 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sskrd" event={"ID":"1ef5b5eb-ff66-493c-a72e-d52a3b463dfe","Type":"ContainerStarted","Data":"6961ae2c165b79dadd72c8d9aa1a22ebb6f46fc6ca93c632cbf86d27378b0dfe"} Dec 05 21:57:29 crc kubenswrapper[4904]: I1205 21:57:29.173704 4904 generic.go:334] "Generic (PLEG): container finished" podID="1ef5b5eb-ff66-493c-a72e-d52a3b463dfe" containerID="6961ae2c165b79dadd72c8d9aa1a22ebb6f46fc6ca93c632cbf86d27378b0dfe" exitCode=0 Dec 05 21:57:29 crc kubenswrapper[4904]: I1205 21:57:29.173764 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sskrd" event={"ID":"1ef5b5eb-ff66-493c-a72e-d52a3b463dfe","Type":"ContainerDied","Data":"6961ae2c165b79dadd72c8d9aa1a22ebb6f46fc6ca93c632cbf86d27378b0dfe"} Dec 05 21:57:30 crc kubenswrapper[4904]: I1205 21:57:30.186848 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sskrd" event={"ID":"1ef5b5eb-ff66-493c-a72e-d52a3b463dfe","Type":"ContainerStarted","Data":"38738ac800613180f1f4bdd90c1913da9c735ba5e66275de47de1067fbf64b58"} Dec 05 21:57:30 crc kubenswrapper[4904]: I1205 21:57:30.218822 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sskrd" podStartSLOduration=2.764777403 podStartE2EDuration="5.218768836s" podCreationTimestamp="2025-12-05 21:57:25 +0000 UTC" firstStartedPulling="2025-12-05 21:57:27.137624911 +0000 UTC m=+6345.948841040" lastFinishedPulling="2025-12-05 21:57:29.591616344 +0000 UTC m=+6348.402832473" observedRunningTime="2025-12-05 21:57:30.210719538 +0000 UTC m=+6349.021935657" watchObservedRunningTime="2025-12-05 21:57:30.218768836 +0000 UTC m=+6349.029984955" Dec 05 21:57:35 crc kubenswrapper[4904]: I1205 21:57:35.803464 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-lqpqd_ffe1cf37-52b4-4493-bf5b-f0318a5015a9/nmstate-console-plugin/0.log" Dec 05 21:57:35 crc kubenswrapper[4904]: I1205 21:57:35.981656 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/certified-operators-sskrd" Dec 05 21:57:35 crc kubenswrapper[4904]: I1205 21:57:35.981974 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sskrd" Dec 05 21:57:36 crc kubenswrapper[4904]: I1205 21:57:36.084786 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sskrd" Dec 05 21:57:36 crc kubenswrapper[4904]: I1205 21:57:36.113040 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-94lfj_b05e6bce-1fed-411b-9c7d-ea32260cb8dc/nmstate-metrics/0.log" Dec 05 21:57:36 crc kubenswrapper[4904]: I1205 21:57:36.250623 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-xg947_a66ea824-d482-41b2-8ddc-5ee70d24db5a/nmstate-operator/0.log" Dec 05 21:57:36 crc kubenswrapper[4904]: I1205 21:57:36.313269 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sskrd" Dec 05 21:57:36 crc kubenswrapper[4904]: I1205 21:57:36.359113 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sskrd"] Dec 05 21:57:36 crc kubenswrapper[4904]: I1205 21:57:36.463331 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-7dl4s_1f407364-e4d6-4506-abaa-f4e3ae5ab29f/nmstate-webhook/0.log" Dec 05 21:57:36 crc kubenswrapper[4904]: I1205 21:57:36.479677 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-8qf2w_b9bbd6ea-7829-4dfe-b4e0-1dcda3fe0903/nmstate-handler/0.log" Dec 05 21:57:36 crc kubenswrapper[4904]: I1205 21:57:36.487652 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-94lfj_b05e6bce-1fed-411b-9c7d-ea32260cb8dc/kube-rbac-proxy/0.log" Dec 05 21:57:38 crc kubenswrapper[4904]: I1205 21:57:38.263384 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sskrd" podUID="1ef5b5eb-ff66-493c-a72e-d52a3b463dfe" containerName="registry-server" containerID="cri-o://38738ac800613180f1f4bdd90c1913da9c735ba5e66275de47de1067fbf64b58" gracePeriod=2 Dec 05 21:57:38 crc kubenswrapper[4904]: I1205 21:57:38.720194 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sskrd" Dec 05 21:57:38 crc kubenswrapper[4904]: I1205 21:57:38.747215 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ef5b5eb-ff66-493c-a72e-d52a3b463dfe-utilities\") pod \"1ef5b5eb-ff66-493c-a72e-d52a3b463dfe\" (UID: \"1ef5b5eb-ff66-493c-a72e-d52a3b463dfe\") " Dec 05 21:57:38 crc kubenswrapper[4904]: I1205 21:57:38.747390 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ef5b5eb-ff66-493c-a72e-d52a3b463dfe-catalog-content\") pod \"1ef5b5eb-ff66-493c-a72e-d52a3b463dfe\" (UID: \"1ef5b5eb-ff66-493c-a72e-d52a3b463dfe\") " Dec 05 21:57:38 crc kubenswrapper[4904]: I1205 21:57:38.747447 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4ws\" (UniqueName: \"kubernetes.io/projected/1ef5b5eb-ff66-493c-a72e-d52a3b463dfe-kube-api-access-4d4ws\") pod \"1ef5b5eb-ff66-493c-a72e-d52a3b463dfe\" (UID: \"1ef5b5eb-ff66-493c-a72e-d52a3b463dfe\") " Dec 05 21:57:38 crc kubenswrapper[4904]: I1205 21:57:38.749544 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ef5b5eb-ff66-493c-a72e-d52a3b463dfe-utilities" (OuterVolumeSpecName: "utilities") pod "1ef5b5eb-ff66-493c-a72e-d52a3b463dfe" (UID: "1ef5b5eb-ff66-493c-a72e-d52a3b463dfe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:57:38 crc kubenswrapper[4904]: I1205 21:57:38.762055 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ef5b5eb-ff66-493c-a72e-d52a3b463dfe-kube-api-access-4d4ws" (OuterVolumeSpecName: "kube-api-access-4d4ws") pod "1ef5b5eb-ff66-493c-a72e-d52a3b463dfe" (UID: "1ef5b5eb-ff66-493c-a72e-d52a3b463dfe"). InnerVolumeSpecName "kube-api-access-4d4ws". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:57:38 crc kubenswrapper[4904]: I1205 21:57:38.812966 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ef5b5eb-ff66-493c-a72e-d52a3b463dfe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1ef5b5eb-ff66-493c-a72e-d52a3b463dfe" (UID: "1ef5b5eb-ff66-493c-a72e-d52a3b463dfe"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:57:38 crc kubenswrapper[4904]: I1205 21:57:38.851204 4904 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ef5b5eb-ff66-493c-a72e-d52a3b463dfe-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 21:57:38 crc kubenswrapper[4904]: I1205 21:57:38.851241 4904 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ef5b5eb-ff66-493c-a72e-d52a3b463dfe-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 21:57:38 crc kubenswrapper[4904]: I1205 21:57:38.851255 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4ws\" (UniqueName: \"kubernetes.io/projected/1ef5b5eb-ff66-493c-a72e-d52a3b463dfe-kube-api-access-4d4ws\") on node \"crc\" DevicePath \"\"" Dec 05 21:57:39 crc kubenswrapper[4904]: I1205 21:57:39.307019 4904 generic.go:334] "Generic (PLEG): container finished" podID="1ef5b5eb-ff66-493c-a72e-d52a3b463dfe" containerID="38738ac800613180f1f4bdd90c1913da9c735ba5e66275de47de1067fbf64b58" exitCode=0 Dec 05 21:57:39 crc kubenswrapper[4904]: I1205 21:57:39.307175 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sskrd" Dec 05 21:57:39 crc kubenswrapper[4904]: I1205 21:57:39.307217 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sskrd" event={"ID":"1ef5b5eb-ff66-493c-a72e-d52a3b463dfe","Type":"ContainerDied","Data":"38738ac800613180f1f4bdd90c1913da9c735ba5e66275de47de1067fbf64b58"} Dec 05 21:57:39 crc kubenswrapper[4904]: I1205 21:57:39.307522 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sskrd" event={"ID":"1ef5b5eb-ff66-493c-a72e-d52a3b463dfe","Type":"ContainerDied","Data":"f5846593492413417ef206890068af8259bbeade4901031eef2528cebe372a69"} Dec 05 21:57:39 crc kubenswrapper[4904]: I1205 21:57:39.307549 4904 scope.go:117] "RemoveContainer" containerID="38738ac800613180f1f4bdd90c1913da9c735ba5e66275de47de1067fbf64b58" Dec 05 21:57:39 crc kubenswrapper[4904]: I1205 21:57:39.331270 4904 scope.go:117] "RemoveContainer" containerID="6961ae2c165b79dadd72c8d9aa1a22ebb6f46fc6ca93c632cbf86d27378b0dfe" Dec 05 21:57:39 crc kubenswrapper[4904]: I1205 21:57:39.361679 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sskrd"] Dec 05 21:57:39 crc kubenswrapper[4904]: I1205 21:57:39.375136 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sskrd"] Dec 05 21:57:39 crc kubenswrapper[4904]: I1205 21:57:39.379217 4904 scope.go:117] "RemoveContainer" containerID="8a60a16e7462833762ce46006343a2c06689821ce35d5cb356f99d9660776fa7" Dec 05 21:57:39 crc kubenswrapper[4904]: I1205 21:57:39.422104 4904 scope.go:117] "RemoveContainer" containerID="38738ac800613180f1f4bdd90c1913da9c735ba5e66275de47de1067fbf64b58" Dec 05 21:57:39 crc kubenswrapper[4904]: E1205 21:57:39.423113 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38738ac800613180f1f4bdd90c1913da9c735ba5e66275de47de1067fbf64b58\": container with ID starting with 38738ac800613180f1f4bdd90c1913da9c735ba5e66275de47de1067fbf64b58 not found: ID does not exist" containerID="38738ac800613180f1f4bdd90c1913da9c735ba5e66275de47de1067fbf64b58" Dec 05 21:57:39 crc kubenswrapper[4904]: I1205 21:57:39.423173 
4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38738ac800613180f1f4bdd90c1913da9c735ba5e66275de47de1067fbf64b58"} err="failed to get container status \"38738ac800613180f1f4bdd90c1913da9c735ba5e66275de47de1067fbf64b58\": rpc error: code = NotFound desc = could not find container \"38738ac800613180f1f4bdd90c1913da9c735ba5e66275de47de1067fbf64b58\": container with ID starting with 38738ac800613180f1f4bdd90c1913da9c735ba5e66275de47de1067fbf64b58 not found: ID does not exist" Dec 05 21:57:39 crc kubenswrapper[4904]: I1205 21:57:39.423204 4904 scope.go:117] "RemoveContainer" containerID="6961ae2c165b79dadd72c8d9aa1a22ebb6f46fc6ca93c632cbf86d27378b0dfe" Dec 05 21:57:39 crc kubenswrapper[4904]: E1205 21:57:39.423626 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6961ae2c165b79dadd72c8d9aa1a22ebb6f46fc6ca93c632cbf86d27378b0dfe\": container with ID starting with 6961ae2c165b79dadd72c8d9aa1a22ebb6f46fc6ca93c632cbf86d27378b0dfe not found: ID does not exist" containerID="6961ae2c165b79dadd72c8d9aa1a22ebb6f46fc6ca93c632cbf86d27378b0dfe" Dec 05 21:57:39 crc kubenswrapper[4904]: I1205 21:57:39.423661 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6961ae2c165b79dadd72c8d9aa1a22ebb6f46fc6ca93c632cbf86d27378b0dfe"} err="failed to get container status \"6961ae2c165b79dadd72c8d9aa1a22ebb6f46fc6ca93c632cbf86d27378b0dfe\": rpc error: code = NotFound desc = could not find container \"6961ae2c165b79dadd72c8d9aa1a22ebb6f46fc6ca93c632cbf86d27378b0dfe\": container with ID starting with 6961ae2c165b79dadd72c8d9aa1a22ebb6f46fc6ca93c632cbf86d27378b0dfe not found: ID does not exist" Dec 05 21:57:39 crc kubenswrapper[4904]: I1205 21:57:39.423684 4904 scope.go:117] "RemoveContainer" containerID="8a60a16e7462833762ce46006343a2c06689821ce35d5cb356f99d9660776fa7" Dec 05 21:57:39 crc kubenswrapper[4904]: E1205 21:57:39.424032 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a60a16e7462833762ce46006343a2c06689821ce35d5cb356f99d9660776fa7\": container with ID starting with 8a60a16e7462833762ce46006343a2c06689821ce35d5cb356f99d9660776fa7 not found: ID does not exist" containerID="8a60a16e7462833762ce46006343a2c06689821ce35d5cb356f99d9660776fa7" Dec 05 21:57:39 crc kubenswrapper[4904]: I1205 21:57:39.424099 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a60a16e7462833762ce46006343a2c06689821ce35d5cb356f99d9660776fa7"} err="failed to get container status \"8a60a16e7462833762ce46006343a2c06689821ce35d5cb356f99d9660776fa7\": rpc error: code = NotFound desc = could not find container \"8a60a16e7462833762ce46006343a2c06689821ce35d5cb356f99d9660776fa7\": container with ID starting with 8a60a16e7462833762ce46006343a2c06689821ce35d5cb356f99d9660776fa7 not found: ID does not exist" Dec 05 21:57:39 crc kubenswrapper[4904]: I1205 21:57:39.694931 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ef5b5eb-ff66-493c-a72e-d52a3b463dfe" path="/var/lib/kubelet/pods/1ef5b5eb-ff66-493c-a72e-d52a3b463dfe/volumes" Dec 05 21:57:51 crc kubenswrapper[4904]: I1205 21:57:51.738485 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-v26v8_2125e0a4-9809-42b4-911f-08c6d2e74879/kube-rbac-proxy/0.log" Dec 05 21:57:51 crc kubenswrapper[4904]: I1205 21:57:51.833935 4904 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-v26v8_2125e0a4-9809-42b4-911f-08c6d2e74879/controller/0.log" Dec 05 21:57:51 crc kubenswrapper[4904]: I1205 21:57:51.943217 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hz9g9_1f78e03d-db28-4ba2-830f-189c97051d36/cp-frr-files/0.log" Dec 05 21:57:52 crc kubenswrapper[4904]: I1205 21:57:52.130831 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hz9g9_1f78e03d-db28-4ba2-830f-189c97051d36/cp-reloader/0.log" Dec 05 21:57:52 crc kubenswrapper[4904]: I1205 21:57:52.151574 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hz9g9_1f78e03d-db28-4ba2-830f-189c97051d36/cp-metrics/0.log" Dec 05 21:57:52 crc kubenswrapper[4904]: I1205 21:57:52.157344 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hz9g9_1f78e03d-db28-4ba2-830f-189c97051d36/cp-frr-files/0.log" Dec 05 21:57:52 crc kubenswrapper[4904]: I1205 21:57:52.195552 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hz9g9_1f78e03d-db28-4ba2-830f-189c97051d36/cp-reloader/0.log" Dec 05 21:57:52 crc kubenswrapper[4904]: I1205 21:57:52.358097 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hz9g9_1f78e03d-db28-4ba2-830f-189c97051d36/cp-metrics/0.log" Dec 05 21:57:52 crc kubenswrapper[4904]: I1205 21:57:52.395905 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hz9g9_1f78e03d-db28-4ba2-830f-189c97051d36/cp-frr-files/0.log" Dec 05 21:57:52 crc kubenswrapper[4904]: I1205 21:57:52.396037 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hz9g9_1f78e03d-db28-4ba2-830f-189c97051d36/cp-metrics/0.log" Dec 05 21:57:52 crc kubenswrapper[4904]: I1205 21:57:52.413747 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hz9g9_1f78e03d-db28-4ba2-830f-189c97051d36/cp-reloader/0.log" Dec 05 21:57:52 crc kubenswrapper[4904]: I1205 21:57:52.766941 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hz9g9_1f78e03d-db28-4ba2-830f-189c97051d36/cp-frr-files/0.log" Dec 05 21:57:52 crc kubenswrapper[4904]: I1205 21:57:52.812364 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hz9g9_1f78e03d-db28-4ba2-830f-189c97051d36/controller/0.log" Dec 05 21:57:52 crc kubenswrapper[4904]: I1205 21:57:52.820129 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hz9g9_1f78e03d-db28-4ba2-830f-189c97051d36/cp-reloader/0.log" Dec 05 21:57:52 crc kubenswrapper[4904]: I1205 21:57:52.828011 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hz9g9_1f78e03d-db28-4ba2-830f-189c97051d36/cp-metrics/0.log" Dec 05 21:57:52 crc kubenswrapper[4904]: I1205 21:57:52.996445 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hz9g9_1f78e03d-db28-4ba2-830f-189c97051d36/kube-rbac-proxy/0.log" Dec 05 21:57:53 crc kubenswrapper[4904]: I1205 21:57:53.020161 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hz9g9_1f78e03d-db28-4ba2-830f-189c97051d36/frr-metrics/0.log" Dec 05 21:57:53 crc kubenswrapper[4904]: I1205 21:57:53.070513 4904 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-hz9g9_1f78e03d-db28-4ba2-830f-189c97051d36/kube-rbac-proxy-frr/0.log" Dec 05 21:57:53 crc kubenswrapper[4904]: I1205 21:57:53.298415 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hz9g9_1f78e03d-db28-4ba2-830f-189c97051d36/reloader/0.log" Dec 05 21:57:53 crc kubenswrapper[4904]: I1205 21:57:53.301430 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-dpnqq_3b7fd88f-360a-4c89-8740-069bc371b65b/frr-k8s-webhook-server/0.log" Dec 05 21:57:53 crc kubenswrapper[4904]: I1205 21:57:53.547878 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-58cc54b6b6-b77qk_54d3f8f4-a4d9-4f3c-b923-f64d6c4f1b28/manager/0.log" Dec 05 21:57:53 crc kubenswrapper[4904]: I1205 21:57:53.631454 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5595c7bb55-zxmp8_2230230c-8a27-44a4-a63a-219e0e40f288/webhook-server/0.log" Dec 05 21:57:53 crc kubenswrapper[4904]: I1205 21:57:53.754007 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-mvkbj_080b347a-f590-47cf-909f-578330838c1d/kube-rbac-proxy/0.log" Dec 05 21:57:54 crc kubenswrapper[4904]: I1205 21:57:54.264753 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-mvkbj_080b347a-f590-47cf-909f-578330838c1d/speaker/0.log" Dec 05 21:57:54 crc kubenswrapper[4904]: I1205 21:57:54.529237 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hz9g9_1f78e03d-db28-4ba2-830f-189c97051d36/frr/0.log" Dec 05 21:58:07 crc kubenswrapper[4904]: I1205 21:58:07.851108 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmmlzw_397a4321-b8b2-4041-9219-a5f837937346/util/0.log" Dec 05 21:58:08 crc kubenswrapper[4904]: I1205 21:58:08.034515 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmmlzw_397a4321-b8b2-4041-9219-a5f837937346/util/0.log" Dec 05 21:58:08 crc kubenswrapper[4904]: I1205 21:58:08.037171 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmmlzw_397a4321-b8b2-4041-9219-a5f837937346/pull/0.log" Dec 05 21:58:08 crc kubenswrapper[4904]: I1205 21:58:08.051839 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmmlzw_397a4321-b8b2-4041-9219-a5f837937346/pull/0.log" Dec 05 21:58:08 crc kubenswrapper[4904]: I1205 21:58:08.296657 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmmlzw_397a4321-b8b2-4041-9219-a5f837937346/extract/0.log" Dec 05 21:58:08 crc kubenswrapper[4904]: I1205 21:58:08.299264 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmmlzw_397a4321-b8b2-4041-9219-a5f837937346/pull/0.log" Dec 05 21:58:08 crc kubenswrapper[4904]: I1205 21:58:08.302362 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmmlzw_397a4321-b8b2-4041-9219-a5f837937346/util/0.log" Dec 05 21:58:08 crc 
kubenswrapper[4904]: I1205 21:58:08.469304 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d_5b90a283-7232-45a4-9326-e96a19d446fa/util/0.log" Dec 05 21:58:08 crc kubenswrapper[4904]: I1205 21:58:08.631499 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d_5b90a283-7232-45a4-9326-e96a19d446fa/util/0.log" Dec 05 21:58:08 crc kubenswrapper[4904]: I1205 21:58:08.669386 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d_5b90a283-7232-45a4-9326-e96a19d446fa/pull/0.log" Dec 05 21:58:08 crc kubenswrapper[4904]: I1205 21:58:08.696357 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d_5b90a283-7232-45a4-9326-e96a19d446fa/pull/0.log" Dec 05 21:58:08 crc kubenswrapper[4904]: I1205 21:58:08.827243 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d_5b90a283-7232-45a4-9326-e96a19d446fa/util/0.log" Dec 05 21:58:08 crc kubenswrapper[4904]: I1205 21:58:08.864222 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d_5b90a283-7232-45a4-9326-e96a19d446fa/pull/0.log" Dec 05 21:58:08 crc kubenswrapper[4904]: I1205 21:58:08.884479 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jm86d_5b90a283-7232-45a4-9326-e96a19d446fa/extract/0.log" Dec 05 21:58:09 crc kubenswrapper[4904]: I1205 21:58:09.045543 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83khk26_074d2553-b276-41f9-ae52-d37209d033e3/util/0.log" Dec 05 21:58:09 crc kubenswrapper[4904]: I1205 21:58:09.211760 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83khk26_074d2553-b276-41f9-ae52-d37209d033e3/util/0.log" Dec 05 21:58:09 crc kubenswrapper[4904]: I1205 21:58:09.223794 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83khk26_074d2553-b276-41f9-ae52-d37209d033e3/pull/0.log" Dec 05 21:58:09 crc kubenswrapper[4904]: I1205 21:58:09.228123 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83khk26_074d2553-b276-41f9-ae52-d37209d033e3/pull/0.log" Dec 05 21:58:09 crc kubenswrapper[4904]: I1205 21:58:09.360437 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83khk26_074d2553-b276-41f9-ae52-d37209d033e3/util/0.log" Dec 05 21:58:09 crc kubenswrapper[4904]: I1205 21:58:09.385756 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83khk26_074d2553-b276-41f9-ae52-d37209d033e3/extract/0.log" Dec 05 21:58:09 crc kubenswrapper[4904]: I1205 21:58:09.394021 4904 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83khk26_074d2553-b276-41f9-ae52-d37209d033e3/pull/0.log" Dec 05 21:58:09 crc kubenswrapper[4904]: I1205 21:58:09.520988 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qf2xk_4912f2a7-ae28-4ec6-a674-813c38c327c0/extract-utilities/0.log" Dec 05 21:58:09 crc kubenswrapper[4904]: I1205 21:58:09.681031 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qf2xk_4912f2a7-ae28-4ec6-a674-813c38c327c0/extract-content/0.log" Dec 05 21:58:09 crc kubenswrapper[4904]: I1205 21:58:09.722703 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qf2xk_4912f2a7-ae28-4ec6-a674-813c38c327c0/extract-content/0.log" Dec 05 21:58:09 crc kubenswrapper[4904]: I1205 21:58:09.725475 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qf2xk_4912f2a7-ae28-4ec6-a674-813c38c327c0/extract-utilities/0.log" Dec 05 21:58:09 crc kubenswrapper[4904]: I1205 21:58:09.886637 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qf2xk_4912f2a7-ae28-4ec6-a674-813c38c327c0/extract-utilities/0.log" Dec 05 21:58:09 crc kubenswrapper[4904]: I1205 21:58:09.905714 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qf2xk_4912f2a7-ae28-4ec6-a674-813c38c327c0/extract-content/0.log" Dec 05 21:58:10 crc kubenswrapper[4904]: I1205 21:58:10.054924 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fhhfn_81188cfd-eef2-4e0b-b04a-fd189da456d2/extract-utilities/0.log" Dec 05 21:58:10 crc kubenswrapper[4904]: I1205 21:58:10.377772 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fhhfn_81188cfd-eef2-4e0b-b04a-fd189da456d2/extract-utilities/0.log" Dec 05 21:58:10 crc kubenswrapper[4904]: I1205 21:58:10.389899 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fhhfn_81188cfd-eef2-4e0b-b04a-fd189da456d2/extract-content/0.log" Dec 05 21:58:10 crc kubenswrapper[4904]: I1205 21:58:10.404778 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fhhfn_81188cfd-eef2-4e0b-b04a-fd189da456d2/extract-content/0.log" Dec 05 21:58:10 crc kubenswrapper[4904]: I1205 21:58:10.618518 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qf2xk_4912f2a7-ae28-4ec6-a674-813c38c327c0/registry-server/0.log" Dec 05 21:58:10 crc kubenswrapper[4904]: I1205 21:58:10.702196 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fhhfn_81188cfd-eef2-4e0b-b04a-fd189da456d2/extract-utilities/0.log" Dec 05 21:58:10 crc kubenswrapper[4904]: I1205 21:58:10.706929 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fhhfn_81188cfd-eef2-4e0b-b04a-fd189da456d2/extract-content/0.log" Dec 05 21:58:10 crc kubenswrapper[4904]: I1205 21:58:10.983075 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-7wzh2_425f2b0f-3e5c-4db8-95f2-e0ae6581a443/marketplace-operator/0.log" Dec 05 21:58:11 crc kubenswrapper[4904]: I1205 21:58:11.184034 4904 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ffsws_3124ecfc-d81f-468a-89e6-1b76cdf4e61e/extract-utilities/0.log" Dec 05 21:58:11 crc kubenswrapper[4904]: I1205 21:58:11.355415 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ffsws_3124ecfc-d81f-468a-89e6-1b76cdf4e61e/extract-utilities/0.log" Dec 05 21:58:11 crc kubenswrapper[4904]: I1205 21:58:11.415852 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ffsws_3124ecfc-d81f-468a-89e6-1b76cdf4e61e/extract-content/0.log" Dec 05 21:58:11 crc kubenswrapper[4904]: I1205 21:58:11.462826 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ffsws_3124ecfc-d81f-468a-89e6-1b76cdf4e61e/extract-content/0.log" Dec 05 21:58:11 crc kubenswrapper[4904]: I1205 21:58:11.502097 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fhhfn_81188cfd-eef2-4e0b-b04a-fd189da456d2/registry-server/0.log" Dec 05 21:58:11 crc kubenswrapper[4904]: I1205 21:58:11.628641 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ffsws_3124ecfc-d81f-468a-89e6-1b76cdf4e61e/extract-content/0.log" Dec 05 21:58:11 crc kubenswrapper[4904]: I1205 21:58:11.647808 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ffsws_3124ecfc-d81f-468a-89e6-1b76cdf4e61e/extract-utilities/0.log" Dec 05 21:58:11 crc kubenswrapper[4904]: I1205 21:58:11.781239 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-957zc_f1995f9a-4194-4318-b9ab-b30b6c01ac51/extract-utilities/0.log" Dec 05 21:58:11 crc kubenswrapper[4904]: I1205 21:58:11.836506 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ffsws_3124ecfc-d81f-468a-89e6-1b76cdf4e61e/registry-server/0.log" Dec 05 21:58:11 crc kubenswrapper[4904]: I1205 21:58:11.980078 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-957zc_f1995f9a-4194-4318-b9ab-b30b6c01ac51/extract-utilities/0.log" Dec 05 21:58:11 crc kubenswrapper[4904]: I1205 21:58:11.980316 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-957zc_f1995f9a-4194-4318-b9ab-b30b6c01ac51/extract-content/0.log" Dec 05 21:58:11 crc kubenswrapper[4904]: I1205 21:58:11.998634 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-957zc_f1995f9a-4194-4318-b9ab-b30b6c01ac51/extract-content/0.log" Dec 05 21:58:12 crc kubenswrapper[4904]: I1205 21:58:12.180552 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-957zc_f1995f9a-4194-4318-b9ab-b30b6c01ac51/extract-content/0.log" Dec 05 21:58:12 crc kubenswrapper[4904]: I1205 21:58:12.195942 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-957zc_f1995f9a-4194-4318-b9ab-b30b6c01ac51/extract-utilities/0.log" Dec 05 21:58:12 crc kubenswrapper[4904]: I1205 21:58:12.899204 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-957zc_f1995f9a-4194-4318-b9ab-b30b6c01ac51/registry-server/0.log" Dec 05 21:58:24 crc kubenswrapper[4904]: I1205 21:58:24.694869 4904 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-52c6d_c290c96d-3acd-4374-91d7-20efcef53eda/prometheus-operator/0.log" Dec 05 21:58:24 crc kubenswrapper[4904]: I1205 21:58:24.879964 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6bdfbd597-b7v44_a54452d0-1ba5-4b81-aab4-2e2f2293fa6b/prometheus-operator-admission-webhook/0.log" Dec 05 21:58:24 crc kubenswrapper[4904]: I1205 21:58:24.932483 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6bdfbd597-qxbbm_6b774de0-c1b7-43c1-86b5-b444cc0275d4/prometheus-operator-admission-webhook/0.log" Dec 05 21:58:25 crc kubenswrapper[4904]: I1205 21:58:25.093624 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-vg7c2_14ec1958-2889-4fef-90ee-e73296264291/operator/0.log" Dec 05 21:58:25 crc kubenswrapper[4904]: I1205 21:58:25.114155 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-8pj8n_7d6cdfc6-4b45-4cc3-98c1-943ee5e7cf3b/perses-operator/0.log" Dec 05 21:58:57 crc kubenswrapper[4904]: I1205 21:58:57.868270 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7wvtk"] Dec 05 21:58:57 crc kubenswrapper[4904]: E1205 21:58:57.869572 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ef5b5eb-ff66-493c-a72e-d52a3b463dfe" containerName="extract-content" Dec 05 21:58:57 crc kubenswrapper[4904]: I1205 21:58:57.869595 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ef5b5eb-ff66-493c-a72e-d52a3b463dfe" containerName="extract-content" Dec 05 21:58:57 crc kubenswrapper[4904]: E1205 21:58:57.869625 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ef5b5eb-ff66-493c-a72e-d52a3b463dfe" containerName="registry-server" Dec 05 21:58:57 crc kubenswrapper[4904]: I1205 21:58:57.869662 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ef5b5eb-ff66-493c-a72e-d52a3b463dfe" containerName="registry-server" Dec 05 21:58:57 crc kubenswrapper[4904]: E1205 21:58:57.869712 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ef5b5eb-ff66-493c-a72e-d52a3b463dfe" containerName="extract-utilities" Dec 05 21:58:57 crc kubenswrapper[4904]: I1205 21:58:57.869725 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ef5b5eb-ff66-493c-a72e-d52a3b463dfe" containerName="extract-utilities" Dec 05 21:58:57 crc kubenswrapper[4904]: I1205 21:58:57.870148 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ef5b5eb-ff66-493c-a72e-d52a3b463dfe" containerName="registry-server" Dec 05 21:58:57 crc kubenswrapper[4904]: I1205 21:58:57.872776 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7wvtk" Dec 05 21:58:57 crc kubenswrapper[4904]: I1205 21:58:57.886749 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7wvtk"] Dec 05 21:58:57 crc kubenswrapper[4904]: I1205 21:58:57.984754 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2c9j\" (UniqueName: \"kubernetes.io/projected/76ff09a6-b1f4-4794-b656-edfc2f50214a-kube-api-access-s2c9j\") pod \"community-operators-7wvtk\" (UID: \"76ff09a6-b1f4-4794-b656-edfc2f50214a\") " pod="openshift-marketplace/community-operators-7wvtk" Dec 05 21:58:57 crc kubenswrapper[4904]: I1205 21:58:57.984961 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76ff09a6-b1f4-4794-b656-edfc2f50214a-utilities\") pod \"community-operators-7wvtk\" (UID: \"76ff09a6-b1f4-4794-b656-edfc2f50214a\") " pod="openshift-marketplace/community-operators-7wvtk" Dec 05 21:58:57 crc kubenswrapper[4904]: I1205 21:58:57.985019 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76ff09a6-b1f4-4794-b656-edfc2f50214a-catalog-content\") pod \"community-operators-7wvtk\" (UID: \"76ff09a6-b1f4-4794-b656-edfc2f50214a\") " pod="openshift-marketplace/community-operators-7wvtk" Dec 05 21:58:58 crc kubenswrapper[4904]: I1205 21:58:58.086872 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76ff09a6-b1f4-4794-b656-edfc2f50214a-utilities\") pod \"community-operators-7wvtk\" (UID: \"76ff09a6-b1f4-4794-b656-edfc2f50214a\") " pod="openshift-marketplace/community-operators-7wvtk" Dec 05 21:58:58 crc kubenswrapper[4904]: I1205 21:58:58.086955 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76ff09a6-b1f4-4794-b656-edfc2f50214a-catalog-content\") pod \"community-operators-7wvtk\" (UID: \"76ff09a6-b1f4-4794-b656-edfc2f50214a\") " pod="openshift-marketplace/community-operators-7wvtk" Dec 05 21:58:58 crc kubenswrapper[4904]: I1205 21:58:58.087088 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2c9j\" (UniqueName: \"kubernetes.io/projected/76ff09a6-b1f4-4794-b656-edfc2f50214a-kube-api-access-s2c9j\") pod \"community-operators-7wvtk\" (UID: \"76ff09a6-b1f4-4794-b656-edfc2f50214a\") " pod="openshift-marketplace/community-operators-7wvtk" Dec 05 21:58:58 crc kubenswrapper[4904]: I1205 21:58:58.087654 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76ff09a6-b1f4-4794-b656-edfc2f50214a-utilities\") pod \"community-operators-7wvtk\" (UID: \"76ff09a6-b1f4-4794-b656-edfc2f50214a\") " pod="openshift-marketplace/community-operators-7wvtk" Dec 05 21:58:58 crc kubenswrapper[4904]: I1205 21:58:58.087734 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76ff09a6-b1f4-4794-b656-edfc2f50214a-catalog-content\") pod \"community-operators-7wvtk\" (UID: \"76ff09a6-b1f4-4794-b656-edfc2f50214a\") " pod="openshift-marketplace/community-operators-7wvtk" Dec 05 21:58:58 crc kubenswrapper[4904]: I1205 21:58:58.112456 4904 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-s2c9j\" (UniqueName: \"kubernetes.io/projected/76ff09a6-b1f4-4794-b656-edfc2f50214a-kube-api-access-s2c9j\") pod \"community-operators-7wvtk\" (UID: \"76ff09a6-b1f4-4794-b656-edfc2f50214a\") " pod="openshift-marketplace/community-operators-7wvtk" Dec 05 21:58:58 crc kubenswrapper[4904]: I1205 21:58:58.191492 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7wvtk" Dec 05 21:58:58 crc kubenswrapper[4904]: I1205 21:58:58.805089 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7wvtk"] Dec 05 21:58:58 crc kubenswrapper[4904]: W1205 21:58:58.821210 4904 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76ff09a6_b1f4_4794_b656_edfc2f50214a.slice/crio-c01d7bf05527033ea0605e911018bc4be70612e3ff9c91037bd1f1478543ed28 WatchSource:0}: Error finding container c01d7bf05527033ea0605e911018bc4be70612e3ff9c91037bd1f1478543ed28: Status 404 returned error can't find the container with id c01d7bf05527033ea0605e911018bc4be70612e3ff9c91037bd1f1478543ed28 Dec 05 21:58:59 crc kubenswrapper[4904]: I1205 21:58:59.160114 4904 generic.go:334] "Generic (PLEG): container finished" podID="76ff09a6-b1f4-4794-b656-edfc2f50214a" containerID="8e31ae5cd07bf549f09c6554c94e39df2927e6185db7abebc2ae07b7c1742ffb" exitCode=0 Dec 05 21:58:59 crc kubenswrapper[4904]: I1205 21:58:59.160166 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7wvtk" event={"ID":"76ff09a6-b1f4-4794-b656-edfc2f50214a","Type":"ContainerDied","Data":"8e31ae5cd07bf549f09c6554c94e39df2927e6185db7abebc2ae07b7c1742ffb"} Dec 05 21:58:59 crc kubenswrapper[4904]: I1205 21:58:59.160224 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7wvtk" event={"ID":"76ff09a6-b1f4-4794-b656-edfc2f50214a","Type":"ContainerStarted","Data":"c01d7bf05527033ea0605e911018bc4be70612e3ff9c91037bd1f1478543ed28"} Dec 05 21:59:00 crc kubenswrapper[4904]: I1205 21:59:00.172942 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7wvtk" event={"ID":"76ff09a6-b1f4-4794-b656-edfc2f50214a","Type":"ContainerStarted","Data":"12448d443119edd368cbc9751740f20f03a16ea1aa8811e76229ed4cd85171ac"} Dec 05 21:59:01 crc kubenswrapper[4904]: I1205 21:59:01.193375 4904 generic.go:334] "Generic (PLEG): container finished" podID="76ff09a6-b1f4-4794-b656-edfc2f50214a" containerID="12448d443119edd368cbc9751740f20f03a16ea1aa8811e76229ed4cd85171ac" exitCode=0 Dec 05 21:59:01 crc kubenswrapper[4904]: I1205 21:59:01.194244 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7wvtk" event={"ID":"76ff09a6-b1f4-4794-b656-edfc2f50214a","Type":"ContainerDied","Data":"12448d443119edd368cbc9751740f20f03a16ea1aa8811e76229ed4cd85171ac"} Dec 05 21:59:02 crc kubenswrapper[4904]: I1205 21:59:02.205783 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7wvtk" event={"ID":"76ff09a6-b1f4-4794-b656-edfc2f50214a","Type":"ContainerStarted","Data":"8334ca0add96b57b193b8f3cd808bded07276dfe906b56a08bf6f639a6757667"} Dec 05 21:59:08 crc kubenswrapper[4904]: I1205 21:59:08.192396 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7wvtk" Dec 05 21:59:08 crc 
kubenswrapper[4904]: I1205 21:59:08.192835 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7wvtk" Dec 05 21:59:08 crc kubenswrapper[4904]: I1205 21:59:08.248660 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7wvtk" Dec 05 21:59:08 crc kubenswrapper[4904]: I1205 21:59:08.276913 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7wvtk" podStartSLOduration=8.816113394 podStartE2EDuration="11.276884413s" podCreationTimestamp="2025-12-05 21:58:57 +0000 UTC" firstStartedPulling="2025-12-05 21:58:59.161924001 +0000 UTC m=+6437.973140120" lastFinishedPulling="2025-12-05 21:59:01.62269499 +0000 UTC m=+6440.433911139" observedRunningTime="2025-12-05 21:59:02.225422077 +0000 UTC m=+6441.036638216" watchObservedRunningTime="2025-12-05 21:59:08.276884413 +0000 UTC m=+6447.088100522" Dec 05 21:59:08 crc kubenswrapper[4904]: I1205 21:59:08.370353 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7wvtk" Dec 05 21:59:08 crc kubenswrapper[4904]: I1205 21:59:08.495182 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7wvtk"] Dec 05 21:59:10 crc kubenswrapper[4904]: I1205 21:59:10.320613 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7wvtk" podUID="76ff09a6-b1f4-4794-b656-edfc2f50214a" containerName="registry-server" containerID="cri-o://8334ca0add96b57b193b8f3cd808bded07276dfe906b56a08bf6f639a6757667" gracePeriod=2 Dec 05 21:59:10 crc kubenswrapper[4904]: I1205 21:59:10.828316 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7wvtk" Dec 05 21:59:10 crc kubenswrapper[4904]: I1205 21:59:10.941747 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2c9j\" (UniqueName: \"kubernetes.io/projected/76ff09a6-b1f4-4794-b656-edfc2f50214a-kube-api-access-s2c9j\") pod \"76ff09a6-b1f4-4794-b656-edfc2f50214a\" (UID: \"76ff09a6-b1f4-4794-b656-edfc2f50214a\") " Dec 05 21:59:10 crc kubenswrapper[4904]: I1205 21:59:10.942218 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76ff09a6-b1f4-4794-b656-edfc2f50214a-catalog-content\") pod \"76ff09a6-b1f4-4794-b656-edfc2f50214a\" (UID: \"76ff09a6-b1f4-4794-b656-edfc2f50214a\") " Dec 05 21:59:10 crc kubenswrapper[4904]: I1205 21:59:10.942412 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76ff09a6-b1f4-4794-b656-edfc2f50214a-utilities\") pod \"76ff09a6-b1f4-4794-b656-edfc2f50214a\" (UID: \"76ff09a6-b1f4-4794-b656-edfc2f50214a\") " Dec 05 21:59:10 crc kubenswrapper[4904]: I1205 21:59:10.944215 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76ff09a6-b1f4-4794-b656-edfc2f50214a-utilities" (OuterVolumeSpecName: "utilities") pod "76ff09a6-b1f4-4794-b656-edfc2f50214a" (UID: "76ff09a6-b1f4-4794-b656-edfc2f50214a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:59:10 crc kubenswrapper[4904]: I1205 21:59:10.951531 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76ff09a6-b1f4-4794-b656-edfc2f50214a-kube-api-access-s2c9j" (OuterVolumeSpecName: "kube-api-access-s2c9j") pod "76ff09a6-b1f4-4794-b656-edfc2f50214a" (UID: "76ff09a6-b1f4-4794-b656-edfc2f50214a"). InnerVolumeSpecName "kube-api-access-s2c9j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:59:11 crc kubenswrapper[4904]: I1205 21:59:11.014260 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76ff09a6-b1f4-4794-b656-edfc2f50214a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "76ff09a6-b1f4-4794-b656-edfc2f50214a" (UID: "76ff09a6-b1f4-4794-b656-edfc2f50214a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:59:11 crc kubenswrapper[4904]: I1205 21:59:11.045354 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2c9j\" (UniqueName: \"kubernetes.io/projected/76ff09a6-b1f4-4794-b656-edfc2f50214a-kube-api-access-s2c9j\") on node \"crc\" DevicePath \"\"" Dec 05 21:59:11 crc kubenswrapper[4904]: I1205 21:59:11.045385 4904 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76ff09a6-b1f4-4794-b656-edfc2f50214a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 21:59:11 crc kubenswrapper[4904]: I1205 21:59:11.045396 4904 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76ff09a6-b1f4-4794-b656-edfc2f50214a-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 21:59:11 crc kubenswrapper[4904]: I1205 21:59:11.336717 4904 generic.go:334] "Generic (PLEG): container finished" podID="76ff09a6-b1f4-4794-b656-edfc2f50214a" containerID="8334ca0add96b57b193b8f3cd808bded07276dfe906b56a08bf6f639a6757667" exitCode=0 Dec 05 21:59:11 crc kubenswrapper[4904]: I1205 21:59:11.336848 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7wvtk" Dec 05 21:59:11 crc kubenswrapper[4904]: I1205 21:59:11.336847 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7wvtk" event={"ID":"76ff09a6-b1f4-4794-b656-edfc2f50214a","Type":"ContainerDied","Data":"8334ca0add96b57b193b8f3cd808bded07276dfe906b56a08bf6f639a6757667"} Dec 05 21:59:11 crc kubenswrapper[4904]: I1205 21:59:11.337391 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7wvtk" event={"ID":"76ff09a6-b1f4-4794-b656-edfc2f50214a","Type":"ContainerDied","Data":"c01d7bf05527033ea0605e911018bc4be70612e3ff9c91037bd1f1478543ed28"} Dec 05 21:59:11 crc kubenswrapper[4904]: I1205 21:59:11.337481 4904 scope.go:117] "RemoveContainer" containerID="8334ca0add96b57b193b8f3cd808bded07276dfe906b56a08bf6f639a6757667" Dec 05 21:59:11 crc kubenswrapper[4904]: I1205 21:59:11.372048 4904 scope.go:117] "RemoveContainer" containerID="12448d443119edd368cbc9751740f20f03a16ea1aa8811e76229ed4cd85171ac" Dec 05 21:59:11 crc kubenswrapper[4904]: I1205 21:59:11.417671 4904 scope.go:117] "RemoveContainer" containerID="8e31ae5cd07bf549f09c6554c94e39df2927e6185db7abebc2ae07b7c1742ffb" Dec 05 21:59:11 crc kubenswrapper[4904]: I1205 21:59:11.438210 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7wvtk"] Dec 05 21:59:11 crc kubenswrapper[4904]: I1205 21:59:11.449822 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7wvtk"] Dec 05 21:59:11 crc kubenswrapper[4904]: I1205 21:59:11.481666 4904 scope.go:117] "RemoveContainer" containerID="8334ca0add96b57b193b8f3cd808bded07276dfe906b56a08bf6f639a6757667" Dec 05 21:59:11 crc kubenswrapper[4904]: E1205 21:59:11.482359 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8334ca0add96b57b193b8f3cd808bded07276dfe906b56a08bf6f639a6757667\": container with ID starting with 8334ca0add96b57b193b8f3cd808bded07276dfe906b56a08bf6f639a6757667 not found: ID does not exist" containerID="8334ca0add96b57b193b8f3cd808bded07276dfe906b56a08bf6f639a6757667" Dec 05 21:59:11 crc kubenswrapper[4904]: I1205 21:59:11.482412 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8334ca0add96b57b193b8f3cd808bded07276dfe906b56a08bf6f639a6757667"} err="failed to get container status \"8334ca0add96b57b193b8f3cd808bded07276dfe906b56a08bf6f639a6757667\": rpc error: code = NotFound desc = could not find container \"8334ca0add96b57b193b8f3cd808bded07276dfe906b56a08bf6f639a6757667\": container with ID starting with 8334ca0add96b57b193b8f3cd808bded07276dfe906b56a08bf6f639a6757667 not found: ID does not exist" Dec 05 21:59:11 crc kubenswrapper[4904]: I1205 21:59:11.482449 4904 scope.go:117] "RemoveContainer" containerID="12448d443119edd368cbc9751740f20f03a16ea1aa8811e76229ed4cd85171ac" Dec 05 21:59:11 crc kubenswrapper[4904]: E1205 21:59:11.482930 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12448d443119edd368cbc9751740f20f03a16ea1aa8811e76229ed4cd85171ac\": container with ID starting with 12448d443119edd368cbc9751740f20f03a16ea1aa8811e76229ed4cd85171ac not found: ID does not exist" containerID="12448d443119edd368cbc9751740f20f03a16ea1aa8811e76229ed4cd85171ac" Dec 05 21:59:11 crc kubenswrapper[4904]: I1205 21:59:11.482986 4904 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12448d443119edd368cbc9751740f20f03a16ea1aa8811e76229ed4cd85171ac"} err="failed to get container status \"12448d443119edd368cbc9751740f20f03a16ea1aa8811e76229ed4cd85171ac\": rpc error: code = NotFound desc = could not find container \"12448d443119edd368cbc9751740f20f03a16ea1aa8811e76229ed4cd85171ac\": container with ID starting with 12448d443119edd368cbc9751740f20f03a16ea1aa8811e76229ed4cd85171ac not found: ID does not exist" Dec 05 21:59:11 crc kubenswrapper[4904]: I1205 21:59:11.483023 4904 scope.go:117] "RemoveContainer" containerID="8e31ae5cd07bf549f09c6554c94e39df2927e6185db7abebc2ae07b7c1742ffb" Dec 05 21:59:11 crc kubenswrapper[4904]: E1205 21:59:11.483594 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e31ae5cd07bf549f09c6554c94e39df2927e6185db7abebc2ae07b7c1742ffb\": container with ID starting with 8e31ae5cd07bf549f09c6554c94e39df2927e6185db7abebc2ae07b7c1742ffb not found: ID does not exist" containerID="8e31ae5cd07bf549f09c6554c94e39df2927e6185db7abebc2ae07b7c1742ffb" Dec 05 21:59:11 crc kubenswrapper[4904]: I1205 21:59:11.483652 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e31ae5cd07bf549f09c6554c94e39df2927e6185db7abebc2ae07b7c1742ffb"} err="failed to get container status \"8e31ae5cd07bf549f09c6554c94e39df2927e6185db7abebc2ae07b7c1742ffb\": rpc error: code = NotFound desc = could not find container \"8e31ae5cd07bf549f09c6554c94e39df2927e6185db7abebc2ae07b7c1742ffb\": container with ID starting with 8e31ae5cd07bf549f09c6554c94e39df2927e6185db7abebc2ae07b7c1742ffb not found: ID does not exist" Dec 05 21:59:11 crc kubenswrapper[4904]: I1205 21:59:11.711334 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76ff09a6-b1f4-4794-b656-edfc2f50214a" path="/var/lib/kubelet/pods/76ff09a6-b1f4-4794-b656-edfc2f50214a/volumes" Dec 05 21:59:29 crc kubenswrapper[4904]: I1205 21:59:29.955812 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:59:29 crc kubenswrapper[4904]: I1205 21:59:29.957867 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:59:59 crc kubenswrapper[4904]: I1205 21:59:59.955375 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:59:59 crc kubenswrapper[4904]: I1205 21:59:59.957256 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 22:00:00 crc kubenswrapper[4904]: I1205 
22:00:00.159212 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416200-5grfz"] Dec 05 22:00:00 crc kubenswrapper[4904]: E1205 22:00:00.159684 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76ff09a6-b1f4-4794-b656-edfc2f50214a" containerName="extract-utilities" Dec 05 22:00:00 crc kubenswrapper[4904]: I1205 22:00:00.159705 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="76ff09a6-b1f4-4794-b656-edfc2f50214a" containerName="extract-utilities" Dec 05 22:00:00 crc kubenswrapper[4904]: E1205 22:00:00.159728 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76ff09a6-b1f4-4794-b656-edfc2f50214a" containerName="registry-server" Dec 05 22:00:00 crc kubenswrapper[4904]: I1205 22:00:00.159737 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="76ff09a6-b1f4-4794-b656-edfc2f50214a" containerName="registry-server" Dec 05 22:00:00 crc kubenswrapper[4904]: E1205 22:00:00.159761 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76ff09a6-b1f4-4794-b656-edfc2f50214a" containerName="extract-content" Dec 05 22:00:00 crc kubenswrapper[4904]: I1205 22:00:00.159770 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="76ff09a6-b1f4-4794-b656-edfc2f50214a" containerName="extract-content" Dec 05 22:00:00 crc kubenswrapper[4904]: I1205 22:00:00.160073 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="76ff09a6-b1f4-4794-b656-edfc2f50214a" containerName="registry-server" Dec 05 22:00:00 crc kubenswrapper[4904]: I1205 22:00:00.160882 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416200-5grfz" Dec 05 22:00:00 crc kubenswrapper[4904]: I1205 22:00:00.163631 4904 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 22:00:00 crc kubenswrapper[4904]: I1205 22:00:00.163961 4904 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 22:00:00 crc kubenswrapper[4904]: I1205 22:00:00.180900 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416200-5grfz"] Dec 05 22:00:00 crc kubenswrapper[4904]: I1205 22:00:00.236095 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3dc04c8a-6ca6-4e05-8465-aa9c48483504-config-volume\") pod \"collect-profiles-29416200-5grfz\" (UID: \"3dc04c8a-6ca6-4e05-8465-aa9c48483504\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416200-5grfz" Dec 05 22:00:00 crc kubenswrapper[4904]: I1205 22:00:00.236156 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zksg2\" (UniqueName: \"kubernetes.io/projected/3dc04c8a-6ca6-4e05-8465-aa9c48483504-kube-api-access-zksg2\") pod \"collect-profiles-29416200-5grfz\" (UID: \"3dc04c8a-6ca6-4e05-8465-aa9c48483504\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416200-5grfz" Dec 05 22:00:00 crc kubenswrapper[4904]: I1205 22:00:00.236193 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3dc04c8a-6ca6-4e05-8465-aa9c48483504-secret-volume\") pod 
\"collect-profiles-29416200-5grfz\" (UID: \"3dc04c8a-6ca6-4e05-8465-aa9c48483504\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416200-5grfz" Dec 05 22:00:00 crc kubenswrapper[4904]: I1205 22:00:00.338324 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3dc04c8a-6ca6-4e05-8465-aa9c48483504-config-volume\") pod \"collect-profiles-29416200-5grfz\" (UID: \"3dc04c8a-6ca6-4e05-8465-aa9c48483504\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416200-5grfz" Dec 05 22:00:00 crc kubenswrapper[4904]: I1205 22:00:00.338694 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zksg2\" (UniqueName: \"kubernetes.io/projected/3dc04c8a-6ca6-4e05-8465-aa9c48483504-kube-api-access-zksg2\") pod \"collect-profiles-29416200-5grfz\" (UID: \"3dc04c8a-6ca6-4e05-8465-aa9c48483504\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416200-5grfz" Dec 05 22:00:00 crc kubenswrapper[4904]: I1205 22:00:00.338748 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3dc04c8a-6ca6-4e05-8465-aa9c48483504-secret-volume\") pod \"collect-profiles-29416200-5grfz\" (UID: \"3dc04c8a-6ca6-4e05-8465-aa9c48483504\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416200-5grfz" Dec 05 22:00:00 crc kubenswrapper[4904]: I1205 22:00:00.341048 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3dc04c8a-6ca6-4e05-8465-aa9c48483504-config-volume\") pod \"collect-profiles-29416200-5grfz\" (UID: \"3dc04c8a-6ca6-4e05-8465-aa9c48483504\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416200-5grfz" Dec 05 22:00:00 crc kubenswrapper[4904]: I1205 22:00:00.344966 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3dc04c8a-6ca6-4e05-8465-aa9c48483504-secret-volume\") pod \"collect-profiles-29416200-5grfz\" (UID: \"3dc04c8a-6ca6-4e05-8465-aa9c48483504\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416200-5grfz" Dec 05 22:00:00 crc kubenswrapper[4904]: I1205 22:00:00.360548 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zksg2\" (UniqueName: \"kubernetes.io/projected/3dc04c8a-6ca6-4e05-8465-aa9c48483504-kube-api-access-zksg2\") pod \"collect-profiles-29416200-5grfz\" (UID: \"3dc04c8a-6ca6-4e05-8465-aa9c48483504\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416200-5grfz" Dec 05 22:00:00 crc kubenswrapper[4904]: I1205 22:00:00.548200 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416200-5grfz" Dec 05 22:00:01 crc kubenswrapper[4904]: I1205 22:00:01.044970 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416200-5grfz"] Dec 05 22:00:01 crc kubenswrapper[4904]: I1205 22:00:01.916224 4904 generic.go:334] "Generic (PLEG): container finished" podID="3dc04c8a-6ca6-4e05-8465-aa9c48483504" containerID="9267ef43c2f3eb6b50ae0566d0a027bcb8cb084ac8cdf506bcfaad38b1c705ac" exitCode=0 Dec 05 22:00:01 crc kubenswrapper[4904]: I1205 22:00:01.916297 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416200-5grfz" event={"ID":"3dc04c8a-6ca6-4e05-8465-aa9c48483504","Type":"ContainerDied","Data":"9267ef43c2f3eb6b50ae0566d0a027bcb8cb084ac8cdf506bcfaad38b1c705ac"} Dec 05 22:00:01 crc kubenswrapper[4904]: I1205 22:00:01.916547 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416200-5grfz" event={"ID":"3dc04c8a-6ca6-4e05-8465-aa9c48483504","Type":"ContainerStarted","Data":"9add76660e99611ba1f29cb9736fe9383b9209184958abdf413b37cd52f61fcd"} Dec 05 22:00:03 crc kubenswrapper[4904]: I1205 22:00:03.295899 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416200-5grfz" Dec 05 22:00:03 crc kubenswrapper[4904]: I1205 22:00:03.307325 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3dc04c8a-6ca6-4e05-8465-aa9c48483504-config-volume\") pod \"3dc04c8a-6ca6-4e05-8465-aa9c48483504\" (UID: \"3dc04c8a-6ca6-4e05-8465-aa9c48483504\") " Dec 05 22:00:03 crc kubenswrapper[4904]: I1205 22:00:03.307421 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3dc04c8a-6ca6-4e05-8465-aa9c48483504-secret-volume\") pod \"3dc04c8a-6ca6-4e05-8465-aa9c48483504\" (UID: \"3dc04c8a-6ca6-4e05-8465-aa9c48483504\") " Dec 05 22:00:03 crc kubenswrapper[4904]: I1205 22:00:03.307709 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zksg2\" (UniqueName: \"kubernetes.io/projected/3dc04c8a-6ca6-4e05-8465-aa9c48483504-kube-api-access-zksg2\") pod \"3dc04c8a-6ca6-4e05-8465-aa9c48483504\" (UID: \"3dc04c8a-6ca6-4e05-8465-aa9c48483504\") " Dec 05 22:00:03 crc kubenswrapper[4904]: I1205 22:00:03.308018 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dc04c8a-6ca6-4e05-8465-aa9c48483504-config-volume" (OuterVolumeSpecName: "config-volume") pod "3dc04c8a-6ca6-4e05-8465-aa9c48483504" (UID: "3dc04c8a-6ca6-4e05-8465-aa9c48483504"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:00:03 crc kubenswrapper[4904]: I1205 22:00:03.308407 4904 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3dc04c8a-6ca6-4e05-8465-aa9c48483504-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 22:00:03 crc kubenswrapper[4904]: I1205 22:00:03.314118 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dc04c8a-6ca6-4e05-8465-aa9c48483504-kube-api-access-zksg2" (OuterVolumeSpecName: "kube-api-access-zksg2") pod "3dc04c8a-6ca6-4e05-8465-aa9c48483504" (UID: "3dc04c8a-6ca6-4e05-8465-aa9c48483504"). InnerVolumeSpecName "kube-api-access-zksg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:00:03 crc kubenswrapper[4904]: I1205 22:00:03.315651 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dc04c8a-6ca6-4e05-8465-aa9c48483504-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3dc04c8a-6ca6-4e05-8465-aa9c48483504" (UID: "3dc04c8a-6ca6-4e05-8465-aa9c48483504"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:00:03 crc kubenswrapper[4904]: I1205 22:00:03.409787 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zksg2\" (UniqueName: \"kubernetes.io/projected/3dc04c8a-6ca6-4e05-8465-aa9c48483504-kube-api-access-zksg2\") on node \"crc\" DevicePath \"\"" Dec 05 22:00:03 crc kubenswrapper[4904]: I1205 22:00:03.409815 4904 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3dc04c8a-6ca6-4e05-8465-aa9c48483504-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 22:00:03 crc kubenswrapper[4904]: E1205 22:00:03.760872 4904 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dc04c8a_6ca6_4e05_8465_aa9c48483504.slice/crio-9add76660e99611ba1f29cb9736fe9383b9209184958abdf413b37cd52f61fcd\": RecentStats: unable to find data in memory cache]" Dec 05 22:00:03 crc kubenswrapper[4904]: I1205 22:00:03.945848 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416200-5grfz" event={"ID":"3dc04c8a-6ca6-4e05-8465-aa9c48483504","Type":"ContainerDied","Data":"9add76660e99611ba1f29cb9736fe9383b9209184958abdf413b37cd52f61fcd"} Dec 05 22:00:03 crc kubenswrapper[4904]: I1205 22:00:03.946211 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9add76660e99611ba1f29cb9736fe9383b9209184958abdf413b37cd52f61fcd" Dec 05 22:00:03 crc kubenswrapper[4904]: I1205 22:00:03.946378 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416200-5grfz" Dec 05 22:00:04 crc kubenswrapper[4904]: I1205 22:00:04.388011 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416155-8tvgm"] Dec 05 22:00:04 crc kubenswrapper[4904]: I1205 22:00:04.399416 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416155-8tvgm"] Dec 05 22:00:05 crc kubenswrapper[4904]: I1205 22:00:05.696930 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2840d019-94a2-4759-b8d5-e8a244032a25" path="/var/lib/kubelet/pods/2840d019-94a2-4759-b8d5-e8a244032a25/volumes" Dec 05 22:00:21 crc kubenswrapper[4904]: I1205 22:00:21.136362 4904 generic.go:334] "Generic (PLEG): container finished" podID="90f2514c-edb1-47b0-8f0f-49952ab8686f" containerID="d3d29141d5c3a0b983e659808c69d1ed6bd099050aa09676e18fe95846f2436a" exitCode=0 Dec 05 22:00:21 crc kubenswrapper[4904]: I1205 22:00:21.136448 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-72sr6/must-gather-c94bm" event={"ID":"90f2514c-edb1-47b0-8f0f-49952ab8686f","Type":"ContainerDied","Data":"d3d29141d5c3a0b983e659808c69d1ed6bd099050aa09676e18fe95846f2436a"} Dec 05 22:00:21 crc kubenswrapper[4904]: I1205 22:00:21.137759 4904 scope.go:117] "RemoveContainer" containerID="d3d29141d5c3a0b983e659808c69d1ed6bd099050aa09676e18fe95846f2436a" Dec 05 22:00:21 crc kubenswrapper[4904]: I1205 22:00:21.987858 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-72sr6_must-gather-c94bm_90f2514c-edb1-47b0-8f0f-49952ab8686f/gather/0.log" Dec 05 22:00:27 crc kubenswrapper[4904]: I1205 22:00:27.809484 4904 scope.go:117] "RemoveContainer" containerID="e7ea9c13048f61335723b52c60180c2780577777665c73afa0781e9a297a144d" Dec 05 22:00:29 crc kubenswrapper[4904]: I1205 22:00:29.955337 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 22:00:29 crc kubenswrapper[4904]: I1205 22:00:29.956032 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 22:00:29 crc kubenswrapper[4904]: I1205 22:00:29.956136 4904 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" Dec 05 22:00:29 crc kubenswrapper[4904]: I1205 22:00:29.957474 4904 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d2e9673559b3b94a8cab133eb4edbf277b9d48ecf8c2bd1511c1604c1ad703c4"} pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 22:00:29 crc kubenswrapper[4904]: I1205 22:00:29.957608 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" 
containerName="machine-config-daemon" containerID="cri-o://d2e9673559b3b94a8cab133eb4edbf277b9d48ecf8c2bd1511c1604c1ad703c4" gracePeriod=600 Dec 05 22:00:30 crc kubenswrapper[4904]: I1205 22:00:30.256854 4904 generic.go:334] "Generic (PLEG): container finished" podID="1cc24b64-e25f-4b55-9123-295388685e7a" containerID="d2e9673559b3b94a8cab133eb4edbf277b9d48ecf8c2bd1511c1604c1ad703c4" exitCode=0 Dec 05 22:00:30 crc kubenswrapper[4904]: I1205 22:00:30.256925 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" event={"ID":"1cc24b64-e25f-4b55-9123-295388685e7a","Type":"ContainerDied","Data":"d2e9673559b3b94a8cab133eb4edbf277b9d48ecf8c2bd1511c1604c1ad703c4"} Dec 05 22:00:30 crc kubenswrapper[4904]: I1205 22:00:30.256978 4904 scope.go:117] "RemoveContainer" containerID="afe55cb29b17c79ab9c5d0540b6c9a9a3a478a257da4a8d2c668853c76a15473" Dec 05 22:00:31 crc kubenswrapper[4904]: I1205 22:00:31.273441 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" event={"ID":"1cc24b64-e25f-4b55-9123-295388685e7a","Type":"ContainerStarted","Data":"9a1b23b6ea6d424e1b15bfffcf1565eea5fd8670667b42ea4a1cd0695e685f23"} Dec 05 22:00:33 crc kubenswrapper[4904]: I1205 22:00:33.157296 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-72sr6/must-gather-c94bm"] Dec 05 22:00:33 crc kubenswrapper[4904]: I1205 22:00:33.157788 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-72sr6/must-gather-c94bm" podUID="90f2514c-edb1-47b0-8f0f-49952ab8686f" containerName="copy" containerID="cri-o://6d7af8b1bdaa890853ab86e00d49e071b0f5aa45ba12f3e19d70dfa692a55f12" gracePeriod=2 Dec 05 22:00:33 crc kubenswrapper[4904]: I1205 22:00:33.170513 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-72sr6/must-gather-c94bm"] Dec 05 22:00:33 crc kubenswrapper[4904]: I1205 22:00:33.312971 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-72sr6_must-gather-c94bm_90f2514c-edb1-47b0-8f0f-49952ab8686f/copy/0.log" Dec 05 22:00:33 crc kubenswrapper[4904]: I1205 22:00:33.313459 4904 generic.go:334] "Generic (PLEG): container finished" podID="90f2514c-edb1-47b0-8f0f-49952ab8686f" containerID="6d7af8b1bdaa890853ab86e00d49e071b0f5aa45ba12f3e19d70dfa692a55f12" exitCode=143 Dec 05 22:00:33 crc kubenswrapper[4904]: I1205 22:00:33.637611 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-72sr6_must-gather-c94bm_90f2514c-edb1-47b0-8f0f-49952ab8686f/copy/0.log" Dec 05 22:00:33 crc kubenswrapper[4904]: I1205 22:00:33.638412 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-72sr6/must-gather-c94bm" Dec 05 22:00:33 crc kubenswrapper[4904]: I1205 22:00:33.735180 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5q98\" (UniqueName: \"kubernetes.io/projected/90f2514c-edb1-47b0-8f0f-49952ab8686f-kube-api-access-q5q98\") pod \"90f2514c-edb1-47b0-8f0f-49952ab8686f\" (UID: \"90f2514c-edb1-47b0-8f0f-49952ab8686f\") " Dec 05 22:00:33 crc kubenswrapper[4904]: I1205 22:00:33.735294 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/90f2514c-edb1-47b0-8f0f-49952ab8686f-must-gather-output\") pod \"90f2514c-edb1-47b0-8f0f-49952ab8686f\" (UID: \"90f2514c-edb1-47b0-8f0f-49952ab8686f\") " Dec 05 22:00:33 crc kubenswrapper[4904]: I1205 22:00:33.753423 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90f2514c-edb1-47b0-8f0f-49952ab8686f-kube-api-access-q5q98" (OuterVolumeSpecName: "kube-api-access-q5q98") pod "90f2514c-edb1-47b0-8f0f-49952ab8686f" (UID: "90f2514c-edb1-47b0-8f0f-49952ab8686f"). InnerVolumeSpecName "kube-api-access-q5q98". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:00:33 crc kubenswrapper[4904]: I1205 22:00:33.838455 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5q98\" (UniqueName: \"kubernetes.io/projected/90f2514c-edb1-47b0-8f0f-49952ab8686f-kube-api-access-q5q98\") on node \"crc\" DevicePath \"\"" Dec 05 22:00:33 crc kubenswrapper[4904]: I1205 22:00:33.943204 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90f2514c-edb1-47b0-8f0f-49952ab8686f-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "90f2514c-edb1-47b0-8f0f-49952ab8686f" (UID: "90f2514c-edb1-47b0-8f0f-49952ab8686f"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:00:34 crc kubenswrapper[4904]: I1205 22:00:34.044230 4904 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/90f2514c-edb1-47b0-8f0f-49952ab8686f-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 05 22:00:34 crc kubenswrapper[4904]: I1205 22:00:34.322283 4904 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-72sr6_must-gather-c94bm_90f2514c-edb1-47b0-8f0f-49952ab8686f/copy/0.log" Dec 05 22:00:34 crc kubenswrapper[4904]: I1205 22:00:34.323042 4904 scope.go:117] "RemoveContainer" containerID="6d7af8b1bdaa890853ab86e00d49e071b0f5aa45ba12f3e19d70dfa692a55f12" Dec 05 22:00:34 crc kubenswrapper[4904]: I1205 22:00:34.323115 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-72sr6/must-gather-c94bm" Dec 05 22:00:34 crc kubenswrapper[4904]: I1205 22:00:34.354423 4904 scope.go:117] "RemoveContainer" containerID="d3d29141d5c3a0b983e659808c69d1ed6bd099050aa09676e18fe95846f2436a" Dec 05 22:00:35 crc kubenswrapper[4904]: I1205 22:00:35.697580 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90f2514c-edb1-47b0-8f0f-49952ab8686f" path="/var/lib/kubelet/pods/90f2514c-edb1-47b0-8f0f-49952ab8686f/volumes" Dec 05 22:01:00 crc kubenswrapper[4904]: I1205 22:01:00.178583 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29416201-7ktv4"] Dec 05 22:01:00 crc kubenswrapper[4904]: E1205 22:01:00.179555 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90f2514c-edb1-47b0-8f0f-49952ab8686f" containerName="gather" Dec 05 22:01:00 crc kubenswrapper[4904]: I1205 22:01:00.179573 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="90f2514c-edb1-47b0-8f0f-49952ab8686f" containerName="gather" Dec 05 22:01:00 crc kubenswrapper[4904]: E1205 22:01:00.179592 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dc04c8a-6ca6-4e05-8465-aa9c48483504" containerName="collect-profiles" Dec 05 22:01:00 crc kubenswrapper[4904]: I1205 22:01:00.179600 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dc04c8a-6ca6-4e05-8465-aa9c48483504" containerName="collect-profiles" Dec 05 22:01:00 crc kubenswrapper[4904]: E1205 22:01:00.179634 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90f2514c-edb1-47b0-8f0f-49952ab8686f" containerName="copy" Dec 05 22:01:00 crc kubenswrapper[4904]: I1205 22:01:00.179643 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="90f2514c-edb1-47b0-8f0f-49952ab8686f" containerName="copy" Dec 05 22:01:00 crc kubenswrapper[4904]: I1205 22:01:00.179891 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dc04c8a-6ca6-4e05-8465-aa9c48483504" containerName="collect-profiles" Dec 05 22:01:00 crc kubenswrapper[4904]: I1205 22:01:00.179916 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="90f2514c-edb1-47b0-8f0f-49952ab8686f" containerName="copy" Dec 05 22:01:00 crc kubenswrapper[4904]: I1205 22:01:00.179932 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="90f2514c-edb1-47b0-8f0f-49952ab8686f" containerName="gather" Dec 05 22:01:00 crc kubenswrapper[4904]: I1205 22:01:00.180756 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29416201-7ktv4" Dec 05 22:01:00 crc kubenswrapper[4904]: I1205 22:01:00.203684 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29416201-7ktv4"] Dec 05 22:01:00 crc kubenswrapper[4904]: I1205 22:01:00.275020 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d874b1f7-8d4f-4b04-96eb-eebe864a8cb0-config-data\") pod \"keystone-cron-29416201-7ktv4\" (UID: \"d874b1f7-8d4f-4b04-96eb-eebe864a8cb0\") " pod="openstack/keystone-cron-29416201-7ktv4" Dec 05 22:01:00 crc kubenswrapper[4904]: I1205 22:01:00.275083 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d874b1f7-8d4f-4b04-96eb-eebe864a8cb0-combined-ca-bundle\") pod \"keystone-cron-29416201-7ktv4\" (UID: \"d874b1f7-8d4f-4b04-96eb-eebe864a8cb0\") " pod="openstack/keystone-cron-29416201-7ktv4" Dec 05 22:01:00 crc kubenswrapper[4904]: I1205 22:01:00.275158 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d874b1f7-8d4f-4b04-96eb-eebe864a8cb0-fernet-keys\") pod \"keystone-cron-29416201-7ktv4\" (UID: \"d874b1f7-8d4f-4b04-96eb-eebe864a8cb0\") " pod="openstack/keystone-cron-29416201-7ktv4" Dec 05 22:01:00 crc kubenswrapper[4904]: I1205 22:01:00.275297 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bndcc\" (UniqueName: \"kubernetes.io/projected/d874b1f7-8d4f-4b04-96eb-eebe864a8cb0-kube-api-access-bndcc\") pod \"keystone-cron-29416201-7ktv4\" (UID: \"d874b1f7-8d4f-4b04-96eb-eebe864a8cb0\") " pod="openstack/keystone-cron-29416201-7ktv4" Dec 05 22:01:00 crc kubenswrapper[4904]: I1205 22:01:00.377994 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d874b1f7-8d4f-4b04-96eb-eebe864a8cb0-fernet-keys\") pod \"keystone-cron-29416201-7ktv4\" (UID: \"d874b1f7-8d4f-4b04-96eb-eebe864a8cb0\") " pod="openstack/keystone-cron-29416201-7ktv4" Dec 05 22:01:00 crc kubenswrapper[4904]: I1205 22:01:00.378059 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bndcc\" (UniqueName: \"kubernetes.io/projected/d874b1f7-8d4f-4b04-96eb-eebe864a8cb0-kube-api-access-bndcc\") pod \"keystone-cron-29416201-7ktv4\" (UID: \"d874b1f7-8d4f-4b04-96eb-eebe864a8cb0\") " pod="openstack/keystone-cron-29416201-7ktv4" Dec 05 22:01:00 crc kubenswrapper[4904]: I1205 22:01:00.378207 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d874b1f7-8d4f-4b04-96eb-eebe864a8cb0-config-data\") pod \"keystone-cron-29416201-7ktv4\" (UID: \"d874b1f7-8d4f-4b04-96eb-eebe864a8cb0\") " pod="openstack/keystone-cron-29416201-7ktv4" Dec 05 22:01:00 crc kubenswrapper[4904]: I1205 22:01:00.378231 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d874b1f7-8d4f-4b04-96eb-eebe864a8cb0-combined-ca-bundle\") pod \"keystone-cron-29416201-7ktv4\" (UID: \"d874b1f7-8d4f-4b04-96eb-eebe864a8cb0\") " pod="openstack/keystone-cron-29416201-7ktv4" Dec 05 22:01:00 crc kubenswrapper[4904]: I1205 22:01:00.386631 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d874b1f7-8d4f-4b04-96eb-eebe864a8cb0-fernet-keys\") pod \"keystone-cron-29416201-7ktv4\" (UID: \"d874b1f7-8d4f-4b04-96eb-eebe864a8cb0\") " pod="openstack/keystone-cron-29416201-7ktv4" Dec 05 22:01:00 crc kubenswrapper[4904]: I1205 22:01:00.389956 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d874b1f7-8d4f-4b04-96eb-eebe864a8cb0-config-data\") pod \"keystone-cron-29416201-7ktv4\" (UID: \"d874b1f7-8d4f-4b04-96eb-eebe864a8cb0\") " pod="openstack/keystone-cron-29416201-7ktv4" Dec 05 22:01:00 crc kubenswrapper[4904]: I1205 22:01:00.417084 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d874b1f7-8d4f-4b04-96eb-eebe864a8cb0-combined-ca-bundle\") pod \"keystone-cron-29416201-7ktv4\" (UID: \"d874b1f7-8d4f-4b04-96eb-eebe864a8cb0\") " pod="openstack/keystone-cron-29416201-7ktv4" Dec 05 22:01:00 crc kubenswrapper[4904]: I1205 22:01:00.419758 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bndcc\" (UniqueName: \"kubernetes.io/projected/d874b1f7-8d4f-4b04-96eb-eebe864a8cb0-kube-api-access-bndcc\") pod \"keystone-cron-29416201-7ktv4\" (UID: \"d874b1f7-8d4f-4b04-96eb-eebe864a8cb0\") " pod="openstack/keystone-cron-29416201-7ktv4" Dec 05 22:01:00 crc kubenswrapper[4904]: I1205 22:01:00.504095 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29416201-7ktv4" Dec 05 22:01:01 crc kubenswrapper[4904]: I1205 22:01:01.039786 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29416201-7ktv4"] Dec 05 22:01:01 crc kubenswrapper[4904]: I1205 22:01:01.635882 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29416201-7ktv4" event={"ID":"d874b1f7-8d4f-4b04-96eb-eebe864a8cb0","Type":"ContainerStarted","Data":"0c650cfbf30b33ba0df8e2587d6a2ad8f98acd0d29868412775443de91fc94ae"} Dec 05 22:01:01 crc kubenswrapper[4904]: I1205 22:01:01.636151 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29416201-7ktv4" event={"ID":"d874b1f7-8d4f-4b04-96eb-eebe864a8cb0","Type":"ContainerStarted","Data":"e3bd4c9dda1e2e04b35796efd0d1fe6e6514d7158ead50ceaa8f0337d13a3cf6"} Dec 05 22:01:01 crc kubenswrapper[4904]: I1205 22:01:01.656604 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29416201-7ktv4" podStartSLOduration=1.6564295869999999 podStartE2EDuration="1.656429587s" podCreationTimestamp="2025-12-05 22:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:01:01.64809166 +0000 UTC m=+6560.459307769" watchObservedRunningTime="2025-12-05 22:01:01.656429587 +0000 UTC m=+6560.467645696" Dec 05 22:01:04 crc kubenswrapper[4904]: I1205 22:01:04.674507 4904 generic.go:334] "Generic (PLEG): container finished" podID="d874b1f7-8d4f-4b04-96eb-eebe864a8cb0" containerID="0c650cfbf30b33ba0df8e2587d6a2ad8f98acd0d29868412775443de91fc94ae" exitCode=0 Dec 05 22:01:04 crc kubenswrapper[4904]: I1205 22:01:04.674570 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29416201-7ktv4" event={"ID":"d874b1f7-8d4f-4b04-96eb-eebe864a8cb0","Type":"ContainerDied","Data":"0c650cfbf30b33ba0df8e2587d6a2ad8f98acd0d29868412775443de91fc94ae"} Dec 05 22:01:06 crc 
kubenswrapper[4904]: I1205 22:01:06.065607 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29416201-7ktv4" Dec 05 22:01:06 crc kubenswrapper[4904]: I1205 22:01:06.130652 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d874b1f7-8d4f-4b04-96eb-eebe864a8cb0-fernet-keys\") pod \"d874b1f7-8d4f-4b04-96eb-eebe864a8cb0\" (UID: \"d874b1f7-8d4f-4b04-96eb-eebe864a8cb0\") " Dec 05 22:01:06 crc kubenswrapper[4904]: I1205 22:01:06.130870 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d874b1f7-8d4f-4b04-96eb-eebe864a8cb0-combined-ca-bundle\") pod \"d874b1f7-8d4f-4b04-96eb-eebe864a8cb0\" (UID: \"d874b1f7-8d4f-4b04-96eb-eebe864a8cb0\") " Dec 05 22:01:06 crc kubenswrapper[4904]: I1205 22:01:06.130926 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d874b1f7-8d4f-4b04-96eb-eebe864a8cb0-config-data\") pod \"d874b1f7-8d4f-4b04-96eb-eebe864a8cb0\" (UID: \"d874b1f7-8d4f-4b04-96eb-eebe864a8cb0\") " Dec 05 22:01:06 crc kubenswrapper[4904]: I1205 22:01:06.131026 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bndcc\" (UniqueName: \"kubernetes.io/projected/d874b1f7-8d4f-4b04-96eb-eebe864a8cb0-kube-api-access-bndcc\") pod \"d874b1f7-8d4f-4b04-96eb-eebe864a8cb0\" (UID: \"d874b1f7-8d4f-4b04-96eb-eebe864a8cb0\") " Dec 05 22:01:06 crc kubenswrapper[4904]: I1205 22:01:06.145206 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d874b1f7-8d4f-4b04-96eb-eebe864a8cb0-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d874b1f7-8d4f-4b04-96eb-eebe864a8cb0" (UID: "d874b1f7-8d4f-4b04-96eb-eebe864a8cb0"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:01:06 crc kubenswrapper[4904]: I1205 22:01:06.145347 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d874b1f7-8d4f-4b04-96eb-eebe864a8cb0-kube-api-access-bndcc" (OuterVolumeSpecName: "kube-api-access-bndcc") pod "d874b1f7-8d4f-4b04-96eb-eebe864a8cb0" (UID: "d874b1f7-8d4f-4b04-96eb-eebe864a8cb0"). InnerVolumeSpecName "kube-api-access-bndcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:01:06 crc kubenswrapper[4904]: I1205 22:01:06.174876 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d874b1f7-8d4f-4b04-96eb-eebe864a8cb0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d874b1f7-8d4f-4b04-96eb-eebe864a8cb0" (UID: "d874b1f7-8d4f-4b04-96eb-eebe864a8cb0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:01:06 crc kubenswrapper[4904]: I1205 22:01:06.215330 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d874b1f7-8d4f-4b04-96eb-eebe864a8cb0-config-data" (OuterVolumeSpecName: "config-data") pod "d874b1f7-8d4f-4b04-96eb-eebe864a8cb0" (UID: "d874b1f7-8d4f-4b04-96eb-eebe864a8cb0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:01:06 crc kubenswrapper[4904]: I1205 22:01:06.234208 4904 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d874b1f7-8d4f-4b04-96eb-eebe864a8cb0-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 05 22:01:06 crc kubenswrapper[4904]: I1205 22:01:06.234242 4904 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d874b1f7-8d4f-4b04-96eb-eebe864a8cb0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:01:06 crc kubenswrapper[4904]: I1205 22:01:06.234257 4904 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d874b1f7-8d4f-4b04-96eb-eebe864a8cb0-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 22:01:06 crc kubenswrapper[4904]: I1205 22:01:06.234267 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bndcc\" (UniqueName: \"kubernetes.io/projected/d874b1f7-8d4f-4b04-96eb-eebe864a8cb0-kube-api-access-bndcc\") on node \"crc\" DevicePath \"\"" Dec 05 22:01:06 crc kubenswrapper[4904]: I1205 22:01:06.697358 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29416201-7ktv4" event={"ID":"d874b1f7-8d4f-4b04-96eb-eebe864a8cb0","Type":"ContainerDied","Data":"e3bd4c9dda1e2e04b35796efd0d1fe6e6514d7158ead50ceaa8f0337d13a3cf6"} Dec 05 22:01:06 crc kubenswrapper[4904]: I1205 22:01:06.697799 4904 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3bd4c9dda1e2e04b35796efd0d1fe6e6514d7158ead50ceaa8f0337d13a3cf6" Dec 05 22:01:06 crc kubenswrapper[4904]: I1205 22:01:06.697428 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29416201-7ktv4" Dec 05 22:01:27 crc kubenswrapper[4904]: I1205 22:01:27.872643 4904 scope.go:117] "RemoveContainer" containerID="b9a2b59e8e32bcc4c3a736e4b5fa7e6fb273fa27fa6e572d776a1334f9a07847" Dec 05 22:01:27 crc kubenswrapper[4904]: I1205 22:01:27.893855 4904 scope.go:117] "RemoveContainer" containerID="fd0bba27d1d1f13db9028dc5e3a91c66e0343695c595354b5627132bdfed5f9b" Dec 05 22:01:44 crc kubenswrapper[4904]: I1205 22:01:44.240005 4904 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gm5fk"] Dec 05 22:01:44 crc kubenswrapper[4904]: E1205 22:01:44.241314 4904 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d874b1f7-8d4f-4b04-96eb-eebe864a8cb0" containerName="keystone-cron" Dec 05 22:01:44 crc kubenswrapper[4904]: I1205 22:01:44.241340 4904 state_mem.go:107] "Deleted CPUSet assignment" podUID="d874b1f7-8d4f-4b04-96eb-eebe864a8cb0" containerName="keystone-cron" Dec 05 22:01:44 crc kubenswrapper[4904]: I1205 22:01:44.241730 4904 memory_manager.go:354] "RemoveStaleState removing state" podUID="d874b1f7-8d4f-4b04-96eb-eebe864a8cb0" containerName="keystone-cron" Dec 05 22:01:44 crc kubenswrapper[4904]: I1205 22:01:44.244574 4904 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gm5fk" Dec 05 22:01:44 crc kubenswrapper[4904]: I1205 22:01:44.269218 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gm5fk"] Dec 05 22:01:44 crc kubenswrapper[4904]: I1205 22:01:44.354426 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d5d04f0-bdce-4a9b-8a9f-a75d296739ad-utilities\") pod \"redhat-operators-gm5fk\" (UID: \"5d5d04f0-bdce-4a9b-8a9f-a75d296739ad\") " pod="openshift-marketplace/redhat-operators-gm5fk" Dec 05 22:01:44 crc kubenswrapper[4904]: I1205 22:01:44.354675 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d5d04f0-bdce-4a9b-8a9f-a75d296739ad-catalog-content\") pod \"redhat-operators-gm5fk\" (UID: \"5d5d04f0-bdce-4a9b-8a9f-a75d296739ad\") " pod="openshift-marketplace/redhat-operators-gm5fk" Dec 05 22:01:44 crc kubenswrapper[4904]: I1205 22:01:44.354756 4904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phpnp\" (UniqueName: \"kubernetes.io/projected/5d5d04f0-bdce-4a9b-8a9f-a75d296739ad-kube-api-access-phpnp\") pod \"redhat-operators-gm5fk\" (UID: \"5d5d04f0-bdce-4a9b-8a9f-a75d296739ad\") " pod="openshift-marketplace/redhat-operators-gm5fk" Dec 05 22:01:44 crc kubenswrapper[4904]: I1205 22:01:44.456907 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d5d04f0-bdce-4a9b-8a9f-a75d296739ad-utilities\") pod \"redhat-operators-gm5fk\" (UID: \"5d5d04f0-bdce-4a9b-8a9f-a75d296739ad\") " pod="openshift-marketplace/redhat-operators-gm5fk" Dec 05 22:01:44 crc kubenswrapper[4904]: I1205 22:01:44.456989 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d5d04f0-bdce-4a9b-8a9f-a75d296739ad-catalog-content\") pod \"redhat-operators-gm5fk\" (UID: \"5d5d04f0-bdce-4a9b-8a9f-a75d296739ad\") " pod="openshift-marketplace/redhat-operators-gm5fk" Dec 05 22:01:44 crc kubenswrapper[4904]: I1205 22:01:44.457016 4904 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phpnp\" (UniqueName: \"kubernetes.io/projected/5d5d04f0-bdce-4a9b-8a9f-a75d296739ad-kube-api-access-phpnp\") pod \"redhat-operators-gm5fk\" (UID: \"5d5d04f0-bdce-4a9b-8a9f-a75d296739ad\") " pod="openshift-marketplace/redhat-operators-gm5fk" Dec 05 22:01:44 crc kubenswrapper[4904]: I1205 22:01:44.457497 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d5d04f0-bdce-4a9b-8a9f-a75d296739ad-utilities\") pod \"redhat-operators-gm5fk\" (UID: \"5d5d04f0-bdce-4a9b-8a9f-a75d296739ad\") " pod="openshift-marketplace/redhat-operators-gm5fk" Dec 05 22:01:44 crc kubenswrapper[4904]: I1205 22:01:44.457524 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d5d04f0-bdce-4a9b-8a9f-a75d296739ad-catalog-content\") pod \"redhat-operators-gm5fk\" (UID: \"5d5d04f0-bdce-4a9b-8a9f-a75d296739ad\") " pod="openshift-marketplace/redhat-operators-gm5fk" Dec 05 22:01:44 crc kubenswrapper[4904]: I1205 22:01:44.474541 4904 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-phpnp\" (UniqueName: \"kubernetes.io/projected/5d5d04f0-bdce-4a9b-8a9f-a75d296739ad-kube-api-access-phpnp\") pod \"redhat-operators-gm5fk\" (UID: \"5d5d04f0-bdce-4a9b-8a9f-a75d296739ad\") " pod="openshift-marketplace/redhat-operators-gm5fk" Dec 05 22:01:44 crc kubenswrapper[4904]: I1205 22:01:44.578848 4904 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gm5fk" Dec 05 22:01:45 crc kubenswrapper[4904]: I1205 22:01:45.100836 4904 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gm5fk"] Dec 05 22:01:45 crc kubenswrapper[4904]: I1205 22:01:45.154783 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gm5fk" event={"ID":"5d5d04f0-bdce-4a9b-8a9f-a75d296739ad","Type":"ContainerStarted","Data":"1d7b79cc5f921b73a9f7f20ff468091ccec8e5990cd40f950f813eb6958b4f03"} Dec 05 22:01:46 crc kubenswrapper[4904]: I1205 22:01:46.170696 4904 generic.go:334] "Generic (PLEG): container finished" podID="5d5d04f0-bdce-4a9b-8a9f-a75d296739ad" containerID="cf09936f728294ce709dc23201411e2fb82ccdc6fe0bf930764687da5e02a1e2" exitCode=0 Dec 05 22:01:46 crc kubenswrapper[4904]: I1205 22:01:46.170765 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gm5fk" event={"ID":"5d5d04f0-bdce-4a9b-8a9f-a75d296739ad","Type":"ContainerDied","Data":"cf09936f728294ce709dc23201411e2fb82ccdc6fe0bf930764687da5e02a1e2"} Dec 05 22:01:47 crc kubenswrapper[4904]: I1205 22:01:47.193654 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gm5fk" event={"ID":"5d5d04f0-bdce-4a9b-8a9f-a75d296739ad","Type":"ContainerStarted","Data":"c30b3be3fc5c71caa3219a79b2a933ebf2f0f9c3a263f068c5783c24a6d0fe17"} Dec 05 22:01:49 crc kubenswrapper[4904]: I1205 22:01:49.215743 4904 generic.go:334] "Generic (PLEG): container finished" podID="5d5d04f0-bdce-4a9b-8a9f-a75d296739ad" containerID="c30b3be3fc5c71caa3219a79b2a933ebf2f0f9c3a263f068c5783c24a6d0fe17" exitCode=0 Dec 05 22:01:49 crc kubenswrapper[4904]: I1205 22:01:49.215867 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gm5fk" event={"ID":"5d5d04f0-bdce-4a9b-8a9f-a75d296739ad","Type":"ContainerDied","Data":"c30b3be3fc5c71caa3219a79b2a933ebf2f0f9c3a263f068c5783c24a6d0fe17"} Dec 05 22:01:50 crc kubenswrapper[4904]: I1205 22:01:50.227569 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gm5fk" event={"ID":"5d5d04f0-bdce-4a9b-8a9f-a75d296739ad","Type":"ContainerStarted","Data":"8c5b58297c7a79518417b31c99452759c6c7767761535e57b02dcb039f88efb5"} Dec 05 22:01:50 crc kubenswrapper[4904]: I1205 22:01:50.259868 4904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gm5fk" podStartSLOduration=2.815316873 podStartE2EDuration="6.259820944s" podCreationTimestamp="2025-12-05 22:01:44 +0000 UTC" firstStartedPulling="2025-12-05 22:01:46.174471785 +0000 UTC m=+6604.985687924" lastFinishedPulling="2025-12-05 22:01:49.618975836 +0000 UTC m=+6608.430191995" observedRunningTime="2025-12-05 22:01:50.251318593 +0000 UTC m=+6609.062534712" watchObservedRunningTime="2025-12-05 22:01:50.259820944 +0000 UTC m=+6609.071037063" Dec 05 22:01:54 crc kubenswrapper[4904]: I1205 22:01:54.579120 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gm5fk" 
Dec 05 22:01:54 crc kubenswrapper[4904]: I1205 22:01:54.579574 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gm5fk" Dec 05 22:01:55 crc kubenswrapper[4904]: I1205 22:01:55.638024 4904 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gm5fk" podUID="5d5d04f0-bdce-4a9b-8a9f-a75d296739ad" containerName="registry-server" probeResult="failure" output=< Dec 05 22:01:55 crc kubenswrapper[4904]: timeout: failed to connect service ":50051" within 1s Dec 05 22:01:55 crc kubenswrapper[4904]: > Dec 05 22:02:04 crc kubenswrapper[4904]: I1205 22:02:04.628305 4904 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gm5fk" Dec 05 22:02:04 crc kubenswrapper[4904]: I1205 22:02:04.707642 4904 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gm5fk" Dec 05 22:02:04 crc kubenswrapper[4904]: I1205 22:02:04.895635 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gm5fk"] Dec 05 22:02:06 crc kubenswrapper[4904]: I1205 22:02:06.413813 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gm5fk" podUID="5d5d04f0-bdce-4a9b-8a9f-a75d296739ad" containerName="registry-server" containerID="cri-o://8c5b58297c7a79518417b31c99452759c6c7767761535e57b02dcb039f88efb5" gracePeriod=2 Dec 05 22:02:06 crc kubenswrapper[4904]: I1205 22:02:06.956265 4904 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gm5fk" Dec 05 22:02:07 crc kubenswrapper[4904]: I1205 22:02:07.100908 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phpnp\" (UniqueName: \"kubernetes.io/projected/5d5d04f0-bdce-4a9b-8a9f-a75d296739ad-kube-api-access-phpnp\") pod \"5d5d04f0-bdce-4a9b-8a9f-a75d296739ad\" (UID: \"5d5d04f0-bdce-4a9b-8a9f-a75d296739ad\") " Dec 05 22:02:07 crc kubenswrapper[4904]: I1205 22:02:07.100978 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d5d04f0-bdce-4a9b-8a9f-a75d296739ad-catalog-content\") pod \"5d5d04f0-bdce-4a9b-8a9f-a75d296739ad\" (UID: \"5d5d04f0-bdce-4a9b-8a9f-a75d296739ad\") " Dec 05 22:02:07 crc kubenswrapper[4904]: I1205 22:02:07.101001 4904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d5d04f0-bdce-4a9b-8a9f-a75d296739ad-utilities\") pod \"5d5d04f0-bdce-4a9b-8a9f-a75d296739ad\" (UID: \"5d5d04f0-bdce-4a9b-8a9f-a75d296739ad\") " Dec 05 22:02:07 crc kubenswrapper[4904]: I1205 22:02:07.102118 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d5d04f0-bdce-4a9b-8a9f-a75d296739ad-utilities" (OuterVolumeSpecName: "utilities") pod "5d5d04f0-bdce-4a9b-8a9f-a75d296739ad" (UID: "5d5d04f0-bdce-4a9b-8a9f-a75d296739ad"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:02:07 crc kubenswrapper[4904]: I1205 22:02:07.120212 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d5d04f0-bdce-4a9b-8a9f-a75d296739ad-kube-api-access-phpnp" (OuterVolumeSpecName: "kube-api-access-phpnp") pod "5d5d04f0-bdce-4a9b-8a9f-a75d296739ad" (UID: "5d5d04f0-bdce-4a9b-8a9f-a75d296739ad"). InnerVolumeSpecName "kube-api-access-phpnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:02:07 crc kubenswrapper[4904]: I1205 22:02:07.203627 4904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phpnp\" (UniqueName: \"kubernetes.io/projected/5d5d04f0-bdce-4a9b-8a9f-a75d296739ad-kube-api-access-phpnp\") on node \"crc\" DevicePath \"\"" Dec 05 22:02:07 crc kubenswrapper[4904]: I1205 22:02:07.203660 4904 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d5d04f0-bdce-4a9b-8a9f-a75d296739ad-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 22:02:07 crc kubenswrapper[4904]: I1205 22:02:07.210578 4904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d5d04f0-bdce-4a9b-8a9f-a75d296739ad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5d5d04f0-bdce-4a9b-8a9f-a75d296739ad" (UID: "5d5d04f0-bdce-4a9b-8a9f-a75d296739ad"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:02:07 crc kubenswrapper[4904]: I1205 22:02:07.305626 4904 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d5d04f0-bdce-4a9b-8a9f-a75d296739ad-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 22:02:07 crc kubenswrapper[4904]: I1205 22:02:07.435878 4904 generic.go:334] "Generic (PLEG): container finished" podID="5d5d04f0-bdce-4a9b-8a9f-a75d296739ad" containerID="8c5b58297c7a79518417b31c99452759c6c7767761535e57b02dcb039f88efb5" exitCode=0 Dec 05 22:02:07 crc kubenswrapper[4904]: I1205 22:02:07.435946 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gm5fk" event={"ID":"5d5d04f0-bdce-4a9b-8a9f-a75d296739ad","Type":"ContainerDied","Data":"8c5b58297c7a79518417b31c99452759c6c7767761535e57b02dcb039f88efb5"} Dec 05 22:02:07 crc kubenswrapper[4904]: I1205 22:02:07.435989 4904 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gm5fk" Dec 05 22:02:07 crc kubenswrapper[4904]: I1205 22:02:07.436023 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gm5fk" event={"ID":"5d5d04f0-bdce-4a9b-8a9f-a75d296739ad","Type":"ContainerDied","Data":"1d7b79cc5f921b73a9f7f20ff468091ccec8e5990cd40f950f813eb6958b4f03"} Dec 05 22:02:07 crc kubenswrapper[4904]: I1205 22:02:07.436096 4904 scope.go:117] "RemoveContainer" containerID="8c5b58297c7a79518417b31c99452759c6c7767761535e57b02dcb039f88efb5" Dec 05 22:02:07 crc kubenswrapper[4904]: I1205 22:02:07.471918 4904 scope.go:117] "RemoveContainer" containerID="c30b3be3fc5c71caa3219a79b2a933ebf2f0f9c3a263f068c5783c24a6d0fe17" Dec 05 22:02:07 crc kubenswrapper[4904]: I1205 22:02:07.478257 4904 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gm5fk"] Dec 05 22:02:07 crc kubenswrapper[4904]: I1205 22:02:07.487211 4904 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gm5fk"] Dec 05 22:02:07 crc kubenswrapper[4904]: I1205 22:02:07.518366 4904 scope.go:117] "RemoveContainer" containerID="cf09936f728294ce709dc23201411e2fb82ccdc6fe0bf930764687da5e02a1e2" Dec 05 22:02:07 crc kubenswrapper[4904]: I1205 22:02:07.560683 4904 scope.go:117] "RemoveContainer" containerID="8c5b58297c7a79518417b31c99452759c6c7767761535e57b02dcb039f88efb5" Dec 05 22:02:07 crc kubenswrapper[4904]: E1205 22:02:07.561377 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c5b58297c7a79518417b31c99452759c6c7767761535e57b02dcb039f88efb5\": container with ID starting with 8c5b58297c7a79518417b31c99452759c6c7767761535e57b02dcb039f88efb5 not found: ID does not exist" containerID="8c5b58297c7a79518417b31c99452759c6c7767761535e57b02dcb039f88efb5" Dec 05 22:02:07 crc kubenswrapper[4904]: I1205 22:02:07.561434 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c5b58297c7a79518417b31c99452759c6c7767761535e57b02dcb039f88efb5"} err="failed to get container status \"8c5b58297c7a79518417b31c99452759c6c7767761535e57b02dcb039f88efb5\": rpc error: code = NotFound desc = could not find container \"8c5b58297c7a79518417b31c99452759c6c7767761535e57b02dcb039f88efb5\": container with ID starting with 8c5b58297c7a79518417b31c99452759c6c7767761535e57b02dcb039f88efb5 not found: ID does not exist" Dec 05 22:02:07 crc kubenswrapper[4904]: I1205 22:02:07.561470 4904 scope.go:117] "RemoveContainer" containerID="c30b3be3fc5c71caa3219a79b2a933ebf2f0f9c3a263f068c5783c24a6d0fe17" Dec 05 22:02:07 crc kubenswrapper[4904]: E1205 22:02:07.561817 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c30b3be3fc5c71caa3219a79b2a933ebf2f0f9c3a263f068c5783c24a6d0fe17\": container with ID starting with c30b3be3fc5c71caa3219a79b2a933ebf2f0f9c3a263f068c5783c24a6d0fe17 not found: ID does not exist" containerID="c30b3be3fc5c71caa3219a79b2a933ebf2f0f9c3a263f068c5783c24a6d0fe17" Dec 05 22:02:07 crc kubenswrapper[4904]: I1205 22:02:07.561867 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c30b3be3fc5c71caa3219a79b2a933ebf2f0f9c3a263f068c5783c24a6d0fe17"} err="failed to get container status \"c30b3be3fc5c71caa3219a79b2a933ebf2f0f9c3a263f068c5783c24a6d0fe17\": rpc error: code = NotFound desc = could not find container 
\"c30b3be3fc5c71caa3219a79b2a933ebf2f0f9c3a263f068c5783c24a6d0fe17\": container with ID starting with c30b3be3fc5c71caa3219a79b2a933ebf2f0f9c3a263f068c5783c24a6d0fe17 not found: ID does not exist" Dec 05 22:02:07 crc kubenswrapper[4904]: I1205 22:02:07.561900 4904 scope.go:117] "RemoveContainer" containerID="cf09936f728294ce709dc23201411e2fb82ccdc6fe0bf930764687da5e02a1e2" Dec 05 22:02:07 crc kubenswrapper[4904]: E1205 22:02:07.562205 4904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf09936f728294ce709dc23201411e2fb82ccdc6fe0bf930764687da5e02a1e2\": container with ID starting with cf09936f728294ce709dc23201411e2fb82ccdc6fe0bf930764687da5e02a1e2 not found: ID does not exist" containerID="cf09936f728294ce709dc23201411e2fb82ccdc6fe0bf930764687da5e02a1e2" Dec 05 22:02:07 crc kubenswrapper[4904]: I1205 22:02:07.562244 4904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf09936f728294ce709dc23201411e2fb82ccdc6fe0bf930764687da5e02a1e2"} err="failed to get container status \"cf09936f728294ce709dc23201411e2fb82ccdc6fe0bf930764687da5e02a1e2\": rpc error: code = NotFound desc = could not find container \"cf09936f728294ce709dc23201411e2fb82ccdc6fe0bf930764687da5e02a1e2\": container with ID starting with cf09936f728294ce709dc23201411e2fb82ccdc6fe0bf930764687da5e02a1e2 not found: ID does not exist" Dec 05 22:02:07 crc kubenswrapper[4904]: I1205 22:02:07.700239 4904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d5d04f0-bdce-4a9b-8a9f-a75d296739ad" path="/var/lib/kubelet/pods/5d5d04f0-bdce-4a9b-8a9f-a75d296739ad/volumes" Dec 05 22:02:59 crc kubenswrapper[4904]: I1205 22:02:59.955707 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 22:02:59 crc kubenswrapper[4904]: I1205 22:02:59.956396 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 22:03:29 crc kubenswrapper[4904]: I1205 22:03:29.956930 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 22:03:29 crc kubenswrapper[4904]: I1205 22:03:29.958028 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 22:03:59 crc kubenswrapper[4904]: I1205 22:03:59.955591 4904 patch_prober.go:28] interesting pod/machine-config-daemon-ffd2h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 22:03:59 crc 
kubenswrapper[4904]: I1205 22:03:59.956291 4904 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 22:03:59 crc kubenswrapper[4904]: I1205 22:03:59.956377 4904 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" Dec 05 22:03:59 crc kubenswrapper[4904]: I1205 22:03:59.957253 4904 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9a1b23b6ea6d424e1b15bfffcf1565eea5fd8670667b42ea4a1cd0695e685f23"} pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 22:03:59 crc kubenswrapper[4904]: I1205 22:03:59.957353 4904 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" containerName="machine-config-daemon" containerID="cri-o://9a1b23b6ea6d424e1b15bfffcf1565eea5fd8670667b42ea4a1cd0695e685f23" gracePeriod=600 Dec 05 22:04:00 crc kubenswrapper[4904]: E1205 22:04:00.083562 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 22:04:00 crc kubenswrapper[4904]: I1205 22:04:00.758335 4904 generic.go:334] "Generic (PLEG): container finished" podID="1cc24b64-e25f-4b55-9123-295388685e7a" containerID="9a1b23b6ea6d424e1b15bfffcf1565eea5fd8670667b42ea4a1cd0695e685f23" exitCode=0 Dec 05 22:04:00 crc kubenswrapper[4904]: I1205 22:04:00.758425 4904 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" event={"ID":"1cc24b64-e25f-4b55-9123-295388685e7a","Type":"ContainerDied","Data":"9a1b23b6ea6d424e1b15bfffcf1565eea5fd8670667b42ea4a1cd0695e685f23"} Dec 05 22:04:00 crc kubenswrapper[4904]: I1205 22:04:00.758606 4904 scope.go:117] "RemoveContainer" containerID="d2e9673559b3b94a8cab133eb4edbf277b9d48ecf8c2bd1511c1604c1ad703c4" Dec 05 22:04:00 crc kubenswrapper[4904]: I1205 22:04:00.759298 4904 scope.go:117] "RemoveContainer" containerID="9a1b23b6ea6d424e1b15bfffcf1565eea5fd8670667b42ea4a1cd0695e685f23" Dec 05 22:04:00 crc kubenswrapper[4904]: E1205 22:04:00.759557 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 22:04:14 crc kubenswrapper[4904]: I1205 22:04:14.682143 4904 scope.go:117] "RemoveContainer" containerID="9a1b23b6ea6d424e1b15bfffcf1565eea5fd8670667b42ea4a1cd0695e685f23" Dec 05 22:04:14 crc 
kubenswrapper[4904]: E1205 22:04:14.683116 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a" Dec 05 22:04:26 crc kubenswrapper[4904]: I1205 22:04:26.681877 4904 scope.go:117] "RemoveContainer" containerID="9a1b23b6ea6d424e1b15bfffcf1565eea5fd8670667b42ea4a1cd0695e685f23" Dec 05 22:04:26 crc kubenswrapper[4904]: E1205 22:04:26.683019 4904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffd2h_openshift-machine-config-operator(1cc24b64-e25f-4b55-9123-295388685e7a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffd2h" podUID="1cc24b64-e25f-4b55-9123-295388685e7a"